Forum topic

Can AI be more dangerous than nuclear weapons?



This article made me think of a lot of questions I don't have answers to. If it's so dangerous, why are so many countries in a race to develop the first superintelligent AI? Is there a basis for this fear or are we just too scared of our own shadows?

Jet peterson

It's capable of vastly more than almost anyone realizes, and the rate of improvement is exponential. Given the AI Imperative, there are really only two likely courses of action for the world, even though there are four major possibilities for how to proceed. The first is to make AI development illegal all around the world—similar to chemical weapon development. However, people and companies probably would not go for it.


Think about this, man: AI could potentially rival and even surpass humans in almost all fields of science. If that ever happens, then it could certainly control weapons as well. A self-thinking computer is MUCH, much scarier than a weapon imo.


We need to be afraid of the minds that use the weapons, not the weapons themselves. If AI ever gets to the point of deciding to control weapons on its own, then we are all in big trouble lol
