
Elon Musk WARNING: Artificial Intelligence could be an ‘IMMORTAL DICTATOR’


The South African-born billionaire has warned that by creating machines which are superior to us, humanity is paving the way to its own downfall, and that we should not get in the way of an AI's goals. Ultimately, Mr Musk warns, if a machine gains super-intelligence it could easily govern the world, and, as the saying goes, absolute power corrupts absolutely. Speaking to American filmmaker Chris Paine for a new documentary titled ‘Do You Trust This Computer?’, the SpaceX and Tesla chief said: “The least scary future I can think of is one where we have at least democratised AI because if one company or small group of people manages to develop godlike digital super-intelligence, they could take over the world.

“At least when there’s an evil dictator, that human is going to die. But for an AI, there would be no death. It would live forever.

“And then you’d have an immortal dictator from which we can never escape.”

Mr Musk cited Google’s DeepMind as an example of how quickly computers can gain knowledge.

In 2016, Google’s DeepMind showed its makers that its AI was capable of learning independently, when its AlphaGo program taught itself to beat the world champion at the game of Go.

It is because of the sheer determination and speed with which programs such as DeepMind's can learn that humanity may one day find itself in their way, Mr Musk warned.

He said: “The DeepMind system can win at any game. It can already beat all the original Atari games.

“It is superhuman; it plays all the games at super speed in less than a minute.

“AI doesn’t have to be evil to destroy humanity.

“If AI has a goal and humanity just happens to be in the way, it will destroy humanity as a matter of course without even thinking about it. No hard feelings.

“It’s just like, if we’re building a road and an anthill just happens to be in the way, we don’t hate ants, we’re just building a road, and so, goodbye anthill.”
