Should we be spooked by the ghosts in the machine?

04 December 2014 - 02:15 By Matthew Sparkes, ©The Daily Telegraph
I'LL BE BACK: Uppity artificial intelligence, such as The Terminator, may spell the end for humans

Physicist Stephen Hawking has pushed forward humanity's understanding of the universe with his theories on gravitational singularities and black holes, so when he speaks it is wise to listen.

Which makes his warning that artificial intelligence could mean the end of humanity all the more concerning.

"The primitive forms of artificial intelligence we already have, have proved very useful," he told the BBC. "But I think the development of full artificial intelligence could spell the end of the human race.

"It would take off on its own, and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn't compete."

Is he right? Did the screenwriters behind The Matrix, 2001: A Space Odyssey and The Terminator all have a point? Will machines rise up and destroy us? We look at four ways that technology could destroy the human race.

Artificial intelligence

Hawking's argument is that, once a machine can truly think for itself, it could learn to make improvements to itself, slowly becoming more capable and intelligent. Eventually it could become all-powerful. We may not be able to stop this process because, however long and convoluted it may be for the machine, in human terms it could happen in the blink of an eye.

How close are we to a thinking machine? There are already designs produced by simple artificial intelligence that we don't fully understand. Some electronic circuits designed by genetic algorithms, for example, work better than those conceived by humans - and we aren't always sure why, because the resulting designs are too complex for us to unpick.
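
For readers curious what "designed by a genetic algorithm" actually means, here is a minimal sketch in Python. It evolves a toy bit string rather than an electronic circuit, and every number in it is an illustrative assumption, but the loop of selection, crossover and random mutation is the same basic process behind those hard-to-explain circuit designs.

    import random

    # Toy genetic algorithm: evolve a 20-bit string to maximise the number of 1s.
    # (Illustrative only; real circuit-evolution experiments use far richer
    # representations than a bit string.)
    GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 50, 0.02

    def fitness(genome):
        return sum(genome)  # score: how many bits are set to 1

    def mutate(genome):
        # flip each bit with a small probability
        return [1 - b if random.random() < MUTATION_RATE else b for b in genome]

    def crossover(a, b):
        # splice two parent genomes at a random cut point
        cut = random.randrange(1, GENOME_LEN)
        return a[:cut] + b[cut:]

    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[:POP_SIZE // 2]        # keep the fittest half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children

    print("best fitness:", fitness(max(population, key=fitness)))

Run enough generations and an answer simply emerges; nobody ever writes down why a particular solution works, which is where the "we aren't always sure why" problem comes from.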

Combine this software intelligence with robot bodies and you have a science fiction film. But because every aspect of our lives is controlled by computers, this malevolent super-intelligence wouldn't need arms and legs to make life unpleasant.

You can argue that we could do artificial intelligence experiments on computers isolated from sensitive systems, but we don't seem to be able to keep human hackers in check, so why assume we can outwit a super-intelligent thinking machine? You can also argue that artificial intelligence may prove to be friendly, but if intelligent machines treat us the way we treat less intelligent creatures, then we're in a world of trouble.

Scientific disaster

There were fears that the first atomic bomb tests could ignite the atmosphere, burning everyone on Earth alive. Some believed that, when it was first switched on, the Large Hadron Collider would create a black hole that would consume the planet. We got away with it, thanks only to the fact that both suggestions were hysterical nonsense. But who's to say that one day we won't attempt an experiment which has apocalyptic results?

Grey goo

A decade ago it seemed like sci-fi, but we're all familiar with 3D printers now: you can buy them on Amazon. We're also creating 3D printers which can print the parts for another printer.

But imagine a machine capable of doing this which is not just microscopically small, but nanoscopically small. So small that it can stack atoms together to make molecules. This could lead to all sorts of advances in manufacturing and medicine.

But what if we get it wrong? A single typo in the source code, and instead of removing a cancerous lump in a patient, the medibots could begin churning out copies of themselves over and over until the patient is nothing but grey goo composed of billions of machines. Then the hospital, then the city it sits in, and finally the whole planet.
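
The frightening part of that scenario is plain exponential doubling, which a few lines of Python make vivid. The figures below (a femtogram-scale replicator copying itself once an hour, with unlimited raw material) are purely illustrative assumptions, not predictions.

    # Illustrative doubling arithmetic for a runaway self-replicating machine.
    # All numbers here are assumptions chosen for illustration, not predictions.
    replicator_mass_kg = 1e-15       # assumed mass of a single nanobot
    earth_mass_kg = 5.97e24          # mass of the Earth
    doubling_time_hours = 1.0        # assumed time for the swarm to double

    doublings = 0
    total_mass = replicator_mass_kg
    while total_mass < earth_mass_kg:
        total_mass *= 2              # each generation doubles the swarm
        doublings += 1

    print(f"{doublings} doublings (~{doublings * doubling_time_hours:.0f} hours) "
          f"to exceed the mass of the Earth")

With those made-up numbers it takes only around 130 doublings, a matter of days, to exceed the mass of the planet; the point is how few generations runaway doubling needs, not the specific figures.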

The well-respected nanotechnologist Chris Phoenix dismisses the idea, arguing that grey goo could not happen by accident but only as the "product of a deliberate and difficult engineering process".

Climate change

By far the most likely doomsday scenario is also the least dramatic: our lack of care for the environment continues to affect the climate to the point where we cannot survive in it.
