Hackers fool self-driving cars into sudden braking

‘Phantom’ stop sign on digital billboard causes Tesla to slam on the brakes

22 October 2020 - 08:23 By Denis Droppa
Hackers have managed to trick semi-autonomous cars into unsafe driving behaviour. Picture: Supplied

Israeli researchers at Ben Gurion University have tricked a semi-autonomous car into slamming on its brakes by flashing an image of a stop sign on a digital billboard.

The car was equipped with Tesla’s latest Autopilot semi-autonomous system but could not distinguish the digital image from a real stop sign, and it made a sudden stop that could cause an accident or a traffic jam.

“The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that’s dangerous,” Ben Gurion University researcher Yisroel Mirsky told Wired magazine.

He added that such a hack requires the image of a stop sign to be flashed for just a split second, and leaves behind no evidence.

“The driver won’t even notice at all. So somebody’s car will just react, and they won’t understand why.”
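One reason such a brief flash is enough to fool the car is that many perception systems act on detections from individual camera frames. A minimal sketch below (an illustrative defence idea, not the researchers’ actual countermeasure; the frame rate and thresholds are assumptions) shows how a temporal-persistence filter could screen out a split-second phantom while still responding to a real roadside sign:

```python
# Illustrative sketch only: require a stop sign to persist across
# several consecutive camera frames before the car reacts.
# FRAME_RATE_HZ and MIN_PERSIST_FRAMES are assumed values.

FRAME_RATE_HZ = 30          # assumed camera frame rate
MIN_PERSIST_FRAMES = 15     # require ~0.5 s of continuous detection

def should_react(detections: list) -> bool:
    """Return True only if the most recent frames form an unbroken
    run of at least MIN_PERSIST_FRAMES sign detections."""
    run = 0
    for seen in reversed(detections):
        if not seen:
            break
        run += 1
    return run >= MIN_PERSIST_FRAMES

# A billboard flash lasting ~0.1 s appears in only ~3 frames at 30 fps,
# so it is filtered out; a real sign stays in view much longer.
phantom_flash = [False] * 27 + [True] * 3    # brief flash: ignored
real_sign = [False] * 10 + [True] * 20       # persistent sign: acted on
```

The trade-off in such a filter is latency: the longer the required persistence window, the later the car brakes for a genuine sign.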

The researchers also projected images onto the road to trick autonomous driving systems into unsafe behaviour. Phantom lane lines projected onto the tarmac caused Autopilot to steer the car towards oncoming traffic, and a projected road sign tricked the car into adopting an incorrect speed limit.

The research team says these so-called phantom attacks could be carried out as an extortion technique, as an act of terrorism or for pure mischief.

“Previous methods leave forensic evidence and require complicated preparation,” Ben Gurion researcher Ben Nassi told the magazine. “Phantom attacks can be done purely remotely, and they do not require any special expertise.”

Tesla and other carmakers aren’t yet selling fully autonomous cars. Semi-autonomous systems like Autopilot can centre the car within its lane, keep a safe following distance, and brake for stop signs and red lights, but they still require the driver to keep their hands on the steering wheel and be ready to take control if necessary.

By some estimates, self-driving cars could account for up to a quarter of vehicle sales within the next 20 years, promising to reduce traffic congestion and make our daily commutes quicker, less stressful and much safer.

But the prospect of a car being remotely taken over by someone with bad intentions is scary, which is why several studies are under way to identify the scenarios that could confuse vehicles’ robotic minds and to prevent hacking.

Among these efforts is the annual Defcon security convention in Las Vegas, US, which gives hackers the chance to try to break into the control units of cars and take over their driving functions.

Sponsored by carmakers, the convention seeks to discover the cyber vulnerabilities of modern vehicles, which have software controlling everything from the infotainment system to safety-critical functions like steering, acceleration and braking.

Manufacturers and automotive suppliers collaborate with so-called “white hat” or ethical hackers — cyber experts who help organisations identify IT security weaknesses.

At the 2013 Defcon, two security researchers hacked into car computers and took over the steering, acceleration, brakes and other functions of a 2010 Ford Escape and a 2010 Toyota Prius.

In 2015 a team wirelessly ran a Jeep Cherokee off the road by hacking through the entertainment system to its steering, brakes and transmission from a laptop.