Chinese researchers trick Tesla's Autopilot into steering into oncoming traffic


It’s designed to make driving easier, but Tesla’s Autopilot feature may not be as safe as the firm thinks.

In a new study, researchers have shown how easy it is to trick Tesla’s Autopilot into steering the car into oncoming traffic.

Researchers from Tencent’s Keen Lab hacked the Tesla as part of a wider investigation into the security and safety of advanced driver assistance systems (ADAS).

Autopilot uses a range of visual cues to work out where the car is on the road, including lane history, road markings, and the size and distance of nearby objects.

To trick the system, the researchers placed small interference stickers on the road surface, then sent a Tesla Model S over them in Autopilot mode.
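Tesla’s production lane detector is a neural network whose internals aren’t public, but the principle behind the attack can be illustrated with a simple, hypothetical computer-vision sketch: lane detectors hunt for high-contrast marks on the tarmac, and a detector that bridges gaps between short marks can fuse a handful of stickers into one long “lane line”. The Python example below, built on OpenCV’s Canny edge detector and probabilistic Hough transform, is an illustrative stand-in only, not Tesla’s actual pipeline.

```python
# Hypothetical sketch of why a few small road markings can spoof a lane
# detector. Classical OpenCV pipeline for illustration only -- Tesla's
# real perception stack is a neural network, not this.
import cv2
import numpy as np

def detect_lane_lines(frame):
    """Return line segments the pipeline treats as lane markings."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    # Any high-contrast mark on the asphalt (paint, tape, stickers)
    # shows up as edges here.
    edges = cv2.Canny(blurred, 50, 150)
    # The probabilistic Hough transform bridges gaps of up to
    # maxLineGap pixels between collinear marks, fusing short strips
    # into one long "lane line".
    return cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=20,
                           minLineLength=80, maxLineGap=60)

# Simulated road: dark asphalt with three short, diagonal strips
# standing in for the researchers' stickers.
road = np.full((240, 320, 3), 40, dtype=np.uint8)
for (x1, y1), (x2, y2) in [((100, 200), (115, 185)),
                           ((135, 165), (150, 150)),
                           ((170, 130), (185, 115))]:
    cv2.line(road, (x1, y1), (x2, y2), (255, 255, 255), 2)

lines = detect_lane_lines(road)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        print(f"Spoofed lane line detected: ({x1},{y1}) -> ({x2},{y2})")
```

Run on this toy image, the three short strips are typically fused into a single segment spanning the whole diagonal: the same kind of misleading lane evidence the researchers’ stickers supplied, with marks small enough to escape a driver’s notice but strong enough for a perception system to follow.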

Worryingly, the researchers found that the stickers caused Autopilot to misjudge the lane and steer the car into the oncoming lane.

While this suggests that hackers could easily trick Tesla cars into driving into oncoming traffic, it seems that Tesla isn’t worried.

Responding to the findings, a Tesla spokesperson said: “In this demonstration the researchers adjusted the physical environment (e.g. placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when Autopilot is in use.

“This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”