Researchers trick Tesla Autopilot into steering into oncoming traffic.
Researchers have devised a simple attack that might cause a Tesla to automatically steer into oncoming traffic under certain conditions.
The proof-of-concept exploit works not by hacking into the car's onboard computing system, but by using small, inconspicuous stickers that trick the Enhanced Autopilot of a Model S 75 into detecting and then following a change in the current lane.
Researchers from Tencent's Keen Security Lab recently reverse-engineered several of Tesla's automated processes to see how they reacted when environmental variables changed. One of the most striking discoveries was a way to cause Autopilot to steer into oncoming traffic. The attack worked by carefully affixing three stickers to the road. The stickers were nearly invisible to drivers, but machine-learning algorithms used by the Autopilot detected them as a line that indicated the lane was shifting to the left. As a result, Autopilot steered in that direction.
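To get a feel for why a handful of small marks can read as a lane boundary, here is a minimal sketch of a naive lane detector in Python. This is not Tesla's actual pipeline (which relies on learned vision models); the synthetic road image, the sticker positions, and the brightness threshold are all invented for illustration. The point it demonstrates is that a line-fitting stage will happily interpolate a continuous boundary through just three small patches, exactly the weakness the Keen Lab attack exploits.

```python
# Illustrative sketch only (not Tesla's pipeline): a naive lane detector
# that fits a straight line through bright road markings. Three small
# "stickers" angling left are enough to make it report a lane shift.
import numpy as np

# Synthetic top-down road patch: 200 rows (distance ahead) x 100 cols (lateral).
road = np.zeros((200, 100), dtype=np.uint8)

# Hypothetical sticker positions: three 3x3-pixel bright patches drifting
# left with distance, standing in for the inconspicuous stickers in the article.
for row, col in [(40, 60), (100, 50), (160, 40)]:
    road[row:row + 3, col:col + 3] = 255

# Naive detector: threshold for bright marks, then least-squares fit a line
# through their coordinates and treat it as the lane boundary.
ys, xs = np.nonzero(road > 200)
slope, intercept = np.polyfit(ys, xs, deg=1)

# A nonzero slope reads as the lane drifting laterally with distance; a
# lane-keeping controller that follows this fit would steer along it.
print(f"fitted boundary: x = {slope:.3f} * y + {intercept:.1f}")
print("detected lane shift to the left" if slope < 0 else "no lane shift")
```

Running this prints a negative slope, i.e. the detector concludes the lane is veering left even though the "road" contains nothing but three tiny dots. A real perception stack is far more sophisticated, but the attack described above shows that a learned lane model can be pushed toward the same kind of misreading.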
Just drive your own cars, people. It's good for you.
Amen.
The technology is there. First it's going to be the trucks (costs less, makes fewer mistakes), then special lanes for commuter cars.
The fact that someone can deliberately cause a self-driving car to wreck is a "so what" fact. Destroying on purpose is easy; it always has been.
People cause wrecks every day by not paying attention to what they are doing. The technology will make that part better.
With all this AI and robot technology coming out, keep in mind it doesn't have to do a better job than humans; it just needs to make fewer mistakes.
So we are to expect state and federal agencies that control the roadways to fix all the lane markings and keep them in a condition suitable for these vehicles?
A solution in search of a problem.
And you betcha Tesla is really happy that this data has hit the internoob...