Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD’s decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
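To illustrate the failure pattern the abstract describes, the sketch below shows a minimal, hypothetical decision loop, not Tesla's actual code, in which a planner caches a perception snapshot when it enters a specific mode and then stops consuming fresh signals. All names (`Mode.FOLLOW_LEAD`, `enter_follow_mode`, `should_proceed`) are invented for illustration; the only claim grounded in the abstract is the stale-state pattern itself.

```python
from dataclasses import dataclass
from enum import Enum, auto


class LightState(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()


class Mode(Enum):
    NORMAL = auto()
    FOLLOW_LEAD = auto()  # stand-in for the "specific mode" in the abstract


@dataclass
class Perception:
    light: LightState


class DecisionSystem:
    """Hypothetical planner showing a 'failed state update': once a
    specific mode is entered, decisions read a cached perception
    snapshot instead of the latest signal."""

    def __init__(self) -> None:
        self.mode = Mode.NORMAL
        self._cached: Perception | None = None  # snapshot taken on mode entry

    def enter_follow_mode(self, perception: Perception) -> None:
        self.mode = Mode.FOLLOW_LEAD
        self._cached = perception  # bug: this snapshot is never refreshed

    def should_proceed(self, latest: Perception) -> bool:
        if self.mode is Mode.FOLLOW_LEAD and self._cached is not None:
            # BUG: the decision ignores `latest` and uses the stale snapshot.
            return self._cached.light is not LightState.RED
        return latest.light is not LightState.RED


# An attacker-timed sequence: the mode is entered while the light is
# green; the light then turns red, but the planner still proceeds.
ds = DecisionSystem()
ds.enter_follow_mode(Perception(LightState.GREEN))
assert ds.should_proceed(Perception(LightState.RED))  # runs the red light
```

A test targeting this pattern would assert that every decision path re-reads current perception state after any mode transition, which is the kind of check the abstract suggests autonomous driving software tests should include.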
