Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensor, and without physical access to the vehicle. We infer that this behavior arises because Tesla FSD’s decision system fails to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
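
The failure pattern described above can be summarized as a decision module that latches onto a stale perception snapshot after a mode transition. The following minimal Python sketch illustrates that pattern under stated assumptions; all class, mode, and signal names are hypothetical and do not reflect Tesla FSD internals.

```python
# Hypothetical sketch of the "failed state update" (Pringles Syndrome) pattern.
# Names are illustrative assumptions, not Tesla FSD internals.
from enum import Enum, auto


class Light(Enum):
    GREEN = auto()
    RED = auto()


class Mode(Enum):
    NORMAL = auto()
    CREEP = auto()  # e.g., inching forward through an intersection


class BuggyPlanner:
    def __init__(self) -> None:
        self.mode = Mode.NORMAL
        self.cached_light = Light.RED  # last perception snapshot taken

    def on_perception(self, light: Light) -> None:
        # BUG: once the planner enters CREEP mode, it stops consuming
        # fresh perception signals, so self.cached_light goes stale.
        if self.mode is Mode.CREEP:
            return
        self.cached_light = light

    def decide(self) -> str:
        if self.cached_light is Light.GREEN:
            self.mode = Mode.CREEP
            return "proceed"
        return "stop"


planner = BuggyPlanner()
planner.on_perception(Light.GREEN)    # light is green on approach
assert planner.decide() == "proceed"  # planner enters CREEP mode
planner.on_perception(Light.RED)      # light turns red...
assert planner.decide() == "proceed"  # ...but the stale state still says go
```

A correct planner would keep consuming perception updates in every mode (or re-validate the cached signal before acting on it), which is exactly the state-update step that this failure pattern omits.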
