Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can induce this behavior in a victim vehicle without tampering with or interfering with any sensors, and without physical access to the vehicle. We infer that the behavior is caused by Tesla FSD's decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
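To make the failure pattern concrete, here is a minimal sketch of a decision loop that latches its perception snapshot on entering a specific mode and then ignores fresh signals. All names (Mode, Perception, DecisionSystem) are hypothetical illustrations of the pattern, assumed for exposition; this is not Tesla FSD code or the demo's implementation.

```python
# Minimal model of the "failed state update" failure pattern: once the
# decision system enters a specific mode, it keeps using a perception
# snapshot captured at mode entry instead of the latest signals.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Mode(Enum):
    CRUISE = auto()
    INTERSECTION_CREEP = auto()  # the "specific mode" where updates stop


@dataclass
class Perception:
    light_is_red: bool


class DecisionSystem:
    def __init__(self) -> None:
        self.mode = Mode.CRUISE
        self.snapshot: Optional[Perception] = None  # latched on mode entry

    def enter_creep(self, perception: Perception) -> None:
        self.mode = Mode.INTERSECTION_CREEP
        self.snapshot = perception  # BUG: captured once, never refreshed

    def decide(self, perception: Perception) -> str:
        if self.mode is Mode.INTERSECTION_CREEP and self.snapshot is not None:
            # Failure pattern: the stale snapshot shadows the latest
            # perception signal, so a light turning red is ignored.
            view = self.snapshot
        else:
            view = perception
        return "STOP" if view.light_is_red else "PROCEED"


ds = DecisionSystem()
ds.enter_creep(Perception(light_is_red=False))   # light was green at mode entry
print(ds.decide(Perception(light_is_red=True)))  # "PROCEED": runs the red light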
