Zhisheng Hu (Baidu), Shengjian Guo (Baidu), and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that the behavior is caused by Tesla FSD’s decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
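The failed state update pattern described above can be illustrated with a minimal sketch. This is a hypothetical example, not Tesla's actual code: all names (Perception, BuggyPlanner, creep mode) are assumptions made for illustration. The planner caches the perception snapshot when it enters a special mode and never re-reads it afterwards, so a later signal change (e.g., the light turning red) is silently ignored.

```python
# Illustrative sketch of the "failed state update" failure pattern.
# All names are hypothetical and NOT taken from the Tesla FSD implementation.

from dataclasses import dataclass


@dataclass
class Perception:
    traffic_light: str  # "red" or "green", as most recently observed


class BuggyPlanner:
    def __init__(self) -> None:
        self.in_creep_mode = False
        self.cached_light = None  # stale copy taken when the mode is entered

    def step(self, perception: Perception) -> str:
        # Bug: once the planner enters the special mode, it keeps using the
        # cached perception snapshot instead of the latest signal.
        if not self.in_creep_mode:
            self.in_creep_mode = True
            self.cached_light = perception.traffic_light
        light = self.cached_light  # should read perception.traffic_light
        return "proceed" if light == "green" else "stop"


if __name__ == "__main__":
    planner = BuggyPlanner()
    print(planner.step(Perception("green")))  # "proceed" (light was green)
    print(planner.step(Perception("red")))    # "proceed" -- latest red light ignored
```

In a correct implementation, the decision logic would consume the freshest perception output on every planning cycle regardless of the mode it is in; the bug is that mode entry freezes part of the world state.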

View More Papers

Demo #9: Dynamic Time Warping as a Tool for...

Mars Rayno (Colorado State University) and Jeremy Daily (Colorado State University)


Demo #11: Understanding the Effects of Paint Colors on...

Shaik Sabiha (University at Buffalo), Keyan Guo (University at Buffalo), Foad Hajiaghajani (University at Buffalo), Chunming Qiao (University at Buffalo), Hongxin Hu (University at Buffalo) and Ziming Zhao (University at Buffalo)


WeepingCAN: A Stealthy CAN Bus-off Attack

Gedare Bloom (University of Colorado Colorado Springs) Best Paper Award Winner ($300 cash prize)!


Denial-of-Service Attacks on C-V2X Networks

Natasa Trkulja, David Starobinski (Boston University), and Randall Berry (Northwestern University)
