Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physical access to the vehicle. We infer that this behavior is caused by Tesla FSD's decision system failing to incorporate the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves particular attention in autonomous driving software testing and development.
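To make the "failed state update" pattern concrete, here is a minimal, hypothetical Python sketch of the failure mode described above: a planner that snapshots the perception signal when it commits to a maneuver and never re-reads it afterward. All names (`Planner`, `Perception`, `Light`) are illustrative assumptions for exposition, not Tesla FSD internals or anything disclosed in the demo.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Light(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()


@dataclass
class Perception:
    light: Light  # latest traffic-light classification from the perception stack


class Planner:
    """Toy decision module illustrating a 'failed state update' bug."""

    def __init__(self) -> None:
        self.committed = False    # set once the planner enters "proceed" mode
        self.cached_light = None  # perception snapshot taken at commit time

    def step(self, perception: Perception) -> str:
        # BUG (hypothetical): after committing to cross, the planner keeps
        # using the snapshot taken on entry and never re-reads `perception`.
        if not self.committed:
            if perception.light is Light.GREEN:
                self.committed = True
                self.cached_light = perception.light
            else:
                return "STOP"
        # Decision is based on the stale cached signal, not the live one.
        return "PROCEED" if self.cached_light is Light.GREEN else "STOP"


# The light turns red after the planner commits, but it keeps proceeding.
planner = Planner()
print(planner.step(Perception(Light.GREEN)))  # PROCEED (commits to cross)
print(planner.step(Perception(Light.RED)))    # PROCEED (stale state: the bug)
```

In this sketch, a correct planner would re-evaluate `perception.light` on every `step` call; the mode-dependent caching is what turns a transient signal into a persistent, exploitable decision.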
