Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD's decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
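The failure pattern at the heart of the abstract is a failed state update: the decision layer latches a perception snapshot when it enters a mode and keeps acting on that stale snapshot instead of the live perception stream. The Python sketch below is a minimal, hypothetical illustration of that pattern only; the names DecisionSystem, Perception, and LightState are our own and do not reflect Tesla FSD internals or any real autonomous driving codebase.

# Minimal sketch of a "failed state update" (Pringles Syndrome style) bug.
# All names are hypothetical illustrations, not real FSD code.

from dataclasses import dataclass
from enum import Enum, auto


class LightState(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()


@dataclass
class Perception:
    traffic_light: LightState


class DecisionSystem:
    def __init__(self) -> None:
        self.mode = "NORMAL"
        self._cached_light = None  # snapshot taken once, at mode entry

    def update(self, perception: Perception) -> str:
        if self.mode == "NORMAL" and perception.traffic_light == LightState.GREEN:
            # BUG: the light state is latched when entering the new mode
            # and is never refreshed from later perception frames.
            self.mode = "PROCEED_THROUGH_INTERSECTION"
            self._cached_light = perception.traffic_light

        if self.mode == "PROCEED_THROUGH_INTERSECTION":
            # Decision is made on the stale snapshot, not on `perception`.
            if self._cached_light == LightState.GREEN:
                return "accelerate"
        return "stop" if perception.traffic_light == LightState.RED else "cruise"


# The light turns red after the vehicle has committed to the mode, but the
# stale snapshot keeps the controller accelerating through the intersection.
ds = DecisionSystem()
print(ds.update(Perception(LightState.GREEN)))  # accelerate
print(ds.update(Perception(LightState.RED)))    # accelerate (stale state; should stop)

In this toy version the fix is to re-read perception.traffic_light on every planning cycle rather than caching it at mode entry; the abstract argues that tests for autonomous driving software should specifically exercise such mode-entry/state-refresh paths.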
