Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD's decision system failing to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
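
The sketch below is a minimal, hypothetical illustration of the failed-state-update pattern described above: a planner latches the perceived traffic-light state when it commits to crossing an intersection and never re-reads the perception signal while in that mode. All class and function names are illustrative assumptions, not taken from the Tesla FSD code base.

```python
# Hypothetical sketch of the "failed state update" (Pringles Syndrome) pattern:
# once the planner enters a committed mode, it keeps using a stale, latched
# perception value instead of the latest one.

from dataclasses import dataclass
from enum import Enum, auto


class Light(Enum):
    GREEN = auto()
    RED = auto()


class Mode(Enum):
    WAITING = auto()
    COMMITTED = auto()  # e.g. "proceed through the intersection"


@dataclass
class Planner:
    mode: Mode = Mode.WAITING
    latched_light: Light = Light.RED

    def step(self, perceived_light: Light) -> str:
        if self.mode is Mode.WAITING:
            if perceived_light is Light.GREEN:
                # BUG: the light state is latched here and never refreshed
                # while the planner stays in COMMITTED mode.
                self.latched_light = perceived_light
                self.mode = Mode.COMMITTED
                return "start crossing"
            return "hold"
        # COMMITTED: the decision uses the stale latched value instead of
        # perceived_light, so a light that has turned red is ignored.
        return "keep crossing" if self.latched_light is Light.GREEN else "stop"


if __name__ == "__main__":
    planner = Planner()
    print(planner.step(Light.GREEN))  # "start crossing"
    print(planner.step(Light.RED))    # "keep crossing"  <- runs the red light
```

A test that replays a perception sequence in which the light changes after the mode transition, as in the usage above, is one way such a failed state update could be surfaced during development.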
