Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in Tesla's Full Self-Driving (FSD) software: a vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can induce this behavior without tampering with or interfering with any sensors and without physical access to the vehicle. We infer that the behavior arises because Tesla FSD's decision system stops incorporating the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
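To make the failure pattern concrete, the following minimal Python sketch illustrates a decision loop with a failed state update: once the controller enters a specific mode, it keeps acting on a cached perception snapshot instead of the fresh signal. All names here (Mode, Perception, DecisionSystem, CREEP_FORWARD) are hypothetical illustrations, not taken from Tesla FSD or any real autonomous driving stack.

```python
# Hypothetical sketch of the "Pringles Syndrome" failure pattern: a decision
# loop that stops consuming fresh perception input after entering a specific
# mode. Illustrative only; not based on any real FSD implementation.

from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    NORMAL = auto()
    CREEP_FORWARD = auto()  # hypothetical "specific mode"


@dataclass
class Perception:
    light_is_red: bool


class DecisionSystem:
    def __init__(self) -> None:
        self.mode = Mode.NORMAL
        self.cached = Perception(light_is_red=False)

    def step(self, fresh: Perception) -> str:
        # BUG: the cached perception state is refreshed only in NORMAL mode,
        # so once the system enters CREEP_FORWARD it keeps deciding on a
        # stale snapshot -- the failed state update described above.
        if self.mode == Mode.NORMAL:
            self.cached = fresh
            if not fresh.light_is_red:
                self.mode = Mode.CREEP_FORWARD
        return "STOP" if self.cached.light_is_red else "GO"


if __name__ == "__main__":
    ds = DecisionSystem()
    print(ds.step(Perception(light_is_red=False)))  # GO; enters CREEP_FORWARD
    print(ds.step(Perception(light_is_red=True)))   # GO; ignores the red light
```

A correct loop would refresh the cached snapshot from the fresh perception input on every cycle, regardless of the current mode, so a newly detected red light could never be silently dropped.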
