Zhisheng Hu (Baidu), Shengjian Guo (Baidu), and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering or interfering with any sensors or physically accessing the vehicle. We infer that the behavior is caused by Tesla FSD's decision system failing to incorporate the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that such failed state updates are a common failure pattern that deserves special attention in autonomous driving software testing and development.
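To make the failure pattern concrete, below is a minimal, hypothetical sketch of a stale state update. All names and logic here are our own illustration of the general pattern, not Tesla's actual code: a planner caches a perception signal when it transitions into an intersection-handling mode and then stops consulting fresh sensor data, so a light that turns red after the transition is ignored.

```python
# Illustrative sketch of a "failed state update" (Pringles Syndrome).
# Hypothetical names; not derived from any vendor's implementation.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Light(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()


@dataclass
class Perception:
    light: Light  # latest traffic-light classification from sensors


class Planner:
    def __init__(self) -> None:
        self.mode = "CRUISE"
        self._cached_light: Optional[Light] = None

    def update(self, perception: Perception) -> str:
        if self.mode == "CRUISE" and perception.light is Light.GREEN:
            # BUG: the mode transition snapshots the perception signal
            # once; later decisions never consult fresh data again.
            self.mode = "PROCEED_INTERSECTION"
            self._cached_light = perception.light

        if self.mode == "PROCEED_INTERSECTION":
            # Decision is made on the stale snapshot, so a light that
            # has since turned red is ignored.
            return "GO" if self._cached_light is Light.GREEN else "STOP"

        return "GO" if perception.light is Light.GREEN else "STOP"


planner = Planner()
print(planner.update(Perception(Light.GREEN)))  # GO: enters intersection mode
print(planner.update(Perception(Light.RED)))    # GO: stale state runs the red
```

A sketch like this also suggests why the pattern is amenable to systematic testing: re-reading the live perception input on every decision tick, rather than a mode-entry snapshot, removes the hazard.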
