Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior arises because Tesla FSD's decision system fails to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior the Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
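The failed state update can be illustrated with a minimal, hypothetical sketch (all class names, the mode name, and the API below are our own illustration for this abstract, not code or behavior taken from Tesla FSD): a planner that snapshots perception state on mode entry and never refreshes it while in that mode will keep acting on a green light that has since turned red.

```python
# Hypothetical sketch of the "failed state update" pattern; not Tesla FSD code.
from dataclasses import dataclass


@dataclass
class Perception:
    light_state: str  # "RED" or "GREEN", as reported by the perception stack


class Planner:
    def __init__(self):
        self.mode = "NORMAL"
        self.cached = None  # perception snapshot taken on mode entry

    def enter_special_mode(self, perception: Perception):
        # BUG: the snapshot taken here is never refreshed while in this mode.
        self.mode = "SPECIAL"
        self.cached = perception

    def decide(self, fresh: Perception) -> str:
        if self.mode == "SPECIAL":
            # Failed state update: decisions read the stale snapshot,
            # so a light that turns RED after mode entry is ignored.
            signal = self.cached
        else:
            signal = fresh
        return "STOP" if signal.light_state == "RED" else "PROCEED"


planner = Planner()
planner.enter_special_mode(Perception(light_state="GREEN"))
# The light has since turned RED, yet the planner still proceeds:
assert planner.decide(Perception(light_state="RED")) == "PROCEED"
```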
