Jairo Giraldo (University of Utah), Alvaro Cardenas (UC Santa Cruz), Murat Kantarcioglu (UT Dallas), Jonathan Katz (George Mason University)

Differential privacy has emerged in the last decade as a powerful tool for protecting sensitive information. The same decade has seen growing interest in adversarial classification, where an adversary, knowing that a classifier is trying to detect anomalies, designs examples intended to mislead that classifier.

Differential privacy and adversarial classification have previously been studied separately. In this paper, we study how a strategic attacker can leverage differential privacy to inject false data into a system, and we then propose countermeasures against these novel attacks. We demonstrate the impact of our attacks and defenses on a real-world traffic estimation system and on a smart metering system.
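To illustrate the core idea, here is a minimal sketch (not the paper's exact model): a sensor publishes readings through the standard Laplace mechanism for epsilon-differential privacy, and an attacker hides a fixed bias by routing false readings through the same mechanism, so the falsified stream has the same noise distribution as legitimate DP output, just centered on a lie. The speed values, sensitivity, and epsilon below are hypothetical.

```python
import math
import random
import statistics

def laplace(scale, rng):
    """Sample Laplace(0, scale) via inverse CDF, stdlib only."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_release(reading, sensitivity, epsilon, rng):
    """Standard Laplace mechanism: noise scale = sensitivity / epsilon."""
    return reading + laplace(sensitivity / epsilon, rng)

rng = random.Random(0)
true_speed = 60.0                 # hypothetical road-segment speed (mph)
sensitivity, epsilon = 5.0, 0.5   # illustrative DP parameters

# Honest reports: true reading plus calibrated Laplace noise.
honest = [dp_release(true_speed, sensitivity, epsilon, rng)
          for _ in range(1000)]

# Attacker shifts each report by a fixed bias, then adds Laplace noise
# of the same scale, mimicking the mechanism's output around a lie.
bias = 15.0
attacked = [dp_release(true_speed + bias, sensitivity, epsilon, rng)
            for _ in range(1000)]

print(round(statistics.mean(honest), 1),
      round(statistics.mean(attacked), 1))
```

Because each attacked report is marginally indistinguishable from an honest DP release, per-sample anomaly detection fails; only the aggregate drift (here, roughly the 15 mph bias in the sample mean) reveals the attack.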
