Ren Ding (Georgia Institute of Technology), Hong Hu (Georgia Institute of Technology), Wen Xu (Georgia Institute of Technology), Taesoo Kim (Georgia Institute of Technology)

Software vendors collect crash reports from end users to assist in debugging and testing their products. However, crash reports may contain users' private information, such as names and passwords, making users hesitant to share them with developers. We need a mechanism that protects users' privacy by sanitizing crash reports on the client side while keeping sufficient information to support server-side debugging.

In this paper, we propose the DESENSITIZATION technique, which generates privacy-aware and attack-preserving crash reports from crashed processes. Our tool uses lightweight methods to identify bug- and attack-related data in memory, and removes all other data to protect users' privacy. Since the desensitized memory contains more null bytes, we store crash reports as sparse files to save network bandwidth and server-side storage. We prototype DESENSITIZATION and apply it to a large number of crashes from several real-world programs, such as browsers and JavaScript engines. The results show that our DESENSITIZATION technique can eliminate 80.9% of non-zero bytes from coredumps and 49.0% from minidumps. The desensitized crash report can be 50.5% smaller than the original, which significantly reduces the resources needed for report submission and storage. Our DESENSITIZATION technique is a push-button solution for privacy-aware crash reporting.
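To illustrate the two steps described above (zeroing memory that is not bug- or attack-related, then exploiting the resulting zero runs by writing the dump as a sparse file), the following minimal Python sketch keeps only aligned 8-byte words that look like pointers into known memory mappings. This is not the paper's implementation; the pointer heuristic, the mapping ranges, and all function names here are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): keep 8-byte words that
# look like pointers into assumed memory mappings, zero everything else,
# and write the result as a sparse file so zero runs become holes.
import struct

# Illustrative mapped address ranges; a real tool would recover these from
# the crashed process (e.g., its memory maps or the coredump headers).
MAPPINGS = [(0x7F0000000000, 0x7F0000A00000)]

def looks_like_pointer(word: int) -> bool:
    """Heuristic: the word falls inside a known mapped region."""
    return any(lo <= word < hi for lo, hi in MAPPINGS)

def desensitize(data: bytes) -> bytes:
    """Zero every aligned 8-byte word that does not look like a pointer."""
    out = bytearray(len(data))  # starts as all zeros
    for off in range(0, len(data) - 7, 8):
        (word,) = struct.unpack_from("<Q", data, off)
        if looks_like_pointer(word):
            out[off:off + 8] = data[off:off + 8]
    return bytes(out)

def write_sparse(path: str, data: bytes, block: int = 4096) -> None:
    """Write data, seeking over all-zero blocks to leave filesystem holes."""
    with open(path, "wb") as f:
        for off in range(0, len(data), block):
            chunk = data[off:off + block]
            if chunk.count(0) == len(chunk):
                f.seek(len(chunk), 1)  # leave a hole instead of writing zeros
            else:
                f.write(chunk)
        f.truncate()  # fix the final size if the file ends in a hole
```

Seeking over all-zero blocks means the on-disk (and, with sparse-aware transfer, over-the-wire) size tracks only the retained bytes, which is where the storage and bandwidth savings reported above come from.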
