Ren Ding (Georgia Institute of Technology), Hong Hu (Georgia Institute of Technology), Wen Xu (Georgia Institute of Technology), Taesoo Kim (Georgia Institute of Technology)

Software vendors collect crash reports from end users to assist in debugging and testing their products. However, crash reports may contain users' private information, such as names and passwords, making users hesitant to share them with developers. We need a mechanism that protects users' privacy in crash reports on the client side while keeping sufficient information to support server-side debugging.

In this paper, we propose the DESENSITIZATION technique, which generates privacy-aware and attack-preserving crash reports from crashed processes. Our tool uses lightweight methods to identify bug- and attack-related data in memory and removes all other data to protect users' privacy. Since the desensitized memory contains more null bytes, we store crash reports as sparse files to save network bandwidth and server-side storage. We prototype DESENSITIZATION and apply it to a large number of crashes from several real-world programs, such as browsers and JavaScript engines. Our results show that DESENSITIZATION eliminates 80.9% of non-zero bytes from coredumps and 49.0% from minidumps. The desensitized crash report can be 50.5% smaller than the original, significantly saving resources for report submission and storage. DESENSITIZATION is a push-button solution for privacy-aware crash reporting.
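To illustrate how null-byte-heavy desensitized memory translates into storage savings, the following minimal C sketch (our own example, not the authors' implementation; the function name write_sparse and the assumption that the desensitized memory image is already available in a buffer are hypothetical) writes a memory image as a sparse file: runs of zero bytes are skipped with lseek(), so the filesystem stores them as holes, and only the remaining bug- and attack-related bytes consume disk space.

/*
 * Hypothetical sketch: write a desensitized memory image as a sparse file.
 * Zero runs are skipped with lseek() instead of being written, so they
 * become filesystem holes and do not consume disk blocks.
 */
#include <fcntl.h>
#include <stddef.h>
#include <stdint.h>
#include <sys/types.h>
#include <unistd.h>

static int write_sparse(const char *path, const uint8_t *buf, size_t len)
{
    int fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0600);
    if (fd < 0)
        return -1;

    size_t i = 0;
    while (i < len) {
        size_t start = i;
        if (buf[i] == 0) {
            /* Skip a run of zero bytes: seek forward instead of writing. */
            while (i < len && buf[i] == 0)
                i++;
            if (lseek(fd, (off_t)(i - start), SEEK_CUR) < 0)
                goto fail;
        } else {
            /* Write a run of non-zero (bug-/attack-related) bytes. */
            while (i < len && buf[i] != 0)
                i++;
            if (write(fd, buf + start, i - start) != (ssize_t)(i - start))
                goto fail;
        }
    }
    /* Keep the full logical size even if the image ends in zero bytes. */
    if (ftruncate(fd, (off_t)len) < 0)
        goto fail;
    close(fd);
    return 0;
fail:
    close(fd);
    return -1;
}

With this layout, a coredump whose desensitized regions are mostly zero occupies only the blocks that hold non-zero data, which is consistent with the reported reduction in submission and storage cost.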

