Xinyao Ma, Ambarish Aniruddha Gurjar, Anesu Christopher Chaora, Tatiana R Ringenberg, L. Jean Camp (Luddy School of Informatics, Computing, and Engineering, Indiana University Bloomington)

This study examines the crucial role developers play in identifying privacy-sensitive information in code. The research is informed by the context of diverse global data protection regulations, such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). It specifically investigates programmers' ability to discern the sensitivity level of data processing in code, a task of growing importance given increasing legislative demands for data privacy.

We conducted an online card-sorting experiment to explore how programmers across a range of expertise perceive the sensitivity of variable names in code snippets. Our study evaluates the accuracy, feasibility, and reliability of participating programmers' judgments about what constitutes a 'sensitive' variable. We further evaluate whether there is a consensus among programmers, how their level of security knowledge influences any consensus, and whether any consensus or impact of expertise is consistent across different categories of variables. Our findings reveal a lack of consistency among participants regarding the sensitivity of processing different types of data, as indicated by snippets of code with distinct variable names. Opinions diverge significantly, particularly among those with more technical expertise: as technical expertise increases, consensus decreases across the various categories of sensitive data. This study not only sheds light on the current state of programmers' privacy awareness but also motivates the need for better industry practices and tools for automatically identifying sensitive data in code.
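The abstract does not specify how inter-participant consensus was quantified. As an illustration only, agreement across card-sort labels for a set of variable names could be measured with a chance-corrected statistic such as Fleiss' kappa; the variable names and ratings below are hypothetical, not data from the study.

```python
from collections import Counter

def fleiss_kappa(ratings):
    """Fleiss' kappa for a list of items, where each item is a list of
    category labels, one per rater (all items rated by the same number
    of raters)."""
    n_raters = len(ratings[0])
    categories = {c for item in ratings for c in item}
    # Accumulate per-item agreement and overall category frequencies.
    category_totals = Counter()
    mean_agreement = 0.0
    for item in ratings:
        counts = Counter(item)
        category_totals.update(counts)
        # Proportion of rater pairs that agree on this item.
        mean_agreement += (sum(n * n for n in counts.values()) - n_raters) / (
            n_raters * (n_raters - 1))
    mean_agreement /= len(ratings)
    total_labels = len(ratings) * n_raters
    # Expected agreement by chance, from marginal category proportions.
    chance_agreement = sum(
        (category_totals[c] / total_labels) ** 2 for c in categories)
    return (mean_agreement - chance_agreement) / (1 - chance_agreement)

# Hypothetical card-sort data: each row is one variable name,
# each column one participant's sensitivity label.
ratings = [
    ["sensitive", "sensitive", "sensitive", "sensitive"],  # e.g. ssn
    ["sensitive", "sensitive", "not", "sensitive"],        # e.g. email
    ["not", "not", "sensitive", "not"],                    # e.g. zip_code
    ["not", "not", "not", "not"],                          # e.g. loop_index
]
print(round(fleiss_kappa(ratings), 3))  # → 0.5
```

A kappa near 1 would indicate strong consensus; the study's finding that consensus drops with expertise would correspond to lower kappa values within the high-expertise subgroup.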
