Soroush Karami (University of Illinois at Chicago), Panagiotis Ilia (University of Illinois at Chicago), Konstantinos Solomos (University of Illinois at Chicago), Jason Polakis (University of Illinois at Chicago)

With users becoming increasingly privacy-aware and browser vendors incorporating anti-tracking mechanisms, browser fingerprinting has garnered significant attention. Accordingly, prior work has proposed techniques for identifying browser extensions and using them as part of a device's fingerprint. While previous studies have demonstrated how extensions can be detected through their web accessible resources, there exists a significant gap regarding techniques that indirectly detect extensions through behavioral artifacts. In fact, no prior study has demonstrated that this can be done in an automated fashion. In this paper, we bridge this gap by presenting the first fully automated creation and detection of behavior-based extension fingerprints. We also introduce two novel fingerprinting techniques that monitor extensions' communication patterns, namely outgoing HTTP requests and intra-browser message exchanges. These techniques comprise the core of Carnus, a modular system for the static and dynamic analysis of extensions, which we use to create the largest set of extension fingerprints to date. We leverage our dataset of 29,428 detectable extensions to conduct a comprehensive investigation of extension fingerprinting in realistic settings and demonstrate the practicality of our attack. Our experimental evaluation against a state-of-the-art countermeasure confirms the robustness of our techniques, as 87.92% of our behavior-based fingerprints remain effective.
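To make the detection channels described above concrete, the sketch below (not taken from Carnus; the extension ID, resource path, and helper names are hypothetical placeholders) illustrates how a web page could probe for a web accessible resource and record intra-browser postMessage traffic for later behavioral matching.

```typescript
// Illustrative sketch only: the extension ID, resource path, and helper names
// are hypothetical placeholders, not artifacts from Carnus or the paper.

// (1) WAR-based detection: if an extension exposes a web accessible resource,
// a page can try to load it from its chrome-extension:// URL and observe
// whether the load succeeds.
function probeWAR(extensionId: string, resourcePath: string): Promise<boolean> {
  return new Promise((resolve) => {
    const img = new Image();
    img.onload = () => resolve(true);   // resource loaded => extension installed
    img.onerror = () => resolve(false); // missing or blocked => not detected
    img.src = `chrome-extension://${extensionId}/${resourcePath}`;
  });
}

// (2) Behavior-based signal: many extensions exchange messages with the pages
// they run on, so a page can record window.postMessage traffic and later match
// it against known per-extension message patterns.
const observedMessages: string[] = [];
window.addEventListener("message", (event: MessageEvent) => {
  observedMessages.push(JSON.stringify(event.data));
});

// Hypothetical usage:
probeWAR("abcdefghijklmnopabcdefghijklmnop", "images/icon.png")
  .then((installed) => console.log("extension detected via WAR:", installed));
```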

Subsequently, we aim to explore the true extent of the privacy threat that extension fingerprinting poses to users, and present a novel study on the feasibility of inference attacks that reveal private and sensitive user information based on the functionality and nature of their extensions. We first collect over 1.44 million public user reviews of our detectable extensions, which provide a unique macroscopic view of the browser extension ecosystem and enable a more precise evaluation of the discriminatory power of extensions as well as a new deanonymization vector. We also automatically categorize extensions based on the developers' descriptions and identify those that can lead to the inference of personal data (religion, medical issues, etc.). Overall, our research sheds light on previously unexplored dimensions of the privacy threats of extension fingerprinting and highlights the need for more effective countermeasures that can prevent our attacks.
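As a rough illustration of the description-based categorization step, an offline script could flag extensions whose store descriptions hint at sensitive user attributes. The category names, keyword lists, and categorize() helper below are assumptions for illustration, not the paper's actual taxonomy or classifier.

```typescript
// Minimal sketch of keyword-based categorization of extension descriptions.
// Category names and keyword lists are illustrative assumptions.
const sensitiveCategories: Record<string, string[]> = {
  religion: ["bible", "quran", "prayer", "church", "mosque"],
  medical: ["diabetes", "blood pressure", "medication", "symptom"],
};

function categorize(description: string): string[] {
  const text = description.toLowerCase();
  return Object.entries(sensitiveCategories)
    .filter(([, keywords]) => keywords.some((k) => text.includes(k)))
    .map(([category]) => category);
}

// Example: an extension whose store description mentions prayer times would be
// flagged as potentially revealing religious affiliation.
console.log(categorize("Displays daily prayer times and nearby mosques."));
// -> ["religion"]
```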
