Bernard Nongpoh (Université Paris Saclay), Marwan Nour (Université Paris Saclay), Michaël Marcozzi (Université Paris Saclay), Sébastien Bardin (Université Paris Saclay)

Fuzzing is an effective software testing method that discovers bugs by feeding target applications with (usually a massive amount of) automatically generated inputs. Many state-of-the-art fuzzers use branch coverage as a feedback metric to guide the fuzzing process: the fuzzer retains inputs for further mutation only if they increase branch coverage. However, branch coverage provides only a shallow sampling of program behaviours and may therefore discard inputs that would be interesting to mutate. This work aims to take advantage of the large body of research on defining finer-grained code coverage metrics (such as mutation coverage) and to use these metrics as better proxies for selecting interesting inputs to mutate. We propose to make coverage-based fuzzers support most fine-grained coverage metrics out of the box (i.e., without changing fuzzer internals). We achieve this by making the test objectives defined by these metrics (such as mutants to kill) explicit as new branches in the target program. Fuzzing such a modified target is then equivalent to fuzzing the original target, except that the fuzzer will also retain for mutation the inputs covering the additional metric objectives. We propose a preliminary evaluation of this novel idea using two state-of-the-art fuzzers, namely AFL++ (3.14c) and QSYM with AFL (2.52b), on the four standard LAVA-M benchmarks. Significantly positive results are obtained on one benchmark and marginally negative ones on the three others. We discuss directions towards a strong and complete evaluation of the proposed approach and call for early feedback from the fuzzing community.
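To illustrate the core idea, the following minimal sketch (not from the paper; the function names and the concrete mutant are illustrative assumptions) shows how a weak-mutation objective can be reified as an extra branch in a C target. Covering the new branch requires an input on which the mutant would diverge from the original code, so an unmodified branch-coverage-guided fuzzer such as AFL++ will keep that input for further mutation.

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical target function. The weak-mutation objective
 * "kill the mutant that replaces '+' with '-'" is made explicit
 * as an added branch: its 'then' side is reachable only for inputs
 * where the mutant diverges from the original (here, b != 0).
 * The original computation is left untouched. */
int compute(int a, int b) {
    if ((a + b) != (a - b)) {
        /* empty body: this branch exists only as a coverage objective */
    }
    return a + b;   /* original, unmutated computation */
}

int main(int argc, char **argv) {
    int a = (argc > 1) ? atoi(argv[1]) : 0;
    int b = (argc > 2) ? atoi(argv[2]) : 0;
    printf("%d\n", compute(a, b));
    return 0;
}
```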
