Kaiming Huang (Penn State University), Yongzhe Huang (Penn State University), Mathias Payer (EPFL), Zhiyun Qian (UC Riverside), Jack Sampson (Penn State University), Gang Tan (Penn State University), Trent Jaeger (Penn State University)

Despite vast research on defenses to protect stack objects from the exploitation of memory errors, much stack data remains at risk. Historically, stack defenses have focused on protecting code pointers, such as return addresses, but emerging techniques for exploiting memory errors motivate the need for practical solutions that protect stack data objects as well. However, recent approaches provide an incomplete view of security by failing to account for memory errors comprehensively and by unnecessarily limiting the set of objects that can be protected. In this paper, we present the DataGuard system, which statically identifies which stack objects are safe from spatial, type, and temporal memory errors so that those objects can be protected efficiently. DataGuard improves security through a more comprehensive and accurate safety analysis that proves a larger number of stack objects safe from memory errors, while ensuring that no unsafe stack object is mistakenly classified as safe. DataGuard's analysis of server programs and the SPEC CPU2006 benchmark suite shows that DataGuard improves security by: (1) ensuring that no memory safety violations are possible for any stack objects classified as safe, removing 6.3% of the stack objects previously classified safe by the Safe Stack method, and (2) blocking exploitation of all 118 stack vulnerabilities in the CGC Binaries. DataGuard extends the scope of stack protection by validating as safe over 70% of the stack objects classified as unsafe by the Safe Stack method, so that an average of 91.45% of all stack objects can only be referenced safely. By identifying more functions with only safe stack objects, DataGuard reduces the overhead of using Clang's Safe Stack defense to protect the SPEC CPU2006 benchmarks from 11.3% to 4.3%. Thus, DataGuard shows that a comprehensive and accurate analysis can both increase the scope of stack data protection and reduce overheads.
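
To make the safe/unsafe distinction concrete, the following minimal C sketch (our illustration, not code from the paper) shows the kinds of stack objects such a safety analysis must classify. The function and variable names are hypothetical; the spatial and temporal cases are shown, and type errors (e.g., unsafe casts) are handled analogously.

```c
/* Illustrative sketch only: example stack objects a static safety
 * analysis must classify as safe or unsafe. */
#include <stdio.h>
#include <string.h>

static int *leaked;              /* global used to create a temporal risk */

void example(const char *input, int idx) {
    int counter = 0;             /* safe: accessed only directly, never
                                    through a pointer, so no memory error
                                    can reach it                          */

    char buf[16];                /* unsafe (spatial): copied into without
                                    a bound check, so an overflow can
                                    corrupt neighboring stack data        */
    strcpy(buf, input);

    int scores[8];               /* unsafe (spatial): idx is not proven to
                                    lie within [0, 8)                     */
    scores[idx] = 1;

    int local = 42;              /* unsafe (temporal): its address escapes
                                    to a global and may be dereferenced
                                    after this frame is gone              */
    leaked = &local;

    printf("%d %s %d %d\n", counter, buf, scores[0], local);
}

int main(void) {
    example("hello", 3);
    return 0;
}
```

In this sketch, only counter could be left on the regular stack; the other objects would need protection (or relocation) because their safety cannot be proven.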
