Lauren Biernacki

(she/her/hers)

University of Michigan

Computer Architecture, Security, Privacy

Lauren Biernacki is a Ph.D. candidate in Computer Science and Engineering (CSE) at the University of Michigan, advised by Professor Todd Austin. Her research is in computer architecture and security, with her dissertation work focusing on integrating integrity and confidentiality protections into modern processors. Lauren is a passionate educator and has received awards for her teaching and service efforts, including recognition for developing a first-year graduate course that helps students from marginalized backgrounds gain equal footing in the Michigan Ph.D. program. Lauren received her bachelor's degree in CSE from the University of Connecticut in 2017 and her master's degree in CSE from the University of Michigan in 2019, and she is a Michigan Rackham Merit Fellow.

Sequestered Encryption: A Hardware Technique for Comprehensive Data Privacy

Data breaches that penetrate web-facing servers and exfiltrate sensitive user data have become pervasive. Insulating these systems from attack is seemingly impossible due to the ubiquity of software vulnerabilities within cloud applications. Adequately addressing all such vulnerabilities is infeasible, and it is therefore imprudent to rely on software applications to protect user data. Rather, the ideal systems solution upholds data confidentiality even in the presence of vulnerable or compromised software. Homomorphic encryption (HE) provides these capabilities, but its limited expressiveness and significant runtime overheads have inhibited its adoption. In this work, we explore how trusted hardware can be leveraged to provide data confidentiality in the presence of vulnerable software while achieving practical performance overheads. We present Sequestered Encryption (SE), a hardware technique for data privacy that sequesters sensitive plaintext data within a small hardware root of trust and encrypts this data in all external microarchitectural structures, thereby rendering secret values inaccessible to software. With optimizations, SE achieves performance slowdowns of less than 1.3x (geometric mean) compared to native execution, demonstrating that architectural approaches can emerge as data privacy solutions that place zero trust in software while being dynamic, expressive, and performant.
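To make the SE idea concrete, the sketch below models it in software: application code only ever holds ciphertext, while a small root of trust keeps the key and computes on decrypted values internally, so plaintext never appears in software-visible state. This is a minimal illustrative toy, not the actual SE design (which is an ISA and microarchitecture extension with a hardware cipher); the names RootOfTrust, enc_add, and reveal_for_demo are hypothetical, and the PRF-pad cipher stands in for a real hardware block cipher.

```python
import hashlib
import hmac
import os
import struct


class RootOfTrust:
    """Toy stand-in for the SE hardware root of trust; the key never leaves it."""

    def __init__(self):
        self._key = os.urandom(32)  # device key, invisible to software

    def encrypt(self, value: int) -> bytes:
        """Randomized encryption of a 64-bit signed integer (toy PRF pad)."""
        nonce = os.urandom(16)
        pad = hmac.new(self._key, nonce, hashlib.sha256).digest()[:8]
        ct = bytes(a ^ b for a, b in zip(struct.pack(">q", value), pad))
        return nonce + ct  # software only ever holds this opaque blob

    def _decrypt(self, blob: bytes) -> int:
        """Internal only: plaintext exists solely inside the root of trust."""
        nonce, ct = blob[:16], blob[16:]
        pad = hmac.new(self._key, nonce, hashlib.sha256).digest()[:8]
        return struct.unpack(">q", bytes(a ^ b for a, b in zip(ct, pad)))[0]

    def enc_add(self, a: bytes, b: bytes) -> bytes:
        """'Sequestered' ALU op: decrypt, add, re-encrypt, all internally."""
        return self.encrypt(self._decrypt(a) + self._decrypt(b))

    def reveal_for_demo(self, blob: bytes) -> int:
        """For demonstration only; a real design restricts plaintext egress."""
        return self._decrypt(blob)


rot = RootOfTrust()
x = rot.encrypt(41)
y = rot.encrypt(1)
z = rot.enc_add(x, y)  # software sees only ciphertext throughout
assert rot.reveal_for_demo(z) == 42
```

Randomizing each encryption with a fresh nonce keeps identical plaintexts from producing identical ciphertexts, which matters when those ciphertexts populate software-visible structures such as registers and memory.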