RECENT NEWS


June 2023: A significant percentage of iOS mobile apps seem to have privacy labels that are inconsistent with their privacy policies

We developed classifiers to predict privacy labels based on the text of privacy policies. Our study suggests that our technique achieves an F1 score of over 98%. We then analyzed a little over 350,000 apps in the iOS App Store. Results suggest that discrepancies are quite common: in particular, we find that nearly 30% of mobile apps seem to have privacy labels that disclose practices not disclosed in their privacy policies. In our view, while it is easy to blame app developers and publishers, app stores, as the more sophisticated and powerful actors, also bear some responsibility and should do a better job helping app publishers create more accurate privacy labels and policies. This includes, in particular, setting stricter standards for third-party libraries, which, based on our earlier research, are often the source of inaccuracies.

Akshath Jain, David Rodriguez, Jose del Alamo, and Norman Sadeh, "ATLAS: Automatically Detecting Discrepancies between Privacy Policies and Privacy Labels", 2023 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW), Amsterdam, June 2023.
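As an illustration of the general approach (this is not the actual ATLAS pipeline), predicting whether a policy discloses a given practice can be framed as text classification over policy excerpts. The sketch below uses off-the-shelf scikit-learn components; the category, training snippets, and labels are all made up for illustration.

```python
# Illustrative sketch: classify whether a privacy policy excerpt discloses
# location collection. Training examples and labels are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

excerpts = [
    "We collect your precise location to provide local results.",
    "Your GPS position may be shared with advertising partners.",
    "We do not collect any information about your whereabouts.",
    "This app stores your email address for account recovery.",
]
labels = [1, 1, 0, 0]  # 1 = discloses location collection

# TF-IDF features (unigrams + bigrams) feeding a logistic regression.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(excerpts, labels)

pred = clf.predict(["The app accesses your location for nearby offers."])[0]
```

In a real system, one such classifier per label category can then be run over a large corpus of policies, and its predictions compared against the label the app actually published to flag likely discrepancies.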


March 2023: Aerin (Shikun) Zhang defends her dissertation. Congrats, Aerin!

Aerin Zhang, "Understanding People's Diverse Privacy Attitudes: Notification, Control and Regulatory Implications", Language Technologies Institute PhD Program, School of Computer Science, Carnegie Mellon University. March 17, 2023.


February 2023: Do Privacy Labels Answer People's Privacy Questions?

Back in 2013, we reported on research showing how privacy labels could help app store users make better informed decisions. Our paper was published at CHI. In December 2020, Apple introduced privacy labels in its app store, crediting our earlier research for influencing its decision. Six months later, the Google Play Store followed with its own privacy (or "safety") labels. A year ago, we published a study at PoPETS showing that, unfortunately, iOS privacy labels in their current form fall short. In a new paper published at USEC this month, we estimate to what extent iOS privacy labels (in their current form) answer the privacy questions people actually care about. Our study suggests that the answer might be "less than 50% of the time." Follow the link below to read our USEC 2023 paper, "Do Privacy Labels Answer People's Privacy Questions?"


October 2022: Keynote at ACM CIKM Workshop on Privacy Algorithms in Systems

Just gave a keynote at the 1st International Workshop on "Privacy Algorithms in Systems." My talk focused on "Privacy in the Age of AI and the Internet of Things".


July 2022: iOS Privacy Labels Miss the Mark - Paper Presentation at PoPETS'2022

Here's a CyLab press release summarizing the results of a study conducted with my PhD student, Aerin Zhang, and with Yuanyuan Feng, Yaxing Yao, and Lorrie Cranor on the usability of iOS privacy labels in their current form.

This research was presented at PoPETS 2022 earlier this month.

The full article is available here.

This research is taking place under the umbrella of our Usable Privacy Policy Project.


June 2022: Aerin Zhang presents our work on the acceptance of COVID-19 vaccination mandates and certificates at ACM's FAccT2022 Conference

Stop the Spread: A Contextual Integrity Perspective on the Appropriateness of COVID-19 Vaccination Certificates

Shikun Zhang, Yan Shvartzshnaider, Yuanyuan Feng, Helen Nissenbaum, and Norman Sadeh

We present an empirical study exploring how privacy influences the acceptance of vaccination certificate (VC) deployments across different realistic usage scenarios. The study employed the privacy framework of Contextual Integrity, which has been shown to be particularly effective at capturing people's privacy expectations across different contexts. We use a vignette methodology, where we selectively manipulate salient contextual parameters understood to potentially impact people's attitudes towards VCs. We surveyed 890 participants from a demographically stratified sample of the US population to gauge the acceptance of, and overall attitudes towards, possible VC deployments to enforce vaccination mandates and the different information flows VCs might entail. Analysis of results collected as part of this study is used to derive general normative observations about different possible VC practices and to provide guidance for possible VC deployments in different contexts.

Here's a video of Aerin's presentation


June 2022: Our Research in the Context of "Livehoods" Honored with ICWSM 2022 Test of Time Award

Honored to see our work on Livehoods, mining public social media data to understand the dynamics of cities, selected for the test of time award at AAAI's 16th International Conference on Web and Social Media.

Here's CMU's School of Computer Science press release.


June 2022: Our work on a tool to help iOS app developers create more accurate privacy labels presented at IWPE'2022 and PEPR'2022

Helping Mobile App Developers Create Accurate Privacy Labels, by Jack Gardner, Akshath Jain, Yuanyuan Feng, Kayla Reiman, Zhi Lin and Norman Sadeh. In this work we discuss the design and evaluation of a tool to help iOS developers generate privacy labels. The tool combines static code analysis to identify likely data collection and use practices with interactive functionality designed to prompt developers to elucidate analysis results and carefully reflect on their applications’ data practices. We conducted semi-structured interviews with iOS developers as they used an initial version of the tool. We discuss how these results motivated us to develop an enhanced software tool, Privacy Label Wiz, that more closely resembles the interactions developers reported to be most useful in our semi-structured interviews. We present findings from our interviews and the enhanced tool motivated by our study. We also outline future directions for software tools to better assist developers in communicating their mobile app’s data practices to different audiences.
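To give a flavor of the static-analysis idea behind such a tool (this is a toy sketch, not Privacy Label Wiz itself), one can scan source code for calls to data-collection APIs and map them to candidate label categories. The API-to-category mapping below is illustrative only.

```python
import re

# Toy mapping from API usage patterns to privacy-label categories.
# The patterns name real iOS frameworks, but the mapping is illustrative.
API_SIGNALS = {
    r"CLLocationManager": "Location",
    r"ASIdentifierManager": "Identifiers",
    r"AVCaptureDevice": "Audio/Video",
}

def suggest_label_categories(source: str) -> set:
    """Return privacy-label categories suggested by APIs found in `source`."""
    return {category for pattern, category in API_SIGNALS.items()
            if re.search(pattern, source)}

snippet = (
    "let manager = CLLocationManager()\n"
    "manager.requestWhenInUseAuthorization()"
)
categories = suggest_label_categories(snippet)
# categories == {"Location"}
```

A real tool would of course go well beyond pattern matching (e.g., resolving third-party library calls and asking the developer to confirm or refine each inferred practice), but the output of a pass like this is what the interactive prompts can then build on.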


May 2022: Keynote at Cyburgh 2022

Gave a keynote at Cyburgh 2022 on "Privacy as a New Tech Sector".


April 2022: Justin Cranshaw defends his dissertation. Congrats, Justin!

Justin Cranshaw, "Depicting Places in Information Systems: Closing the Gap Between Representation and Experience", PhD Dissertation, School of Computer Science technical report CMU-ISR-22-106, May 2022.


April 2022: Panelist, Privacy Symposium

Panelist at Privacy Symposium in Venice.

My talk focused on "Making Privacy Humanly Tractable".


March 11, 2022: Recent interview on Midwest Moxie (NPR) with Pulitzer Prize-winning journalist, Kathleen Gallagher

Interviewed by Pulitzer Prize winner Kathleen Gallagher as she launches her "Midwest Moxie" podcast, which features interviews with entrepreneurs.


February 2022: Keynote at ICISSP 2022 (International Conference on Information Systems Security and Privacy)

My ICISSP keynote focused on "Why Usability Has Become Privacy's Biggest Challenge and What We Can Do About It".

See below for a copy of my slides.


January 2022: Honored to have our paper selected for FPF "Privacy Papers for Policymakers" Award

Our paper about people’s perceptions of advanced video analytics has been selected to receive the prestigious Future of Privacy Forum’s annual Privacy Papers for Policymakers Award.

The paper, titled “‘Did you know this camera tracks your mood?’: Understanding Privacy Expectations and Preferences in the Age of Video Analytics,” was originally published and presented at the 2021 Privacy Enhancing Technologies Symposium.

Click on the link below for the CyLab/S3D press release.


December 2021: Keynote at BIGS 2021 on "Security and Privacy: Reconciling the Strengths and Limitations of Human and Artificial Intelligence"

Reflections on the limitations and strengths of human and artificial intelligence and how our work in user-oriented security and privacy aims to develop practical solutions where both forms of intelligence are deployed to best complement one another. The presentation builds on our anti-phishing work, including work at Wombat Security Technologies, as well as research at CMU in the context of the Usable Privacy Policy Project and the Personalized Privacy Assistant Project.


October 2021: Grateful to receive a "Google Privacy-Related Faculty Research Award" for our work on Mobile App Privacy Nutrition Labels

Following our 2013 paper proposing the adoption of mobile app privacy labels, both Apple (Fall 2020) and Google (Spring 2021) have now announced the introduction of such labels in their app stores - see our 2013 CHI article with Patrick Gage Kelley and Lorrie Cranor here.

It appears, however, that in the form in which they have been introduced, these labels are not fully delivering on their promise, with anecdotal evidence suggesting that both developers and end-users struggle to fully understand what these labels mean and how to use them. As part of this project, we will conduct an in-depth study of the usability of current iOS mobile app privacy labels (joint work with my PhD student, Aerin Zhang, and with Lorrie Cranor) and will also develop tools to help mobile app developers create more accurate privacy labels.


October 2021: First cohort of students complete our Privacy Engineering Certificate Program

In response to increased demand from industry, we launched a new certificate program in privacy engineering this past fall semester. The program is remote but involves live lectures and interactions with our faculty, as well as both individual and group exercises. It is delivered over four consecutive weekends, and we will be running four cohorts per year.

A survey of the students who completed the program indicates they all felt they learned a ton and really enjoyed the live interactions and group exercises.


September 2021: Daniel Smullen defends his dissertation. Congrats, Daniel!

D. Smullen, "Informing the Design and Refinement of Privacy and Security Controls", Ph.D. Thesis, Software Engineering PhD Program, Technical Report CMU-ISR-21-111, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA. September 2021.


August 2021: Breaking Down Walls of Text: How Can NLP Benefit Consumer Privacy? (ACL/IJCNLP presentation)

Abhilasha Ravichander presents our research at the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL/IJCNLP). Here's the abstract:

Privacy plays a crucial role in preserving democratic ideals and personal autonomy. The dominant legal approach to privacy in many jurisdictions is the “Notice and Choice” paradigm, where privacy policies are the primary instrument used to convey information to users. However, privacy policies are long and complex documents that are difficult for users to read and comprehend. We discuss how language technologies can play an important role in addressing this information gap, reporting on initial progress towards helping three specific categories of stakeholders take advantage of digital privacy policies: consumers, enterprises, and regulators. Our goal is to provide a roadmap for the development and use of language technologies to empower users to reclaim control over their privacy, limit privacy harms, and rally research efforts from the community towards addressing an issue with large social impact. We highlight many remaining opportunities to develop language technologies that are more precise or nuanced in the way in which they use the text of privacy policies.


August 2021: A few quotes in recent Pittsburgh Post Gazette article on privacy in the Internet of Things

A few quotes related to our work on a privacy infrastructure for the Internet of Things. Our infrastructure is now hosting descriptions of about 150,000 IoT data collection systems and devices.


July 2021: Peter Story defends his dissertation. Congrats, Peter!

P. Story, "Design and Evaluation of Security and Privacy Nudges: From Protection Motivation Theory to Implementation Intentions", CMU-ISR-21-107, School of Computer Science, Carnegie Mellon University. August 2021.


July 2021: Managing Potentially Intrusive Practices in the Browser

Presentation of our research by my PhD student, Daniel Smullen, at PoPETS2021:

"Browser users encounter a broad array of potentially intrusive practices: from behavioral profiling, to crypto-mining, fingerprinting, and more. We study people’s perception, awareness, understanding, and preferences to opt out of those practices..."


July 2021: Misconceptions plague security and privacy tools: CyLab article on research conducted with my PhD student, Peter Story

People hold a myriad of misconceptions about the tools meant to help them protect their privacy and security. Here is a recent CyLab article on research that was presented by Peter Story at this week’s Privacy Enhancing Technologies Symposium.

Peter Story, Daniel Smullen, Yaxing Yao, Alessandro Acquisti, Lorrie Faith Cranor, Norman Sadeh, and Florian Schaub, Awareness, Adoption, and Misconceptions of Web Privacy Tools. Proceedings on Privacy Enhancing Technologies Symposium (PoPETS 2021), 3, Jul 2021


July 2021: Did you know this camera tracks your mood?

S Zhang, Y Feng, L Bauer, LF Cranor, A Das, and N Sadeh, “Did you know this camera tracks your mood?”: Understanding Privacy Expectations and Preferences in the Age of Video Analytics, Proceedings on Privacy Enhancing Technologies, 2, 1, Apr 2021.

Cameras are everywhere, and are increasingly coupled with video analytics software that can identify our face, track our mood, recognize what we are doing, and more. We present the results of a 10-day in-situ study designed to understand how people feel about these capabilities, looking both at the extent to which they expect to encounter them as part of their everyday activities and at how comfortable they are with the presence of such technologies across a range of realistic scenarios. Results indicate that while some widespread deployments are expected by many (e.g., surveillance in public spaces), others are not, with some making people feel particularly uncomfortable. Our results further show that individuals’ privacy preferences and expectations are complicated and vary with a number of factors such as the purpose for which footage is captured and analyzed, the particular venue where it is captured, and whom it is shared with. Finally, we discuss the implications of people’s rich and diverse preferences on opt-in or opt-out rights for the collection and use (including sharing) of data associated with these video analytics scenarios as mandated by regulations. Because of the user burden associated with the large number of privacy decisions people could be faced with, we discuss how new types of privacy assistants could possibly be configured to help people manage these decisions.


May 2021: CyLab press release on our study of the "Design Space for Privacy Choices", which was presented at CHI'2021

Y Feng, Y Yao, N Sadeh, "A Design Space for Privacy Choices: Towards Meaningful Privacy Control in the Internet of Things." Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, May 2021.

“Notice and choice” is the predominant approach for data privacy protection today. There is considerable user-centered research on providing effective privacy notices but not enough guidance on designing privacy choices. Recent data privacy regulations worldwide established new requirements for privacy choices, but system practitioners struggle to implement legally compliant privacy choices that also provide users meaningful privacy control. We construct a design space for privacy choices based on a user-centered analysis of how people exercise privacy choices in real-world systems. This work contributes a conceptual framework that considers privacy choice as a user-centered process as well as a taxonomy for practitioners to design meaningful privacy choices in their systems. We also present a use case of how we leverage the design space to finalize the design decisions for a real-world privacy choice platform, the Internet of Things (IoT) Assistant, to provide meaningful privacy control in the IoT.
