Enhancing Data Transparency
on ChatGPT

discipline

User Research, UX Design, Interaction Design, Usability Testing, Prototyping

team

Individual project

timeline

8 weeks

tools

Figma, Illustrator


goal

In response to data privacy concerns after ChatGPT’s May 2023 breach, my goal was to improve data transparency on the platform by ensuring users have full control over their data and understand how it is used.

constraints

The design needed to be simple and intuitive, catering to both tech-savvy users and those less familiar with data privacy concerns, or digital platforms in general.

Before & After

The original ChatGPT platform presented its privacy policy deep within the settings, which led to confusion and uncertainty among users regarding how their data was handled. User feedback revealed a clear need for improved transparency and easier control over personal data.

In response, I redesigned the registration flow to include:

  • Educating users on how ChatGPT handles their data.
  • Giving them control over their security and privacy.
  • A user-friendly design with clear, simplified language that anyone can understand.
Original app

👎 ChatGPT's privacy policy is accessible only on the website, not within the chat interface.

👎 Limited communication about policy updates, with no pop-up notifications within the app.

👎 Limited or no options for users to control data collection and usage preferences within the chat interface.

redesign app

✅ Simplified, user-friendly access to the privacy policy during onboarding.

✅ Pop-up windows explaining how data is collected and used.

✅ Selectable checkboxes that let users manage their data privacy.

Problems & Solutions

One of the biggest challenges was balancing user empowerment with ease of use. Many users weren’t aware of how their data was being processed, and complex data policies discouraged them from engaging with privacy settings. Additionally, hiding privacy options within the settings made it difficult for users to find and adjust their data preferences.

Solutions:
  • Pop-up windows during registration to clearly explain data usage, eliminating confusion and fostering trust.
  • Data control toggles that allow users to easily opt in or out of data sharing.
  • Accessible privacy policy, which is displayed upfront during sign-up, so users don’t have to search for it later.
These changes significantly improved the user experience by making data control intuitive and transparent.

Exploration

My design process followed the Double Diamond framework.

During the Discover and Define phase, I conducted a literature review, competitor analysis (Google, Microsoft, Facebook), and user interviews, revealing that most users lacked awareness of data processing and wanted more control. Using these insights, I created personas representing various age groups.

Problem Statement:‍
“How might we enhance data transparency on ChatGPT to ensure users understand and control their data, thereby building trust in the platform?”

I developed low-fidelity wireframes that featured pop-up notifications and opt-in toggles.

Initial user feedback indicated that while the wireframes were a good start, more emphasis was needed on simplifying the language and increasing the visibility of privacy settings.
my initial design

❌ No back button
❌ Complicated and long text
❌ Overly compact layout
❌ Low visual hierarchy

final design

✅ Added back button
✅ Simplified text
✅ Hover effect makes it more interactive
✅ Improved visual hierarchy

my initial design

❌ Complicated and long text
❌ Overly compact layout
❌ Low visual hierarchy

final design

✅ Added back button
✅ Simplified text and wording
✅ Interactive hover effect
✅ Improved visual hierarchy
✅ Progress bar

Final Design

The final design incorporates:

  1. Pop-up windows that inform users about how their data is handled during registration.
  2. Data control toggles that allow users to easily opt in or out of data sharing.
  3. A simplified privacy policy, accessible during onboarding, that uses straightforward language and is broken into easily digestible sections.
These features ensure that users are fully informed and in control of their data from the moment they start using the platform.

I built an interactive prototype in Figma and tested it with participants who regularly use ChatGPT. The testing focused on how easily users could understand and control their data with the new features.

Key Results:
  • 85% of participants felt they better understood how ChatGPT handles their data.
  • 90% of users felt more in control of their privacy, thanks to the new toggles.
  • 95% found the privacy policy easier to access, reducing confusion and boosting confidence in the platform.
View prototype
Showing users how the platform handles their data
Users can protect their data through opt-in options
Users can read the corporate privacy policy and agree to it

Reflection

This project taught me the importance of building trust through transparency. By simplifying the way users engage with privacy settings and making data-sharing options more accessible, I was able to create a more user-friendly and trustworthy platform. The positive feedback from users validated that simple, clear design can make a significant impact on user satisfaction and data control.

In the future, I plan to:
  • Expand user testing to include a broader demographic, ensuring the design works for users of all backgrounds and tech familiarity.
  • Add more interactive privacy settings, such as sliders for data sharing preferences and real-time feedback on changes.
  • Create educational content to help users better understand privacy policies, with FAQs and interactive guides available directly on the platform.