KareBare

Your Digital Social Media Companion

Application Design. UI/UX. User Research. Usability Testing

Role

Product Designer


Social media plays a pivotal role in the lives of young teenagers by offering avenues for virtual connections with friends, personal expression & education, and entertainment. However, alongside its benefits come significant challenges: exposure to bad actors, screen addiction, & cyberbullying, all of which can have profound negative impacts on mental health and personal well-being. 


While previous solutions have attempted to address these issues through education and technological interventions, they often fall short in providing comprehensive support and guidance for young users. Enter KareBare: a virtual A.I. chatbot designed to revolutionize the way young users navigate social media.


By offering personalized chat interactions, screen time alerts, and resources for dealing with online risks, KareBare empowers users to engage safely online by fostering a healthier & more positive digital experience. Our design aims to help young users maintain a healthy relationship with social media.

Team

2 Researchers, 3 Designers,

2 Research Consultants



Timeline

4 months



Skills & Tools

Figma, Zoom, Notion, Visual Design, User Research, Prototyping



PROJECT OVERVIEW

The Problem Space

Our four major user groups each face unique issues:

Parents: Awareness of Online Risks

Parents struggle to teach their kids how to be safe online due to differences in perspectives on social media. Educational programs & schools fall behind on adapting to new social media platforms.

Social Media: Limited Moderation on User Content

Social media platforms have no incentive to enforce harsher content moderation, hold bad actors accountable for their posts, or rein in predatory algorithms. Anonymity & privacy concerns enable bad actors on platforms to continue creating hostile environments.

Government: Regulating Social Media Platforms

There are major limitations on legal intervention from the U.S. government to protect teens on social media. Platforms argue that content moderation is protected under Section 230: social media platforms are not liable as publishers or speakers for what their users post.

Teenagers: Difficulty Coping With Online Harms

Teens are highly susceptible to harmful content on social media (e.g. graphic content, harassment, misinformation, exposure to bad actors). Young users are ill-equipped to process trauma from these negative experiences and are forced to cope on their own terms.

User Research

To understand our problem space, we employed three research methods to identify our primary stakeholders:

Literature Review: 14 Peer-Reviewed Articles

Focus on major topics related to social media policies on minors as users, content moderation, and harms.

Survey Study: Responses from Young Adults

Quantitative research of Cornell University undergraduate students' early exposure to social media.

Semi-Structured Interviews: Multiple Student & Parent Interviews

Qualitative research of relevant groups for varied experiences with social media.

We conducted a literature review to identify the primary stakeholders in minor safety on social media platforms. After thorough analysis, we defined our problem space, which included four major user groups, each facing unique issues and needs to consider in understanding the user problem.


After finishing our literature review, our team moved forward with user research to gather both quantitative and qualitative data. We began with a survey conducted with the Cornell University undergraduate population.


We reached out to multiple student campus organizations, primarily through email and Slack messages, to share our survey with members interested in our research goal. We aimed to reach a broad range of students with diverse backgrounds to reflect on their early experiences with social media. In the end, we collected 25 survey responses, which provided insights into how young adults interacted with social media at an early age.





Semi-Structured Interviews

At the conclusion of our surveys, we provided inquiry forms to recruit participants interested in qualitative interviews. We screened for interviews with both undergraduate participants and parents.

We conducted 12 semi-structured interviews with 6 parents & 6 students in order to understand multiple perspectives on social media usage among teenagers. Here are the major takeaways from both interviews:

Student Interviews

- Early social media use was greatly affected by peers' and parents' online behavior
- Viewed social media as the best way to connect with their friends outside of school
- Network effects heavily influenced which social media apps were the most popular
- Actively rebelled against parental intervention, viewing it as unreasonable or an invasion of privacy
- Experienced cognitive dissonance between actively participating in cyberbullying at a young age and its consequences
- Did not learn how to effectively deal with bad actors engaging in cyberbullying, harassment, grooming, etc.

Parent Interviews

- Acknowledge social media is important to their teen's social life
- See phones as an important means of immediate communication, especially for emergencies
- Actively used parental controls to moderate their children's screentime
- Believe they, rather than the government, should be actively supervising their kids' online behavior
- Felt an increased need for mature conversations with their kids growing up with popular social media apps
- Found it very difficult to initiate conversations about social media safety with their older kids
- Found that local law enforcement and youth services provide no help against anonymous bad actors
- Found that the absence of educational programs & workshops made it hard to rely on them as a resource to promote online safety

Opportunity Areas

Intervention For Helping Teens & Young Users Is Extremely Difficult

Current strategies are failing to keep up with the rapid integration of social media in early adolescence. Young users adopt social media very early and are attracted to the most popular apps, which must balance moderation across a much larger userbase, leaving young users vulnerable. Teens also struggle to talk to their parents/guardians about how to use social media and only bring up their negative experiences.

Young users need help learning social media literacy, because knowing how to navigate online interactions is crucial for their personal development, but they lack a proper support system to deal with online harms such as harassment, cyberbullying, and bad actors.

Solution Ideation

After gathering our major insights from our user research, we developed our solution space to determine compelling design ideas for our user problem. We centered our solution ideation on the How Might We framework, which was also crucial for keeping our potential solutions within the legal, ethical, and technological limitations of our problem space.

How Might We…

  1. Help educate young users to independently navigate online spaces?
  2. Help young users develop healthy coping strategies for viewing explicit/traumatic content?
  3. Provide immediate intervention to help young users during crisis moments (cyberbullying, grooming, hate speech, etc.)?

We continued in our solution space with rapid ideation for potential design solutions. Our team generated over 100 design ideas through rapid brainstorming, leveraging both technological & non-technological solutions, and considered possible integrations with our main stakeholders, such as educational PSAs, bootcamps, and report systems. Ideas included:

- Big Brother/Big Sister mentorship programs
- Active mediation: directly approaching children and teens about the serious risks in social media spaces
- Restrictive mediation: parents & guardians directly monitoring screentime and app activity
- Social media safe havens with resources for developing healthy coping mechanisms
- Content filtering to avoid explicit content
- Support groups for marginalized identities
- Unique browsers & apps exclusively for teens & young users
- Social media webinars & resource guides
- Encouraging gamified hobbies

After careful consideration, we decided to continue iterating on a digital solution focused on our user group: young users/minors between the ages of 10-16. We further developed our How Might We questions into actionable design goals in our solution space.

The Solution


Customizable Interactive Chat Bot

Access tailored support with a personalized A.I. chatbot, offering customizable features to enhance your digital journey.

Screentime Monitoring

Take control of your screen time with our intuitive screen monitoring tool, designed to combat the negative effects of phone addiction and reduce excessive online engagement.

Online Risks Awareness

Empower yourself with essential knowledge and skills to confidently navigate online risks, guided by our comprehensive educational resources and insights.

Design Goals

Encourage social media literacy to help young users properly navigate social media platforms


Improve young user self-image against social media expectations with coping strategies


Provide immediate aid to help young users deal with intense situations with bad actors


With these design goals in mind, we envisioned a friendly environment through a mobile app that addresses these user issues. Additionally, we wanted a multifaceted solution that can navigate the many social interactions that rapidly occur on social media platforms.



Our design must include:


1) Personalized screen time reminders encouraging a healthy balance while online, with positive reinforcement tailored to individual usage patterns

2) Real-time safety tips based on post activity: recognizing signs of cyberbullying, identifying misinformation, or better understanding privacy settings

3) A feature for parents to receive summarized reports on their child's social media interactions, ensuring transparency and facilitating open communication




Introducing KareBare:


A digital companion dedicated to empowering young users with the tools needed to responsibly navigate the online world. By combining screen time management features, reporting capabilities, and an accessible, friendly user interface, KareBare contributes to creating safe online experiences for teens:


Low Fidelity: Concept Sketches

Visual Analysis

Before moving onto our high-fidelity prototype, we conducted visual comparisons between chat windows for both popular social media apps and mental health apps.




We created a lo-fi prototype with whiteboard sketches, allowing quick and rapid iteration on each sketch. We decided upon the general user flow for navigating between the social media app and KareBare.

Our design consultant found the flow mostly easy to follow but recommended improvements to further clarify the different features and user flow.

Our Visual Design Inspo

Visual Competitive Analysis: Messenger

Visual Competitive Analysis: Headspace

Approachable

Cohesive

Calming

Accessible

Interactive

Busy

Interactive

Alerting

Emotive


Visual Competitive Analysis: Line

Our KareBare A.I. acts as an app extension for both iOS & Android devices. The login helps customize the chatbot to the user's preferences. This personalization helps the user naturally engage with KareBare.



Onboarding

KareBare on Home Screen

Introduction Screen

The onboarding process walks users through customizing the KareBare A.I. to their liking. In turn, KareBare caters its conversations to the user's interests rather than sending generic messages.



High Fidelity Prototype


Finally, we developed our high-fidelity prototype from our consultant feedback, visual design inspirations, and further evaluation of our design constraints. We built upon this prototype with further usability testing:


Screentime

KareBare sends personalized screen time reminders encouraging users to take breaks and maintain a healthy balance between online & offline activities. These reminders are tailored to individual usage patterns, start conversations based on past texts, & recommend hobbies or activities to create a positive feedback loop.

The screentime alert icons are gentle reminders, reflected in the quirky panda expressions. The notification color indicates how much the user has scrolled compared to the recommended amount.
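The color rule above can be sketched as a tiny function. This is a hypothetical illustration: the thresholds, function name, and default recommended amount are our own assumptions, not the shipped design.

```python
# Hypothetical sketch of the screentime alert rule: the notification color
# reflects how much the user has scrolled relative to a recommended amount.
# Thresholds and the 60-minute default are illustrative assumptions.

def alert_color(minutes_scrolled: float, recommended_minutes: float = 60) -> str:
    """Map usage relative to the recommended amount to a notification color."""
    if recommended_minutes <= 0:
        raise ValueError("recommended_minutes must be positive")
    ratio = minutes_scrolled / recommended_minutes
    if ratio < 0.75:
        return "green"   # well within a healthy balance
    elif ratio < 1.0:
        return "yellow"  # approaching the recommended limit
    else:
        return "red"     # over the limit: time for a break
```

For example, 30 minutes of scrolling against a 60-minute recommendation yields a green notification, while 90 minutes yields red.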



Chatting with Kare

Feeling insecure after scrolling through perfect pictures on Instagram? Talk with KareBare about how you feel. Using past conversations, KareBare develops empathetic responses and curates helpful information to support the user's mental health needs.



Approaching Bad Actors

Parent Log

Young users are under immense stress facing anonymous dangers across different social media platforms. Get immediate help when facing a bad actor with KareBare. Provide a link to the post and our chatbot will recommend the best strategies to deal with these encounters.



Our app incorporates a Parent Log feature, providing invaluable insights into children's online activities and well-being. By offering regular reports on screen time, social media interactions, and potential instances of harassment or cyberbullying, parents can stay informed and initiate crucial conversations with their children about responsible digital behavior.

This feature encourages open dialogue while enhancing child safety by empowering parents to take proactive measures.



Screentime Notification

Building Trust with Teens: Chat Window

Building Trust with Teens: Parent Intervention

We evaluated our design with the participation of our 2 consultants in A.I. research & machine learning and 4 freshman college students as end users. Most participants successfully navigated our prototype scenarios. Here are the major results of our user testing:

Based on feedback from both our evaluators and end users, the signifier for screen time notifications was too easy to miss. We changed the icon for the KareBare chatbot to make it more accessible.

Make KareBare more proactive when checking in with users: we incorrectly assumed the chatbot should only interact when the user engages with its features. By including opportunities for KareBare to initiate conversations, it can build a positive, more trustworthy relationship and intervene more effectively during crisis situations.

Based on feedback from our evaluators, parents should not be able to view the exact text record between their kids & KareBare. We addressed these concerns by having KareBare tell parents about conversation topics rather than provide full disclosure, ensuring user privacy.

KareBare felt very supportive when helping with bad actors

The answers & resources provided by KareBare felt authentic and practical

Customizing KareBare to user preferences is very easy

More proactive check-ins from KareBare needed -- conversations with the A.I. felt too stiff 

Screen time notifications from the app were very unclear

Direct messages between the teen and chatbot should be private from the parents to build trust.

Old Version

New Version

User Testing

TLDR:

Future Considerations

By instilling essential digital literacy skills & promoting responsible online behavior, KareBare empowers teenagers to navigate the complexities of social media with confidence. Whether it's combating bad actors, managing screen time, or fostering positive self-image, KareBare is their trusted companion.

Due to the nature of our design solution being centered on A.I. integration, several considerations must be made for both ethical and practical use of KareBare:

Legal Ramifications For A.I. Regulation

Current chatbot models such as ChatGPT have shown multiple cases of exacerbating negative user behaviors, including self-harm & suicidal thoughts. Our chatbot will need to be heavily regulated to guard KareBare against unintended consequences, with legal consequences to enforce these policies.

User Privacy

At minimum, user privacy should be maintained through encrypted messages and anonymous user IDs to prevent major security risks. Maintaining these restrictions will help create a healthy environment for our user group.

Need For Human Interaction

Our KareBare features for helping teens develop healthy coping strategies will need to be integrated with mental health professionals including qualified social workers, therapists, and counselors. Additionally, integration with mental health apps should be considered for a holistic user experience.