Case Study

Balancing News Bias on Facebook

A system that allows Facebook users to balance their news bias and see the bias rating for their current and upcoming news consumption.

View Presentation


University

Project Type & Deliverables

Final Project (MS HCI 1st Semester)

Interaction Design

UI Prototype (Invision)

Visual Design (Sketch)

Project Details

My Role — Product Designer

Project Duration — 6 Weeks

Team — Ashish Durgude, Madison Anderson, Heidi Bloesch

Project
Background

What are echo-chambers and filter bubbles, and how do they lead to news bias?

When we like to read a certain type of news, whether conservative or liberal, it becomes our comfort zone, and we usually don't wish to leave it. We share that news with our friends and followers on social media. Most of the people we follow will like our shared posts, because an echo-chamber or filter bubble is, by definition, a circle of people with similar interests. These people in turn share the news we want to see, and that leads to news bias.

But we should remember that there is another side to almost every news story, and that there are other people with different news preferences. (Multiple echo-chambers and filter bubbles exist.)

Why Facebook app?

Facebook is one of the most widely used social media platforms in the world. Echo-chambers and filter bubbles often form on Facebook because the algorithm that underpins the platform reinforces the user's confirmation bias by continually recommending news and information that aligns with their past activity.

What was my task/work?

As a Product Designer (MS HCI student), my job was to work with my teammates from defining the problem through evaluating the prototype with users. For this project, I did the entire visual design and prototyping.

Problem Statement

How might we give Facebook users exposure to news and information outside of their echo-chambers and filter bubbles so that they can make better decisions?

Solution

A system integrated into the Facebook app that allows Facebook users to balance their news bias and see the bias rating for their current and upcoming news consumption.

Skip to Design Process

Key Features

Media Balance Scale

Media bias rating system by allsides.com that shows the type of news users are reading. (L = Left, LL = Lean Left, C = Center, LR = Lean Right, R = Right)

Bias Board - Dashboard

The dashboard for showing overall ratings on the news which users have consumed so far. It allows users to balance news bias using the media balance scale.
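To make the dashboard idea concrete, here is a minimal sketch of how such an aggregation could work, assuming each article carries an AllSides rating. The score mapping, function names, and data are all hypothetical illustrations, not the project's actual implementation:

```python
# Hypothetical sketch: map each AllSides rating to a numeric score from
# -2 (Left) to +2 (Right) and average over the articles a user has read.
# The scale and names are illustrative, not from the actual design.
BIAS_SCORES = {"L": -2, "LL": -1, "C": 0, "LR": 1, "R": 2}

def overall_bias(ratings):
    """Average bias of a list of AllSides ratings, e.g. ["L", "C", "LL"].

    Returns 0.0 (center) for an empty reading history.
    """
    if not ratings:
        return 0.0
    return sum(BIAS_SCORES[r] for r in ratings) / len(ratings)

# A mostly left-leaning reading history lands left of center (negative):
print(overall_bias(["L", "LL", "LL", "C", "R"]))  # -> -0.4
```

A negative average would place the blue circle on the left of the scale, a positive one on the right, and values near zero at "C."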

Bias Board - Balance your bias

If users want to balance their news bias, they can do it by dragging the blue circle component towards "C." It will recommend news pages to follow to centralize their news bias.

Want to experience the key features above yourself?

Have a look at the InVision prototype:

https://invis.io/JEYDNQMWVSX

Design Process

Research

Background

As a final project for our Interaction Design Practice (IDP) class, we needed to choose a broad topic that addresses real human problems. We wanted to work on areas that focus on disinformation, misinformation, and echo-chambers on social media.

Disinformation and misinformation

According to yonder.co, disinformation is false information deliberately, and often covertly, spread in order to influence public opinion or obscure the truth. Misinformation is incorrect or misleading information spread inadvertently, without deliberate intent to deceive.

Why do we think this is a problem?

64% of U.S. adults believe that disinformation has caused a great deal of confusion about the basic facts of current events, and 23% said they had shared fabricated political stories themselves – sometimes by mistake and sometimes intentionally.

Pew Research Center. The Future of Truth and Misinformation Online. 2017

Confirmation bias helps to account for users’ decisions about whether to spread content, thus creating informational cascades within identifiable communities. Since they focus on their preferred narratives, users tend to assimilate only confirming claims and to ignore apparent refutations.

SSRN Journal. Echo Chambers on Facebook. 2016

Findings from user interviews

We conducted six user interviews to learn whether participants experienced news bias on Facebook:

03 — Facebook users who visit Facebook every day

03 — Experts with knowledge about news bias

Overall, two major themes emerged from the participant interviews and informed our design direction. First, many people have become distrusting or skeptical of Facebook as a news source. Second, older people tend to believe Facebook is biased against their political leanings and are further affirmed in this notion by the media they consume on the platform.

Findings from secondary research

A couple of significant findings emerged from the secondary research. One key theme was that mistrust and disinformation are much more complex issues than we initially realized. Through our secondary research, we found that the algorithms used by Facebook to curate relevant content for the user have also created strong echo chambers and filter bubbles.

Wireframes

After defining a problem and doing research, we started working on concepts for the media balance scale and bias board. Here are the final paper prototypes/wireframes that we worked on.

Visual Designs

After finalizing the wireframes/paper prototypes, we started working on visual design. We were not redesigning Facebook; instead, we integrated our designs and concepts into the existing Facebook application. Here are the final visual designs.

Evaluation

Usability Evaluations with Potential Users

01.

All of the users who evaluated the product felt that it was a unique and effective solution.

02.

The primary concerns they had were around the transparency and neutrality of the assigned ratings. Even with questions about transparency, users felt that this tool could help them form a more informed and well-rounded opinion on political topics, including with friends and family they may disagree with politically.

03.

Users were skeptical about allowing Facebook to make judgment calls about their political leanings, as they do not view Facebook as a neutral party in the current political discourse.

04.

Looking ahead, it seems that being able to convince users to trust in the rating system would be the biggest barrier to product success.

Usability Evaluations with Experts

01.

The key takeaway about task errors was that our attempt to integrate the new features seamlessly into Facebook made them blend in too well, so they became difficult to locate.

02.

Including a "new feature tutorial" might be necessary to bring initial awareness to users. This tutorial would allow the features to remain seamless with the Facebook interface during normal use, while still bringing them to the user's attention when the feature is first introduced.

What would I do differently if given the opportunity?

Since we had limited time, with specific scheduled tasks and assignments each week, we focused on only one feature: giving users the ability to balance their news bias.

During the evaluation, users said they liked the solution because they kept full control over which news pages to follow or unfollow. But that doesn't mean they would actually use it: 'liking a solution' and 'using that solution' are different things.

So if I got the chance to do this project again, I would measure whether users actually use the feature, with metrics such as user conversion rate and MAU. (DAU may be less useful, since people don't follow or unfollow news pages every day.)
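To make the MAU/DAU distinction concrete, here is a small sketch using hypothetical activity-log data (the user IDs and dates are invented for illustration, not project data):

```python
from datetime import date

# Hypothetical activity log: (user_id, date of a follow/unfollow action).
events = [
    ("u1", date(2020, 11, 1)),
    ("u1", date(2020, 11, 15)),
    ("u2", date(2020, 11, 1)),
    ("u3", date(2020, 11, 20)),
]

# MAU: distinct users active at any point during the month.
mau = len({uid for uid, d in events if (d.year, d.month) == (2020, 11)})

# DAU for one day: distinct users active on that specific day.
dau_nov1 = len({uid for uid, d in events if d == date(2020, 11, 1)})

print(mau, dau_nov1)  # -> 3 2
```

Because follow/unfollow actions are sparse, the monthly count (3 here) captures feature adoption far better than any single day's count (2 here), which is why MAU is the more informative metric for this feature.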

Thank you for reading this project

Also, check out these awesome projects

Improving Usability of Entity Browser

UX Design Internship @ Nutanix (Summer 2020)

Stack Report for OpenShift.io

Red Hat