Background


Getting news from preference-driven social media…

According to the Pew Research Center, 62% of American adults got news on social media in 2016.

This is akin to a fast-food news diet: questionable information from questionable sources, of questionable quality.


Creates echo chambers filled with fake news…

Now, combine this with the fact that social media feeds are filtered by algorithms that cater to our preferences and biases.

Not only are we consuming large amounts of biased information, we do so rapidly and with little critical thought, in part because of how both our smartphones and our social media feeds are designed.


And heightens tribalism and national tension.

This unhealthy information ecosystem creates a homogeneous view of the world and a high susceptibility to fake news.

This deepens the divide between individuals, increasing tribalism and heightening national tension.

 

How might we help young adults become less susceptible to fake news stories spread within social media echo chambers?


 

PROJECT DURATION
6 months

SPONSOR
Jake Zukowski, FJORD

TEAMMATES
Olivia Thom
Conor Kelly

 

DESIGN PROPOSAL

A healthier, well-balanced news diet


Become a better informed and more well-rounded citizen by changing how you interact with news from different perspectives.

Qübe is a news delivery device that trains you to go beyond the headline, get in the habit of checking news sources, and engage with ideas outside your echo chamber.

It’s a new way to news!

 
 

Key Features

Burst your news bubble.

Qübe shows you 3 articles on the same topic, each with a different bias.

Get better at detecting bias.

A mini-game challenges you to guess an article's bias before you see its source.

 
 

Set news goals.

Qübe’s companion app links to your cube and lets you set goals for yourself.


Learn about your news habits.

Stats provide transparency and clarity on how you consume news over time.
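Under the hood, the goal and stats features above could be backed by a simple aggregation over the articles a user reads on the cube. The sketch below is purely illustrative: the `ReadingEvent`, `bias_breakdown`, and `goal_progress` names, and the three-category bias scale, are assumptions rather than anything from the actual product.

```python
from dataclasses import dataclass

@dataclass
class ReadingEvent:
    """One article read on the cube (hypothetical record)."""
    topic: str
    bias: str  # e.g. "left", "center", "right"

def bias_breakdown(events):
    """Count how many articles were read from each bias category."""
    counts = {}
    for e in events:
        counts[e.bias] = counts.get(e.bias, 0) + 1
    return counts

def goal_progress(events, goal_per_bias):
    """Fraction completed of a 'read N articles from each bias' goal."""
    counts = bias_breakdown(events)
    met = sum(min(counts.get(b, 0), goal_per_bias)
              for b in ("left", "center", "right"))
    return met / (3 * goal_per_bias)
```

A breakdown like this is enough to drive both the transparency stats and the Fitbit-style goal progress the app surfaces.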

 
 
 

 

RESEARCH

Understanding a messy problem in flux

Our team began with a literature review that scrutinized the nature and history of public participation. We found that the largest political movements occurred when there was a sense of empathy, agency, and a community-recognized enemy (either physical or philosophical) that galvanized the population.

This meant that access to quality information is a necessary component of healthy public participation, and that the degradation of information quality (through fear mongering and propaganda, for example) can lead to a decrease in, or misappropriation of, civic engagement.

Academia and public sentiment at the start of 2017 suggested a lack of information literacy on social media was at fault for the spread of fake news during the 2016 election cycle, and thus impacted the public’s perception and participation prior to election night. Fact checking on the part of the social media user was repeatedly suggested as the way forward.

 

Competitor Product Assessment


In spring 2017, Facebook became the first social media platform to unveil a fake news tagging feature for its users. The feature was accompanied by a blog post educating users on how to fact check. We conducted a task analysis and found several areas for improvement.

What we learned...

  • Missing benchmarks - There is no way for people to know their history of engagement with fake news on Facebook.

  • Skill and knowledge are assumed - Users must be able to accurately determine whether a post is fake news before reporting it as fake.

  • Useful information is missing - There is no way of knowing if a post is currently being investigated by Facebook prior to it being tagged as fake news.

  • Reliant on crowdsourcing - The tagging system won't work if not enough people flag content as fake.

 

Expert Interviews

We turned to experts in the fields of information literacy, civic online engagement, crisis informatics and disinformation, social media algorithms, and digital education to understand more.

Through these conversations, we hoped to understand if fact checking was indeed a viable solution, and if not, what other opportunities we could explore.

 

Kate Starbird, PhD

Assistant Professor, Human Centered Design & Engineering, University of Washington.


Wael Ghonim

Internet activist and computer engineer with an interest in social entrepreneurship.


Sarah McGrew

PhD candidate, School of Education, Stanford University.


John Bracken

VP of Media Innovation for the Knight Foundation.


Robin Roemer

Librarian, Reference & Research Services, University of Washington.

 

What we learned…

The problem is complex and experts are split on the best response

  • The “fake news” crisis is more of an information crisis, where large populations are affected by purposeful and orchestrated misinformation. This can have significant implications for society.

  • The average person probably won’t go out of their way to become a better reader or consumer of news. Inspiring people to care about what they’re reading so they critically engage with the information is unlikely to succeed.

Education is one possible avenue, but application of knowledge is tricky

  • Educators are not prepared or knowledgeable enough about technology to take on the open internet.

  • There are structured frameworks for fact checking that librarians, academics, and journalists use to determine the legitimacy of sources. These are highly specialized fields that require practice to become proficient.

Technology and truth have an intricate relationship that is impacting journalism

  • People inherently trust algorithms without understanding how they work. The mobocratic nature of social media platforms’ algorithms influences the content that spreads by giving a voice to the person yelling the loudest.

  • Because truth is subjective, all news is inherently untrustworthy from some standpoint. News companies are in the best position to respond to user-driven demand for upgrading the journalism system.

 

User Research

We wanted to see for ourselves how the experts’ information was playing out for undergraduate students. To round out our understanding of how misinformation, news consumption behavior, and social media were impacting this group, we created a plan to conduct generative research.

Target Audience

  • College undergrads who are daily users of social media

  • Avg. Age: 20

We chose this target audience for a variety of reasons. They were a group the Stanford History Education Group called out as being unprepared for parsing social media content.

We also chose this group because we’d have easy access to a large, politically- and culturally-diverse population with a high rate of social media usage. Undergraduates also had a higher likelihood of first time public participation in the 2016 election process, which was heavily impacted by fake news spread on social media.

What fact checking strategies, if any, are most appropriate for college undergrads to discern the validity of information found on social media?

Together these results suggest that [undergraduate] students need further instruction in how best to navigate social media content, particularly when that content comes from a source with a clear political agenda.
— Stanford History Education Group's "Evaluating Information: The Cornerstone of Civic Online Reasoning"
 

Observation & Interviews


Number of participants: 6
Typical session: 30 minutes

We wanted to know…

  • How do undergrads normally use social media? Is it a vehicle for news for this group?

  • How do undergrads happen upon news in their lives?

  • When interested in reading the news, how do they seek it out?

  • What natural fact checking tendencies do they exhibit?

To ensure our target audience was viable for our research question, we observed undergrads' natural tendencies to read news on social media. This was followed by observing how they used social media to get a sense of the day’s news when prompted.

 

Task Analysis & Think Aloud


Number of Participants: 4
Typical session: 1 hour

We wanted to know…

  • When interested in learning about a specific news topic, how do they seek it out?

  • When prompted, what methodology do undergrads use to fact check articles?

  • How does introducing a fact checking guide impact the participant’s ability to determine news source credibility and what does this do to the news experience?

We wanted to validate the idea of fact checking on the part of the social media user. To understand how people do this without guidance, participants were asked to think aloud while fact checking a news article from a source they didn't know. We then introduced a fact checking guide and asked the participants to check a second article from another unfamiliar source.

 

Results

We generated insights from our research through affinity diagramming and whiteboarding. 

 

OUR INSIGHTS

Trust in news isn't based on logic - it’s a relationship

1. Emotional resonance: Undergrads trust news based on how it makes them feel, and are thus susceptible to the emotional manipulations of fake news, which mimics sensational news.

2. Brand experience: Brand, politics, popularity, and customer experience contribute to an undergrad’s trust in a news source and subsequent reading habits.

3. Low-effort trust: Undergrads want minimal effort to trust a news source. Once a source has that trust, there is little effort needed to keep it.

Social media feeds bias through personalization

4. Bias amplification: The social media system is designed to catch & maintain attention by catering to personal preference, thus amplifying biases.

5. Trust in algorithms: Undergrads trust information curated for them by algorithms and do not question that trust.

Fact checking on social media isn’t the answer

6. Specialized knowledge: Undergrads are less likely to fact check if they don’t know anything about a topic, and news articles are not set up to make it easy for the average person to fact check.

7. Mismatched behavior: People skim social media content quickly, and fact checking breaks that “flow” experience, requiring too much effort and time.

8. Overconfidence effect: People overestimate their natural ability to discern credible content as well as their ability to fact check.

 
 

Design Principles

Be ethical when gaining trust.

Insights: 1, 4, 5
System decisions like content curation/selection, information attributes, etc. are kept transparent to the user.

Cultivate positive emotions when engaging critical thinking.

Insights: 1, 2, 4, 7, 8
News consumption and critical thinking is elevated as a positive, delightful experience.

Be easy to do — and understand.

Insights: 3, 6, 8
The design language is approachable, fostering continuity and clarity.

Match information design to natural behaviors & context.

Insights: All
Natural behaviors and intuitive interactions match to the context of use: low-criticality environments create expectations of less cognitive lift, and vice-versa.

 
 

 

Ideation

Leading a horse to water 

Our research showed us the current state of social media as an experience hostile to critical thinking: quick flicks of the thumb to skim headlines, echo chambers of siloed perspectives, and dogged, unquestioning loyalty to news brands. We identified fact checking as the opposite extreme, requiring slow and deliberate parsing of information, a willingness to seek multiple perspectives, and objectivity toward news sources.

Creating a social media fact checking solution wasn’t the answer, but we knew undergrads needed a way to hone lifelong skills important for identifying legitimate news sources.

Our sweet spot lay in the middle of the two criticality extremes: we decided to explore bias exposure, slowing news consumption, and news literacy education as the most valuable avenues of ideation.

 

Idea generation


Over a two-week period, we conducted 5 team ideation workshops and generated 130 ideas with our three design areas in mind. My favorite methods were Worst Ideas Ever and Wishing; going to extremes really helped break us out of our creative comfort zone. To narrow the field, we held two sessions of dot voting.

 

Initial Concepts

 

Statue of Insight

A visualization of your Facebook information habits through a moving, morphing physical statue.


myCube

A beautiful artifact for your space that reveals a news article every 24 hours.


Opposites Attract

A system connecting people to discuss current events during meetup-style sessions at sponsored public locations.


The Knowledge Walkabout

Autonomous, slow-moving robots that spout opposing opinions about a given topic in public spaces.


Empathy Jenga

A Jenga competition that facilitates conversations about political views and biases.

 
 

Refinement

We fleshed out our top five ideas to gain a better picture of the value each one brought to the problem space. After this, we mapped each idea to our design principles noting strengths and weaknesses. We also looked at the difficulty in realizing each idea due to limitations in time and resources.

Lastly, we assessed the desirability, feasibility, and viability of the fleshed-out ideas. Our team agreed that two of our ideas, myCube and Statue of Insight, could be combined into one product experience.

 

 

Prototype & Test


Cube and app UI

Once our decision was made, it was time to build prototypes. Olivia built 8 cubes in various sizes and shapes, while Conor and I worked out the system diagram and information flow. With that in place, I produced paper prototypes for the cube UI and an InVision prototype for the companion app. I kept our builds low fidelity to allow for fast, agile iteration between participants when it came time to evaluate.

 
It’s like fitness tracking for my news bias.
— Participant feedback

Testing & Results

We tested our cube ergonomics with 19 participants and the UI prototypes with 5 participants.

We wanted to know…

  • What cube size and weight are most ergonomic?

  • What are the pros and cons of showing each article’s source and bias on the cube’s article screens?

  • How useful do participants feel the companion app’s stats screens are?

We found…

  1. Most participants liked the 3 in. to 4 in. cubes; 50% preferred the 3.5 in. cube.

  2. On the cube screens, some users strongly wanted to see source and bias information on articles, while others felt it would lead them to make snap judgments about the article. One compromise we identified was offering different screen modes to show or hide the additional source information. However, time constraints meant we had to leave this feature in our “future directions” pile.

  3. On the app screens, one of the major iterations we made between participants was adding the concept of goals to the app. This was in response to a participant saying the app feels like the Fitbit dashboard. We found that adding goals cultivated positive feelings of improvement and reward.

 

 

Final Deliverables

Data & Dependencies

The system diagram was a collaboration with Conor that let us continue assessing the technical and business viability of our product. We debated at length whether to use a human news-bias rating team instead of an algorithm or another method that would reduce overhead. We decided the initial cost would be worth it if users’ trust could be maintained in the long run, especially if the rating methodology were made readily available.
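The data flowing out of a human rating team could be as simple as articles annotated with a bias label and the rater's notes, which the system then groups into the three-perspective sets the cube displays. This is a minimal sketch under that assumption; the `RatedArticle` and `build_topic_set` names and the three-category bias scale are hypothetical, not part of the actual system diagram.

```python
from dataclasses import dataclass

@dataclass
class RatedArticle:
    """An article annotated by the human bias-rating team (hypothetical)."""
    headline: str
    source: str
    topic: str
    bias: str          # assigned by the rating team, e.g. "left" / "center" / "right"
    rater_notes: str   # methodology notes, surfaced to users for transparency

def build_topic_set(articles, topic):
    """Pick one article per bias category for a topic, mirroring the cube's
    three-perspective display. Returns None if any bias is missing."""
    chosen = {}
    for a in articles:
        if a.topic == topic and a.bias not in chosen:
            chosen[a.bias] = a
    if {"left", "center", "right"} <= chosen.keys():
        return [chosen[b] for b in ("left", "center", "right")]
    return None
```

Keeping the rater's notes on each record is one way the system could make its curation decisions transparent, in line with our "be ethical when gaining trust" principle.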

 

Spec’ing out interactions

The interaction flow allowed us to map out the relationships between the cube in use, the cube at rest, and when the app was needed by the user.

In the future, I would use this flow as a starting point for a user journey map. There are specific touchpoints, such as the product’s onboarding, that could use further investigation.

 

Bringing the idea to Life

I used Sketch to create the high-fidelity UI for this project. Conor was instrumental in shaping the unique data visualizations on our dashboard, and Olivia helped me better communicate the meaning of shape and color in our product.

 

Spread the word

This product poster helped us communicate our design vision to the University of Washington DUB community.

 