Getting news from preference-driven social media…
According to the Pew Research Center, in 2016 62% of American adults got at least some of their news on social media.
This is akin to feeding your brain fast food.
You read large amounts of questionable information. You don’t know who the sources are or where they come from. And you more than likely don’t have the training or time to determine whether the news you’re consuming is quality journalism.
Creates echo chambers filled with fake news…
Now, combine this with the fact that social media feeds are filtered by algorithms that cater to our preferences and biases.
Not only are we consuming large amounts of biased information, we do so very rapidly and with little criticality in part because of how both our smartphones and our social media are designed.
And ultimately heightens national tension.
The unhealthy information ecosystem we find ourselves in creates low engagement with diverse perspectives and a high susceptibility to fake news.
This deepens the divide between individuals, increasing tribalism and heightening national tension to new levels.
How might we help American citizens become less susceptible to fake news?
Our team dedicated our graduate capstone project to improving public participation. The rise of fake news during the 2016 election galvanized us to tackle this wicked problem.
Jake Zukowski, FJORD
Design: wireframes, interaction flow, interaction model, design specs, high-fidelity UI, project poster, DUB community slide presentation
Research: expert interviews, competitive assessment, primary research plan, conducted user observations and task analysis session
Synthesis: research insights, design principles, insight diagrams, and research report
Prototyping: paper prototype, digital prototype
Evaluation: conducted prototype test sessions, analyzed data to determine design implications
Become a better informed and more well-rounded citizen by changing how you interact with news from different perspectives.
Qübe is a news delivery device that trains you to go beyond the headline, get in the habit of checking news sources, and engage with ideas outside your echo chamber.
It’s a new way to news!
Burst your news bubble.
Qübe shows you 3 articles on the same topic, each with different biases.
Better detect bias.
A mini-game challenges you to guess an article’s bias before you know its source.
Set news goals.
Qübe’s companion app links to your cube and lets you set goals for yourself.
Learn about your news habits.
Stats provide transparency and clarity on how you consume news over time.
Our team began with a literature review that scrutinized the nature and history of public participation. We found that the largest political movements happened when there was a sense of empathy, agency, and a community-recognized enemy (either physical or philosophical) that galvanized a population.
This meant that access to quality information is a necessary component of healthy public participation, and that the degradation of quality information (through fear mongering and propaganda, for example) can lead to a decrease in, or misappropriation of, civic engagement.
Academia and public sentiment at the start of 2017 suggested a lack of information literacy on social media was at fault for the spread of fake news during the 2016 election cycle, and thus impacted the public’s perception and participation prior to election night. Fact checking on the part of the social media user was repeatedly suggested as the way forward.
In spring 2017, Facebook was the first social media platform to unveil a fake news tagging feature for its users, accompanied by a blog post educating users on how to fact check. We conducted a task analysis and found several areas for improvement.
What we learned...
Missing benchmarks - There is no way for people to know their history of engagement with fake news on Facebook.
Skill and knowledge are assumed - Users must be able to accurately determine whether a post is fake news prior to reporting it as fake.
Useful information is missing - There is no way of knowing if a post is currently being investigated by Facebook prior to it being tagged as fake news.
Reliant on crowdsourcing - The tagging system won't work if not enough people flag content as fake.
We turned to experts in the fields of information literacy, civic online engagement, crisis informatics and disinformation, social media algorithms, and digital education to understand more.
Through these interviews, we hoped to understand if fact checking was indeed a viable solution, and if not, what other opportunities we could explore.
What we learned...
The problem is complex and experts are split on the best response
The “fake news” crisis is more of an information crisis, where large populations are affected by purposeful and orchestrated misinformation. This can have significant implications for society.
The average person probably won’t go out of their way to become a better reader or consumer of news. Inspiring people to care about what they’re reading so they critically engage with the information is unlikely to succeed.
Education is one possible avenue, but application of knowledge is tricky
Educators are not prepared for, or knowledgeable enough about, technology to take on the open internet.
There are structured frameworks for fact checking that librarians, academics, and journalists use to determine the legitimacy of sources. These are highly specialized fields that require practice to become proficient.
Technology and truth have an intricate relationship that is impacting journalism
People inherently trust algorithms without understanding how they work. The mobocratic nature of social media platforms’ algorithms influences the content that spreads by giving a voice to the person yelling the loudest.
Because truth is subjective, all news that comes out is inherently untrustworthy from some standpoint. News companies are in the best position to respond to the user-driven demand for upgrading the journalism system.
To round out our understanding of how misinformation, news consumption behavior, and social media were impacting the everyday person, we created a plan to conduct generative research.
College undergrads who are daily users of social media
Avg. Age: 20
We chose this target audience because of easy access to a large, politically diverse, particularly busy population with a high rate of social media usage. Undergraduates also had a higher likelihood of first time public participation in the 2016 election process, which was heavily impacted by fake news spread on social media.
What fact checking strategies, if any, are most appropriate for college undergrads to discern the validity of information found on social media?
Method 1: Observations
Number of participants: 6
Typical session: 20 minutes
We wanted to know…
How do undergrads normally use social media? Is it a vehicle for news for this group?
How do undergrads happen upon news in their lives?
When interested in reading the news, how do they seek it out?
What natural fact checking tendencies do they exhibit?
To ensure our target audience was viable for our research question, we observed undergrads' natural tendencies to read news on social media. This was followed by observing how they used social media to get a sense of the day’s news when prompted.
Method 2: Task Analysis & Think Aloud
Number of Participants: 4
Typical session: 1 hour
We wanted to know…
When interested in learning about a specific news topic, how do they seek it out?
When prompted, what methodology do undergrads use to fact check articles?
How does introducing a fact checking guide impact the participant’s ability to determine news source credibility and what does this do to the news experience?
We wanted to validate the idea of fact checking on the part of the social media user. To understand how people do this without guidance, participants were asked to think aloud while fact checking a news article from a source they didn't know. We then introduced a fact checking guide and asked the participants to check a second article from another unfamiliar source.
We generated insights from our research through affinity diagramming and whiteboarding.
Trust in news isn't based on logic
1. People trust news based on how it makes them feel, and are thus susceptible to the emotional manipulations of fake news, which mimics sensational news.
2. Brand and customer experience contribute to a person’s trust in a news source.
3. People want minimal effort to trust a news source. Once a source has that trust, there is little effort needed to keep it.
Social media feeds bias through personalization
4. The social media system is designed to catch & maintain people’s attention by amplifying their biases.
5. People trust information curated for them by algorithms and do not question that trust.
Fact checking is problematic
6. People are less likely to fact check if they don’t know anything about the topic, in large part because news is not set up to make it easy for the average person to fact check.
7. Fact checking breaks the “flow” experience of social media.
8. People overestimate their natural ability to discern credible content as well as their ability to fact check.
Be ethical when trying to gain a person’s trust.
Insights: 1, 4, 5
Design to keep trust at the surface of any product or experience we design in an ethical way.
Cultivate positive emotions when engaging critical thinking.
Insights: 1, 2, 4, 7, 8
Support the elevation of positive experiences with regard to information consumption and critical thinking.
Be easy to do — and understand.
Insights: 3, 6, 8
Embrace a strong design language and keep continuity and transparency in all stages of the design process.
Don’t force a person to act counter-intuitively.
Keep realistic expectations for natural behaviors to help meet people where they are.
Meet people at their literacy level.
Insights: 1, 3, 5, 6
Curate and design the content of any design solution to the natural comfort level of our audience, supporting our other principles.
Keep decisions transparent and information provenance apparent.
Insights: 1, 2, 5, 6
People have a tendency to overestimate their own abilities based on assumptions, and transparency will help mitigate this.
The opportunity space
We knew solutions requiring non-critical thinking would not impact the problem space, but solutions requiring a large amount of critical thinking, such as fact checking, are also not appropriate. Our sweet spot lies in the middle, and there we identified bias exposure, slowing news consumption, and news literacy education as the most valuable avenues of exploration.
Over a two-week period, we conducted 5 team ideation workshops and generated 130 ideas with our three design areas in mind. To narrow the field, we held two sessions of dot voting. First we eliminated the ideas that were least desirable, feasible, and viable. We then voted on which of the remaining ideas were most desirable, feasible, and viable, narrowing down to our top five.
We fleshed out our top five ideas to gain a better picture of the value each one brought to the problem space. After this, we mapped each idea to our design principles noting strengths and weaknesses. We also looked at the difficulty in realizing each idea due to limitations in time and resources. Lastly, we again mapped the ideas to desirability, feasibility, and viability, which showed us that two of our ideas could be combined into one.
Prototype & Test
Prototyping cube and app UI
Once our decision was made, it was time to build prototypes. Olivia built 8 cubes in various sizes and shapes, while Conor and I worked out the system diagram and information flow. Once we had that in place, I pumped out the paper prototypes for the cube UI and an InVision prototype for the companion app. I kept our builds low fidelity to allow for fast, agile iteration between participants when it came time to evaluate.
Testing & Results
We tested our cube ergonomics with 19 participants and the UI prototypes with 5 participants.
We wanted to know…
What cube size and weight are most ergonomic?
What are the pros and cons of showing each article’s source and bias on the cube’s article screens?
How useful do participants feel the companion app’s stats screens are?
Most participants liked the 3 in. to 4 in. cubes. 50% preferred the 3.5 in. cube.
On the cube screens, some users strongly wanted to see source and bias information on articles, while others felt it would influence them to make snap judgments about the article. One compromise we identified was offering different screen modes to allow for the additional source information. However, time constraints meant we had to leave this feature in our “future directions” pile.
On the app screens, one of the major iterations we made between participants was adding the concept of goals to the app. This was in response to a participant saying the app feels like fitness tracking for your bias. We found that adding goals cultivated positive feelings of improvement and engagement with bias.