ChromeView: A Plugin Designed to Fight Selective Exposure
"Algorithm models and machine learning systems. Whether we realize it or not, our daily lives center around them," explained a team of Carnegie Mellon University undergraduate students in their paper for Jason Hong's Social Web course. From recommending additional purchases to trading stock on Wall Street, algorithms help us navigate the overwhelming amount of information and decisions in front of us. However, they may be more involved in our daily lives than people care to admit.
The HCII undergraduate team described these often invisible algorithms as the "systems that curate the world's information for us," and looked in particular at Facebook's News Feed as the inspiration for their plugin, ChromeView.
A Plugin for Selective Exposure
The team of students, consisting of Benjamin Jang, Min Noelle Kim, Zeeshan Rizvi, and Ryan Sickles, decided to investigate the ways in which Facebook's algorithm may have been a little too successful at delivering only desirable content to its users.
"We learned how [Facebook] for the sake of catering to their users, creates an ideological bubble that traps the user by feeding them only one point of view," they wrote.
Facebook's platform functions by connecting people with the information that resonates with them, the group continued. In the case of politics, however, these "filter bubbles" reinforced users' biases by delivering articles with similar political leanings to their feeds.
The social web team hypothesized that an involuntary lack of exposure to dissimilar viewpoints drove up political tension. They decided to explore how a technical solution might be able to more easily expose users to differing viewpoints while also helping people understand how frequently (or infrequently) they consume content with which they disagree.
The plugin, ChromeView, was designed to do this in two ways. First, it incorporated a news recommendation modal that appears when a user hovers over an article in the Facebook News Feed, displaying similar articles from other news sources.
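The article does not describe the implementation, but the hover interaction could be sketched as a browser content script. Everything below is an assumption for illustration: the recommender endpoint is a placeholder (the team's actual Lateral.io API differs), and the link-detection logic is hypothetical.

```javascript
// Sketch of the hover-to-recommend interaction (names and endpoint are
// illustrative assumptions, not the team's actual code).

// Build the query that would be sent to a news-recommendation service
// for a hovered article.
function buildRecommendationQuery(articleUrl, maxResults) {
  // Placeholder endpoint; the team used the Lateral.io recommender,
  // whose real API is different.
  const params = new URLSearchParams({
    url: articleUrl,
    limit: String(maxResults),
  });
  return 'https://recommender.example/similar?' + params.toString();
}

// Only attach DOM listeners when running in a browser context.
if (typeof document !== 'undefined') {
  document.addEventListener('mouseover', (event) => {
    const link = event.target.closest('a[href]'); // article link in the feed
    if (!link) return;
    const query = buildRecommendationQuery(link.href, 5);
    // A fetch(query) here would populate the recommendation modal
    // with similar articles from other sources.
  });
}
```

In a real extension, the listener would be registered from a content script scoped to facebook.com, and the modal would be rendered next to the hovered article.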
The second tactic was a visualization of the user's political leaning, which the team called the political leaning meter. The meter was meant to motivate users to research both parties' viewpoints rather than only consuming content with which they already agreed.
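The article does not say how the meter was computed. A minimal sketch, assuming each consumed article carries a leaning score from -1 (left) to +1 (right), might simply average the user's reading history:

```javascript
// Minimal sketch of a political leaning meter (the scoring scheme is an
// assumption): each consumed article carries a leaning in [-1, 1], where
// -1 is far left, 0 is center, and +1 is far right. The meter reports
// the mean leaning of everything the user has read.
function leaningMeter(articleLeanings) {
  if (articleLeanings.length === 0) return 0; // nothing read yet: neutral
  const sum = articleLeanings.reduce((total, leaning) => total + leaning, 0);
  return sum / articleLeanings.length;
}
```

Under this scheme, a history like `[-0.8, -0.6, -0.7]` would yield a strongly negative score, signaling to the user that their recent reading skews to one side.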
"Social media has thrust politics into our daily lives and has convinced us that there are two sides, which we need to choose from," said Zeeshan Rizvi. "Our goal with this project was to help show our users that there is another side by exposing them to their views."
How to Share Both Sides of a Story
To develop the plugin, the students first tested lightweight paper prototypes in a study with five participants. After receiving positive results, they moved on to building working prototypes.
They used an agile approach, developing application features, testing, and reviewing results before iterating again. They built the plugin with HTML5, CSS, and jQuery 3.2, and also incorporated the Lateral.io news recommender and Google Chrome browser APIs. The result is a simple interaction that allows users to easily access multiple news sources for any given article in the feed.
"This project was important to me because I believe the simplistic solution our team developed is an effective way for people to expand their angle on media," said Ryan Sickles.
He continued, "It's even more important that our team tackled this issue on Facebook because the platform has become one of the most favored portals for news."
Team member Min Kim also cited Facebook's growing presence in media distribution as an important motivator for their project.
"I'd like to see social media sites take journalistic responsibility for what they've become in recent years with the growing sophistication of machine learning," Kim said.
"Maybe with more projects like this, they'll start to take notice and recognize that it's a powerful and important role, and with it comes great responsibility." She hopes that their work, and work similar to it, will be a catalyst for companies like Facebook to take a more ethical approach to designing platforms and the machine learning behind them.