Echo Chambers: The Phrase on Everyone’s Lips at #SXSW 2017

by Jessica Bedussi

SXSW 2017 is upon us, which means thousands of advertisers, techies, and industry thought leaders swarm to Austin, Texas for “nerd Spring Break.”

While there are dozens of panels, ranging from the future of Snapchat to the effects of VR (including one where attendees watch a grown man eat chicken for an hour), one buzzword is on everyone’s lips: echo chambers. Whether it was Imgur’s founder, Alan Schaaf, discussing the lack of an echo chamber in his community, behavioral scientists blaming cookies (not the fun kind) for the phenomenon, or neuroscientists describing the effects on our brains, the term seemed to be omnipresent.

One of the more valuable sessions on echo chambers came from Claire Woodcock, a digital strategist at Razorfish London, who took the buzzword a step further by analyzing its origins, effects, and potential solutions.

What exactly is an echo chamber?

If you work in social media advertising, algorithms have most likely kept you awake since 2010, when Facebook unveiled its NewsFeed algorithm, EdgeRank. Since then, that mathematical formula has evolved into a machine-learning system that prioritizes and presents content to users based on factors like what they have engaged with, which groups they belong to, and macro signals like the type of content Facebook is prioritizing at the moment.
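
For the technically curious: as EdgeRank was originally unveiled, a story’s rank boiled down to a sum, over each of its interactions (“edges”), of three multiplied factors: your affinity with whoever created the edge, a weight for the edge type (a comment counts more than a like), and a time decay. Here’s a minimal sketch in Python; the three factors follow Facebook’s 2010 description, but every number below is invented for illustration.

```python
# A toy EdgeRank-style score: for each interaction ("edge") on a
# story, multiply the viewer's affinity with the edge's creator,
# a weight for the edge type, and a time-decay factor, then sum.
# The factor structure follows Facebook's 2010 description of
# EdgeRank; all of the numbers below are made up.

def edgerank_score(edges):
    """edges: list of (affinity, weight, time_decay) tuples."""
    return sum(affinity * weight * decay
               for affinity, weight, decay in edges)

# A close friend (high affinity) comments (heavy edge) on a fresh
# story (little decay), and also likes it:
fresh_story = [(0.9, 2.0, 0.95), (0.9, 1.0, 0.95)]

# An acquaintance (low affinity) liked a week-old story:
stale_story = [(0.2, 1.0, 0.30)]

print(edgerank_score(fresh_story))  # roughly 2.565 -> shown near the top
print(edgerank_score(stale_story))  # roughly 0.06  -> buried or hidden
```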

All of this means that Facebook users are given a narrow view of the world, one that reinforces their existing values and beliefs. We start to exist in online communities where we shout our opinions and have the same ideas reflected back to us. We never have to deal with disagreement or differing points of view. That might not sound so bad, but when you realize 62 percent of people get their news from social media, and 28 percent of those people use it as their main source, the idea becomes much more sinister.

How are echo chambers made?

Facebook uses its algorithm to learn what you like and what you don’t. Woodcock put this in relatable terms with examples of cute puppies and kittens. Let’s say you’re a dog person. You scroll through your NewsFeed, see a cute photo of a pug, and give it a like.

You continue browsing your NewsFeed and see a photo of a cat. Cats don’t excite you, so you keep scrolling until you stumble upon another cute dog photo. This one is so cute it deserves a comment. *hearteye emoji*

Facebook takes these actions, learns that you like dogs, and continues to give you exactly what you want. The platform decides you prefer dogs to cats and stops showing you cat photos, knowing you most likely won’t engage with them.

This is where the echo chamber starts to form. You’re not exposed to cats or cat people anymore and are instead living in a bubble where dogs, and those who love them, are the only things that matter.
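
To make that feedback loop concrete, here’s a deliberately simplified sketch (not Facebook’s actual model): every like or comment bumps a topic’s score, every skip lowers it, and the feed only surfaces topics that stay above a cutoff.

```python
# A deliberately simplified engagement loop -- NOT Facebook's real
# model. Likes and comments bump a topic's score, skips lower it,
# and the "feed" only surfaces topics above a cutoff.

scores = {"dogs": 0.5, "cats": 0.5}                 # start out neutral
BUMP = {"like": 0.1, "comment": 0.2, "skip": -0.1}
CUTOFF = 0.3

def record(topic, action):
    """Update a topic's score from one user action, clamped to [0, 1]."""
    scores[topic] = round(min(1.0, max(0.0, scores[topic] + BUMP[action])), 2)

def feed():
    """Return only the topics the user will actually be shown."""
    return [topic for topic, score in scores.items() if score >= CUTOFF]

# The browsing session described above:
record("dogs", "like")     # the cute pug photo
record("cats", "skip")     # scrolled past the cat
record("cats", "skip")     # ...and past another one
record("dogs", "comment")  # *hearteye emoji*

print(scores)  # {'dogs': 0.8, 'cats': 0.3}
print(feed())  # ['dogs', 'cats'] -- cats are barely hanging on

record("cats", "skip")     # one more pass
print(feed())  # ['dogs'] -- the bubble has formed
```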

Put this in today’s terms and it’s much easier to understand why some people were so surprised when Brexit or Trump’s election happened, and how fake news has become so rampant. When someone who dislikes Obama sees several reputable articles they disagree with, Facebook’s algorithm can prioritize a fake news site with a similarly negative sentiment. Since that individual already dislikes Obama, it’s much easier for the fake site to pass as a reputable source.

We start to exist in a polarized world where truth doesn’t matter; what matters is being right. We stop talking to each other. We stop learning different perspectives. And everything that makes public discourse so important and valuable starts to go away. As Woodcock put it, “social media is a disruptive communications technology.”

Who do we blame and how do we solve it?

It’s Facebook’s fault, right? Not entirely. This problem is much more widespread than that, and solving it will require more than just Facebook tweaking its algorithm.

What needs to happen is greater diversity of opinion in our feeds, so that we’re all exposed to different perspectives.

Woodcock proposed that Facebook take a page from Netflix. To get users out of their comfort zones, Netflix finds themes in the content they watch and suggests different media with similar themes. For instance, after I watched “Chef’s Table,” Netflix recommended “Abstract: The Art of Design.” The real-world example given during the session was a conservative news story framed to appeal to both conservatives and liberals. Instead of a polarizing headline like “Mexicans are coming for our jobs,” the language is massaged so liberals are more likely to click on the link: “What makes crossing the border appealing for Mexicans?”
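
As a purely hypothetical sketch of that idea: tag each story with the themes it covers, then recommend anything that shares a theme with what the reader just engaged with, regardless of which side produced it. The stories and tags below are invented for illustration.

```python
# A toy theme-based recommender: match stories on shared themes
# rather than on political leaning, so readers see the other
# side's take on topics they already care about. All stories,
# tags, and the matching rule are invented for illustration.

stories = {
    "Story A (conservative)": {"immigration", "jobs", "border"},
    "Story B (liberal)":      {"immigration", "economics"},
    "Story C (liberal)":      {"healthcare", "economics"},
}

def recommend(just_read):
    """Suggest every other story sharing at least one theme."""
    themes = stories[just_read]
    return [title for title, tags in stories.items()
            if title != just_read and tags & themes]

print(recommend("Story A (conservative)"))
# ['Story B (liberal)'] -- shared theme: immigration
```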

The idea is to slowly and subtly introduce different viewpoints into the NewsFeed instead of regurgitating the same thoughts and ideas.

We can point fingers and place blame all we want, but we can all agree: something must be done to burst these bubbles.

How do you think we can solve the echo chamber problem?