Echo chambers have caused more miscommunication than productive discourse. Imagine an elementary school playground where a single group of kids is exposed only to what its leader says and does. They mimic the leader's behavior and comments. There are no other groups of kids playing. The kids start to bounce the same ideas off each other rather than seek a perspective outside of their group, which only reinforces what the group already thinks. That is what an echo chamber creates: “filter bubbles” in discourses that lack direction.
Echo chamber behavior can be found both online and offline. The term “echo chamber” was originally coined to describe a phenomenon confined to political settings, but as more of our daily routines revolve around social media, echo chambers can be found virtually anywhere. They don’t have to be linked to politics; they can happen in social commentary and entertainment conversations, too. In this three-part series on echo chambers, this article will discuss digital echo chambers.
Many months ago, as an avid Twitter user, I began to notice how misinformed my curated feed had become. Every time a user interacted with someone who challenged their ideas, the user became conceited and never gave the discussion any direction.
Within echo chambers, filter bubbles begin to breed and spread. A filter bubble is directly linked to a social media platform’s algorithm. Social media companies make users the product rather than the consumer, and their workers build algorithms to keep users hooked for as long as possible. Every click a user makes gets stored, and the algorithm builds a profile based on the user’s consumption habits: how long they spend on the posts they see, whether they interact by liking, commenting, and sharing, and whether they follow similar accounts from the posts they engage with.
Once the algorithm has a detailed profile of a user’s consumption habits, it will subtly suggest content the user is most likely to click and spend time on. It will do whatever it takes to engage the user more than their previous visit; that is what it is designed for. Social media companies call this their “attention-based” business model, working in tandem with their “user-generated content” (UGC) business model. Both work hand in hand to increase user activity and personalize content.
Based on this premise, filter bubbles are fueled by user-generated content. Suggestions similar to previously engaged content will appear on a user’s Explore page or For You page. Filter bubbles depend on the UGC system: if a user doesn’t interact with different sources of content, those sources won’t pop up in their feed. Every user’s feed is unique and will not mirror another user’s; at most it will show similarities, and nothing more. The sketch below makes this mechanism concrete.
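To illustrate the idea, here is a minimal sketch in Python of how an engagement-based feed might rank content. It is a toy model, not any platform’s actual code: the post data, topic tags, and the score_post function are all made up for the example. The point is that posts resembling what a user has already interacted with float to the top, while everything else sinks out of view.

```python
# Toy illustration of an engagement-based feed. NOT any platform's real algorithm.
from collections import Counter

# The "profile": topics the user has liked, shared, or lingered on.
user_interactions = ["pop-music", "pop-music", "celebrity-gossip", "pop-music"]
profile = Counter(user_interactions)  # {"pop-music": 3, "celebrity-gossip": 1}

# Candidate posts the platform could show next (hypothetical data).
candidate_posts = [
    {"id": 1, "topic": "pop-music"},
    {"id": 2, "topic": "local-politics"},
    {"id": 3, "topic": "celebrity-gossip"},
    {"id": 4, "topic": "climate-science"},
]

def score_post(post, profile):
    """Score a post by how often the user has already engaged with its topic."""
    return profile.get(post["topic"], 0)

# Rank the feed: familiar topics rise, unfamiliar ones sink.
feed = sorted(candidate_posts, key=lambda p: score_post(p, profile), reverse=True)
for post in feed:
    print(post["id"], post["topic"], "score:", score_post(post, profile))
```

Because the scoring only rewards topics the user has already engaged with, the posts from outside the bubble (here, local-politics and climate-science) never surface unless the user goes looking for them.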
Digital echo chambers have fueled online cultural tribalism. The scholarly article “Building Trust with Corporate Blogs” hints at this, noting that “communities based upon interests and not localities might well reduce diversity.” In other words, groups united by common interests can reduce diverse perspectives because there is no voice besides their own.
Being united by common interests and goals is part of how humans operate. We are social creatures who depend on socialization to survive. Humans practice endogamy, marrying within their social group rather than outside of it. I propose that endogamy is not limited to marriage. It can happen in other social settings, like educational institutions, social gatherings, interpersonal relationships, and workplace environments. In our 21st-century reality, it can happen online. Staying within our own online niche feels vital for our online survival; being outcast on social media via cyberbullying or profile suspension is the last thing a user wants.
That being said, this in no way excuses trolling behavior or moral ignorance. Having moral principles is important for practicing discernment in our online consumption habits. Discernment can help us break the cycles algorithms have put in place. Discernment is like a muscle: we must each work it out periodically to gain deeper insight into ourselves, our habits, and the concepts we are internalizing.
Arguing about whether an artist is a flop or a bop, and harassing users who don’t agree with us, is counterproductive. Arguments about human rights, like racial equity and cultural tolerance, are worth having because lives are on the line. Morality is on the line.
Despite these key differences, both scenarios can fall victim to echo chambers. To combat echo chamber traps, do your own research! Take information you see on social media with a grain of salt. Google the topics being discussed. Use credible news outlets like BBC, The New York Times, The Washington Post, ABC News, or NBC News, and research the political leanings of news publications. If a topic is political, keep in mind that some publications are bipartisan; bipartisan publications can give a more diverse outlook on developing stories.
Use fact-checking websites like FactCheck.org, Snopes, or PolitiFact. Fact-checking websites that are nonpartisan or that rely on donations to run have a wide catalog of information available.
Researching topics through news outlets outside of one’s country can also give a better sense of the scope of a developing story. It can offer insight into how the news is covered, from the vocabulary used to the timeliness of the reporting. Usually, this is most useful for topics involving politics or human rights.
If a topic discussed on social media offers only one perspective, uses loaded language, lacks reliable evidence (say, a user sends another user an article from a partisan publication or a website with no contact information), and is emotionally charged, it is likely part of an echo chamber. Media literacy and discernment are imperative tools for combating echo chambers.
Paola is a collector of all trades. She loves philosophy, music, design, writing, and sociology. Living in Miami, she feels at home next to a palm tree and a couple of mojitos. As a Digital Communications/Sociology undergraduate, she blends her writing and design with sociological teachings. She is a big believer in duality and hopes to be a powerhouse in what she does. Instagram: @paology_
More of Paola’s work in Mixed Mag:
The Third Korean Wave (Hallyu 3.0) & its Cultural Significance in the U.S. (Issue 6)