Blog Post 2 - Filter Bubbles


Have you ever wondered why your social media feed feels completely tailored to your wants and needs? It's almost as if the platforms know you better than you know yourself. The truth is, they do, thanks to something called a filter bubble. A filter bubble is a term coined by internet activist Eli Pariser in his 2011 book, "The Filter Bubble: What the Internet Is Hiding from You" (https://books.google.ca/books?hl=en&lr=&id=wcalrOI1YbQC&oi=fnd&pg=PT6&dq=eli+pariser+filter+bubble&ots=I4a9zqHzDr&sig=bNGOL04PkLaEnUvsuGSvQAFdiiY#v=onepage&q=eli%20pariser%20filter%20bubble&f=false). It refers to the personalized online ecosystem that algorithms create for us based on our past behaviours, clicks, and searches. Check out the TED Talk in which Eli Pariser discusses his book, or the link above to the book itself!
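
To make that mechanism a little more concrete, here is a tiny, purely illustrative Python sketch of how a feed might rank new posts by how often you have engaged with each topic before. Everything in it (the topics, the click history, the scoring rule) is made up for illustration; real platforms use far more complicated signals, but the basic feedback loop is the same: the more you click one kind of content, the more of it you are shown.

    from collections import Counter

    # Toy example only: "click_history" stands in for my past behaviour,
    # "candidate_posts" for the new content the platform could show me next.
    click_history = ["makeup", "makeup", "dance", "makeup", "liberal politics"]
    candidate_posts = ["makeup", "dance", "conservative politics",
                       "world news", "liberal politics"]

    def rank_feed(history, candidates):
        """Score each candidate topic by how often it was clicked before."""
        interest = Counter(history)
        # Topics that were never clicked score 0 and sink to the bottom of the feed.
        return sorted(candidates, key=lambda topic: interest[topic], reverse=True)

    print(rank_feed(click_history, candidate_posts))
    # ['makeup', 'dance', 'liberal politics', 'conservative politics', 'world news']

Notice that "world news" and the opposing political perspective never make it to the top, not because anyone decided to hide them, but simply because I never clicked on them. That is the bubble.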



A filter bubble is the bubble of content we're exposed to on social media and search engines, tailored to our interests and preferences. While personalized content may seem like a good thing, filter bubbles can be harmful. They can limit our exposure to different viewpoints and create an echo chamber of our own opinions. This can lead to polarization and reinforce existing biases, as people may only see information that confirms what they already believe. Filter bubbles can also have real-world consequences. For example, during the 2016 US Presidential election, many people were shocked by the results because they had been living in a filter bubble, only seeing news and information that reinforced their belief that Hillary Clinton would win. They were blindsided by the reality of the situation because they had not been exposed to differing viewpoints. Check out this article by Will Rinehart explaining the concept of filter bubbles within an election; it helps further explain the ideas I am discussing above.

https://medium.com/@willrinehart/the-election-of-2016-and-the-filter-bubble-thesis-in-2017-51cd7520ed7



So, what can we do to burst our filter bubbles? One solution is to intentionally seek out diverse viewpoints and opinions. This can include following people and organizations with different perspectives on social media and actively seeking out news sources that aren't aligned with our existing beliefs. I like to speak with friends on social media whose political or social opinions differ from my own. This throws off the algorithm's picture of me and lets me see more than just my own opinion reflected back. The quiz linked below will help you decide whether or not you are in a filter bubble...


https://www.pbs.org/independentlens/blog/do-you-live-in-a-news-bubble/ 


Filter bubbles on TikTok, Instagram, and many other platforms can show us only the things we're interested in, like makeup or dance videos, but we can't just see what we want all the time; that's not how the world works. So, we must understand what our bubbles may be and make a change in what we see online and what we talk about in person. For myself, I find that my social media pages show me Liberal political perspectives, and for fun show me makeup, dance, day-in-the-life vlogs, and more. Although this seems fine, I have found that I do not get news or updates on any social media except when a protest is broadcast over TikTok. This has left me trapped in a bubble with little to no perspective on other sides of politics, or on world news in general. SNL captured the idea of everyone living in their own online bubble really well in the skit linked below.





Although I can admit it is comfortable to stay in, and talk only about, a realm you are already interested in, it is probably not in your best interest as a person. There are plenty of resources out there to help us figure out what kind of bubble we have found ourselves in online. As you read this blog, please look through the articles and videos I have suggested, as well as the two other sources of information linked below.


Definition: 

Filter bubble: Created when news and information are filtered by algorithms, keeping people within a bubble of similar views – i.e., algorithm-driven (Chen, Michelle. 2023. Brock University. "Trapped by our digital footprints").

Echo chamber: When we surround ourselves with information and viewpoints that support our worldview – i.e., user-driven (note: we can have echo chambers without algorithms or tech) (Chen, Michelle. 2023. Brock University. "Trapped by our digital footprints").



Extra Info sources: 

Big Think video: How news feed bias supercharges confirmation bias:

https://www.youtube.com/watch?v=prx9bxzns3g&t=2s 

Habitual Generation of Filter Bubbles: Why is Algorithmic Personalisation Problematic for the Democratic Public Sphere?

https://www.tandfonline.com/doi/full/10.1080/13183222.2021.2003052
