Assess the Evidence for and Against the Existence of ‘Filter Bubbles’
DOI: https://doi.org/10.62051/ijsspa.v8n3.02

Keywords: Filter Bubbles, Algorithmic Personalization, Information Diversity, Confirmation Bias

Abstract
This paper critically assesses the evidence for and against the existence of "filter bubbles", a phenomenon in which algorithmic personalization limits users' exposure to diverse information. Drawing on theoretical discussions and empirical studies, the paper explores how user behavior, platform algorithms, and business models contribute to the formation of filter bubbles. While personalization can enhance the user experience, it may also reinforce confirmation bias and lead to ideological homogeneity, particularly in political contexts. Opposing views, however, highlight the lack of consensus on definitions and the insufficient empirical support for widespread filter bubble effects. The study concludes that although filter bubbles exist, their impact varies across platforms and users; active user behavior and cross-platform engagement can mitigate negative consequences, suggesting that filter bubbles are neither inevitable nor irreversible.
License
Copyright (c) 2025 International Journal of Social Sciences and Public Administration

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.