A systematic review published in the International Journal of Web Based Communities examines the relationship between YouTube's recommendation system and the circulation of polarised or misleading content. The review analysed 56 studies investigating how the platform's algorithms interact with material including political disinformation, health-related misinformation, and extremist content.
The research indicates that algorithms optimised to increase user engagement can, in some cases, correlate with patterns in which viewers are exposed predominantly to content that aligns with their existing beliefs. This phenomenon is often referred to as the “echo chamber effect.” Some experimental studies cited in the review suggest that sequences of recommended videos may influence attitudes within specific demographic groups.
Political content was the most frequently examined domain for polarisation, though other types of potentially harmful material were also included. The review highlighted diversity in research objectives and methods: approximately half of the studies focused on misinformation, while smaller numbers addressed non-political radicalisation or online toxicity. Several studies modelled these dynamics in order to identify potential mitigation strategies.
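The kind of modelling mentioned above can be sketched with a deliberately simple toy simulation. This is a hypothetical illustration, not a model taken from the review: items sit on a one-dimensional opinion axis, an engagement-greedy recommender serves whatever is closest to the user's current opinion, and the opinion drifts toward what is watched. The `explore_rate` parameter is an assumed stand-in for a diversification-style mitigation.

```python
import random

def simulate_exposure(steps=300, n_items=60, explore_rate=0.0, seed=1):
    """Toy echo-chamber model (illustrative only, not from the review).

    Items sit on an opinion axis in [-1, 1]. An engagement-greedy
    recommender serves the item nearest the user's current opinion;
    the opinion then drifts slightly toward each item watched. With
    explore_rate > 0 the recommender sometimes injects a random item,
    a stand-in for a diversification-based mitigation.

    Returns (shown_spread, catalogue_spread): the range of item
    positions actually recommended vs. the range available.
    """
    rng = random.Random(seed)
    items = [rng.uniform(-1.0, 1.0) for _ in range(n_items)]
    opinion = 0.0
    shown = []
    for _ in range(steps):
        if rng.random() < explore_rate:
            pick = rng.choice(items)  # mitigation: forced exploration
        else:
            # Engagement proxy: nearby items hold attention longer, so the
            # greedy recommender serves the closest item (plus small noise).
            pick = min(items, key=lambda x: abs(x - opinion) + rng.uniform(0, 0.1))
        shown.append(pick)
        opinion += 0.1 * (pick - opinion)  # mild attitude drift toward content

    spread = lambda xs: max(xs) - min(xs)
    return spread(shown), spread(items)

if __name__ == "__main__":
    narrow, available = simulate_exposure(explore_rate=0.0)
    wide, _ = simulate_exposure(explore_rate=0.2)
    print(f"greedy spread:    {narrow:.2f} of {available:.2f} available")
    print(f"with exploration: {wide:.2f} of {available:.2f} available")
```

With `explore_rate=0.0` the recommended items occupy only a narrow band of the catalogue, while even modest forced exploration widens exposure substantially, which is the intuition behind diversification as a candidate mitigation.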
The review identified several gaps in the literature; for instance, few studies have considered the role of monetisation or financial incentives in shaping recommended content. By contrast, multi-platform analyses are increasingly common, reflecting recognition that content originating on YouTube can be shared across other social media and messaging platforms, extending its visibility beyond the platform itself.
Researchers emphasise the distinction between polarisation, where opinions may become more extreme, and misinformation, where inaccurate or misleading claims are shared. They also note the importance of considering algorithmic design, user behaviour, and economic factors together when assessing the broader societal implications of recommendation systems.
YouTube has implemented measures, including policy updates and fact-checking initiatives, intended to address problematic content, though the review notes that challenges remain.
Almeida, L.G., Garcia, A.C.B. and Simões, J.E. (2025) ‘Polarisation, filter bubbles and radicalisation on YouTube: a systematic literature review’, Int. J. Web Based Communities, Vol. 21, No. 4, pp.324–347.