Introduction
The 2024 election highlighted a substantial shift in how political information is shared. Gen-Z TikTok users rely extensively on the app for political content: 48% of TikTok users in one survey reported using TikTok to stay informed on news, and both left-wing and right-wing accounts are leveraging TikTok's reach to engage young voters (Beshay, 2024). Presidential campaigns such as KamalaHQ and individual accounts such as Brett Cooper's have relied on TikTok to appeal to young voters, and TikTok has become a "battleground" (Lynch, 2024) between right-wing and left-wing campaigns. As TikTok displaces traditional sources such as news articles and academic research as a primary source of information, questions arise about the credibility of that information. While many Gen-Zers have critiqued traditional news outlets for censorship in their coverage of global conflicts, relying on social media for news raises its own questions about the reliability of short-form content (Odejimi, 2024), particularly on TikTok.
Chapter One of Safiya Noble’s Algorithms of Oppression similarly suggests that users may not always trust the most reliable content, since users often believe that the top posts are “either the most popular or the most credible or both” (Noble, 2018, p. 32). These questions about the credibility of digital information, coupled with the ways social media algorithms may promote certain views, led our group to explore the frequency of left-wing, neutral, and right-wing media on TikTok. We focus in particular on how traditional news outlets may promote left- or right-wing content, and on what different levels of engagement with these posts may mean for the presence of left-wing and right-wing content on TikTok. Through this, we aim to examine the political polarization visible on TikTok and what the type of political content that performs well there may reveal about algorithmic political biases on the platform.
Digital platform auditing allows us to continuously review and reflect on the biases present in the platforms we use. Given the impact digital platforms can have on political attitudes and views, our TikTok audit aims to interrogate users' reliance on social media accounts for political information. This auditing process helps us understand the biases embedded in the algorithms we use and how those biases may contribute to the spread of misinformation or increasingly polarized views.