Doom scrolling: TikTok, the algorithm and consent

Content warning: This post includes mentions of transphobia, homophobia, far-right ideology and sexual assault, which some readers might find distressing. If you need help or assistance, please contact 1800 RESPECT.

Young people are learning a lot from TikTok at the moment. Whether it’s random scientific facts or something political, people are finding content on subjects they are interested in, and where formal schooling falls short.

This can be great – particularly when discussing sexual consent. Sex education beyond “use a condom” varies between schools, leaving something to be desired, particularly on topics such as queer sex, female pleasure and consent. Platforms like TikTok have played a role in filling this gap, with many creators – both professionals and members of the general public – sharing their knowledge and experiences.

However, users should remain wary of both the array of creators and the very way the platform works – and that’s something we should talk about on Safer Internet Day.

Alongside all of the informative content available on TikTok, there are also many people who create, endorse and share videos featuring misogynistic content, including rape apology, justification and sympathy for perpetrators. When the internet is filling a gap in education, this content can become pretty influential, moulding and shaping the opinions, attitudes and actions of young people. Some of it can weave into your feed if you’ve shown even the vaguest interest in the topic.

TikTok’s algorithm

This is where TikTok itself becomes part of the problem. The algorithm behind the “For You” feed harnesses all of your watch history to best cater to your tastes. While this feature keeps you entertained and on the app for longer, it can also intensify the content that it shows you.

A recent study by Media Matters for America found that engaging with transphobic content could lead users down a “rabbit hole” to far-right content in a matter of hours.

“Of the 360 total recommended videos included in our analysis, 103 contained anti-trans and/or homophobic narratives, 42 were misogynistic, 29 contained racist narratives or white supremacist messaging, and 14 endorsed violence.”

The report provides a number of examples of the dangerous content that exists on the platform.

“A user could feasibly download the app at breakfast and be fed overtly white supremacist and neo-Nazi content before lunch.”

This is important to keep in mind with consent education on the platform – there are just as many toxic narratives as there are positive ones. It’s something all users need to be aware of when having their well-deserved scroll at the end of the day.

So how can we keep TikTok safe, informative and fun?

The first thing to do is make sure the accounts you follow are aligned with your values, so that the algorithm learns the kind of content you want. You should also diversify your feed to ensure you encounter a range of ideas from different perspectives. Then, if and when you see anything that promotes rape culture, report the videos to get them out of your feed and to protect others as well. And if you’re a creator, use your platform effectively!