
TikTok has become the newest social media app of choice, especially among young people. According to Business of Apps, 60 per cent of TikTok users in the US are aged 16-24. But this app is not all funny memes and silly videos. TikTok has a darker side that many users aren’t aware of.
Multiple research studies have shown social networking sites are quickly becoming the main way young people receive their news. It is a common misconception that young users are more tech savvy and therefore less vulnerable to misinformation online. One Stanford University study found 82 per cent of middle-schoolers could not tell the difference between “sponsored content” and a real news story on a website. The study, which included 7,804 students from middle school through college, concluded that younger audiences are not able to judge the credibility of news online.
Most of the popular social media apps have been in circulation for years and have developed strategies and transparency measures to combat misinformation and make their platforms safe for users. TikTok, however, is a fairly new app and does not have the same kind of regulation.
What’s wrong with the algorithm?
TikTok, like Facebook and YouTube, serves content to its users through an algorithm. According to TikTok’s website, the algorithm, which the company calls a “recommendation system,” collects data on the content a person interacts with; the resulting feed is the “For You” page. However, the website does not explicitly reveal how the app collects data or what data it uses to build that feed. TikTok did not respond to requests for comment.
What’s concerning about this app is the amount of misinformation and disinformation circulating through people’s feeds. According to Jane Lytvynenko, Buzzfeed’s senior reporter on disinformation, TikTok’s algorithm is dangerous because once people interact with certain hashtags or sounds out of interest, they will be served only those types of videos on their feed.
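The interaction-driven narrowing Lytvynenko describes can be sketched as a toy feedback loop. The topics, weights and doubling rule below are purely hypothetical illustrations; TikTok does not disclose how its system actually works:

```python
import random

# Toy model of an interest-reinforcing feed (hypothetical weights,
# NOT TikTok's actual recommendation system).
class ToyFeed:
    def __init__(self, topics):
        # Every topic starts with equal weight.
        self.weights = {t: 1.0 for t in topics}

    def recommend(self, n=10):
        topics = list(self.weights)
        w = [self.weights[t] for t in topics]
        # The feed is sampled in proportion to the current weights.
        return random.choices(topics, weights=w, k=n)

    def interact(self, topic):
        # Each like/share/rewatch boosts that topic, crowding out the rest.
        self.weights[topic] *= 2.0

feed = ToyFeed(["news", "memes", "conspiracy", "sports"])
for _ in range(5):
    feed.interact("conspiracy")  # the user taps a few conspiracy videos

share = feed.weights["conspiracy"] / sum(feed.weights.values())
print(f"conspiracy now fills ~{share:.0%} of the feed")  # ~91%
```

Even in this crude sketch, five interactions are enough to make one topic dominate the feed, which is the dynamic the article is warning about.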
Lytvynenko says a lot of false information on the app is video based, which makes it harder to debunk. “Community leaders such as influencers, influential accounts or even something that just goes viral will lend credibility to this disinformation, making it very believable to the audience,” she says.
She also says that although the misinformation is challenging to fact-check, the danger really lies in the social communities the user belongs to. Influencers (who live up to their title) have a tremendous amount of impact on their audiences, especially if those audiences are young and vulnerable.
Filter bubbles and echo chambers
The types of interactions enabled by TikTok automatically group users into filter bubbles. According to C. Thi Nguyen, an assistant professor of philosophy at Utah Valley University, filter bubbles occur when online algorithms isolate a person from perspectives and information they haven’t already expressed an interest in. Filter bubbles can then lead people into echo chambers, which occur when a person only receives information that reinforces their worldview and beliefs. The key difference between an echo chamber and a filter bubble is that someone in an echo chamber has a deep mistrust of outside sources.
TikTok makes it very easy for people to become stuck in echo chambers, creating a complete mistrust of outside sources. Even if users receive accurate information that opposes their views, they may never accept it. In fact, opposing information can strengthen the echo chamber. A study by Northwestern University psychologist David Rapp reveals that even when we know better, our brains rely on inaccurate or misleading information if that is what we remember. The study finds this is because people tend to quickly store inaccurate or false statements in memory, since doing so is easier than critically analyzing what they’ve heard. He says that if information is available, people tend to think they can rely on it; but just because you can remember what someone said doesn’t make it true.
Young people who look for news on social media can easily fall into echo chambers because the algorithm will surface news the user has interacted with before. The danger is that people often don’t know they’re in an echo chamber. And once you’re in one, it is hard to get out.
Political Disinformation
Harsimran Sekhon, an avid 16-year-old TikTok user, says she notices disinformation on the app relating to right-wing politics. “The only disinformation I’ve honestly seen are people that have very strong political views, mostly just Trump supporters who like to spread false news about other groups of people.”
Sekhon says there are a lot of young Trump supporters on TikTok with thousands of followers who make short videos with racist content. Their content receives millions of views because they have a large following. Although TikTok has stated it has implemented measures to reduce the circulation of racist content, there are still ways to cheat the system: creators posting racist videos can simply use different language to get their point across. As Jane Lytvynenko noted, videos are extremely difficult to debunk.
During the 2020 U.S. presidential election, popular hashtags such as #CrookedBiden, #RiggedElection and #StoptheSteal emerged but were quickly banned from the platform. Unfortunately, there are easy ways around such bans: simply change or remove a letter, or use similar words in the hashtag.
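The evasion tactic described above works because a filter that matches hashtags exactly only catches the literal banned strings. A minimal sketch of the problem (the blocklist and matching logic here are illustrative assumptions, not TikTok’s actual moderation code):

```python
# Sketch of why exact-match hashtag bans are easy to evade
# (hypothetical blocklist, for illustration only).
BANNED = {"#riggedelection", "#stopthesteal"}

def is_blocked(hashtag: str) -> bool:
    # Naive exact-match filtering after lowercasing.
    return hashtag.lower() in BANNED

print(is_blocked("#StopTheSteal"))   # True: the literal hashtag is caught
print(is_blocked("#StopTheSteaI"))   # False: capital I swapped for lowercase l
print(is_blocked("#Stoppthesteal"))  # False: one extra letter added
```

Any single-character change defeats the filter, which is why banned hashtags tend to reappear under dozens of near-identical spellings.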
The TikTok community guidelines state the company is committed to keeping misleading, harmful, and deceptive content and accounts off the app. In August, a press release announced three new measures to combat misinformation and disinformation: providing better clarity on what is and isn’t allowed on the app, broadening its fact-checking partnerships, and working with the U.S. Department of Homeland Security to protect against foreign influence.
Is this enough to keep misinformation off the app?
There is more than one way to combat misinformation on the internet. A report by the Reboot Foundation outlines key ways misinformation can be prevented and addressed. The most important is media literacy: simple interventions by users, such as reading more on a topic and fact-checking what they see to determine whether it is trustworthy. However, we cannot expect every individual to do this with every piece of information they come across. The Reboot Foundation states that government plays an important role in developing a healthy public media sphere. Without resorting to state censorship, governments can provide financial resources to schools and businesses to improve media literacy. Equally important is the responsibility tech companies have to monitor and remove information on their platforms that puts users at risk.
There are even people on the app who have taken matters into their own hands to debunk misinformation. Laurel Bristow, known as @kinggutterbaby on TikTok, is a clinical research coordinator at Emory’s vaccine and treatment evaluation unit. When the coronavirus pandemic began producing its fair share of conspiracy theories, critics and deniers, she took to the app to use her expertise to debunk false claims. Her videos, with the occasional filter and millennial humour, appeal to younger generations, earning their trust and building their confidence in accurate information.

