The Wall Street Journal created dozens of automated accounts that watched hundreds of thousands of videos to reveal how the TikTok algorithm knows you so well.
A Wall Street Journal investigation found that TikTok only needs one important piece of information to figure out what you want: the amount of time you linger over a piece of content. Every second you hesitate or rewatch, the app is tracking you.
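The linger signal described above can be sketched as a toy scoring loop. Everything here is an illustrative assumption — the function name, the weights, and the idea of crediting hashtags are hypothetical, not TikTok's actual implementation — but it shows how watch time alone, with no likes or follows, can separate interests.

```python
# Toy sketch of a watch-time interest model (illustrative only —
# the event names, weights, and hashtag crediting are assumptions,
# not TikTok's code).
from collections import defaultdict

def update_interest_scores(scores, video_hashtags, watch_seconds,
                           video_length, rewatches=0):
    """Credit each hashtag on a video with a 'linger' signal:
    the fraction of the video watched, boosted by rewatches."""
    completion = min(watch_seconds / video_length, 1.0)
    signal = completion * (1 + rewatches)
    for tag in video_hashtags:
        scores[tag] += signal
    return scores

scores = defaultdict(float)
# A user lingers on a 35-second #sad video and watches it twice...
update_interest_scores(scores, ["#sad", "#breakup"], 35, 35, rewatches=1)
# ...then swipes past a #dance video after 2 seconds.
update_interest_scores(scores, ["#dance"], 2, 30)
# #sad now far outweighs #dance, so similar videos would rank higher.
assert scores["#sad"] > scores["#dance"]
```

The point of the sketch: the app never needs an explicit signal. A hesitation or a rewatch is enough to tilt the scores, and a ranker that favors high-scoring hashtags will then serve more of the same.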
Photo illustration: Laura Kammermann/The Wall Street Journal
WSJ video investigations use visual evidence to reveal the truth behind the most important stories of the day.
Inside TikTok’s Highly Secretive Algorithm
This WSJ video investigation reveals how the video-centric social network is so good at figuring out interests you never expressly tell it.
More from the Wall Street Journal:
Visit WSJ.com: http://www.wsj.com
Visit the WSJ Video Center: https://wsj.com/video
On Facebook: https://www.facebook.com/pg/wsj/videos/
On Twitter: https://twitter.com/WSJ
On Snapchat: https://on.wsj.com/2ratjSM
#WSJ #TikTok #Algorithm
- Okay, has anyone else
noticed that your For You page has been a little too accurate lately? It hasn't been things that I'll google or I talk about. It's been thoughts. - TikTok knows everything about us. ♪ Hold up, don't scroll ♪ ♪ Lemme ask you something first ♪ ♪ Can someone please explain ♪ ♪ How this algorithm works ♪ - [Narrator] TikTok users often wonder how the world's fastest-growing
social network seems to know them so well. - Is TikTok secretly listening to us while we're watching videos? I don't know. (upbeat music) - [Narrator] The answer to
how this app gets to know you so intimately is a highly
secretive algorithm, long guarded by TikTok's China-based parent company, ByteDance. - TikTok has been so successful in terms of implementing their algorithms. - TikTok's algorithm could
influence the thinking of US youth. (upbeat music) - [Narrator] To understand
how it knows users so well, The Wall Street Journal created over 100 automated TikTok accounts, or bots, that watched hundreds of
thousands of videos on the app. We also spoke to current
and former executives at the company. Officially, the company says that shares, likes, follows and what
you watch all play a role in what TikTok shows you. We found that TikTok only needs one of these to figure you out. How long you linger
over a piece of content. Every second you hesitate or rewatch, the app is tracking you. - I just wanna quiet the noise. - [Narrator] Through
this one powerful signal, TikTok learns your most
hidden interests and emotions. And drives you deep into
rabbit holes of content that are hard to escape. (noise of voices) ♪ I'm happy, happy guy ♪ ♪ Oh, just a happy, happy, happy ♪ - [Narrator] The TikTok experience starts the same way for everyone. Open the app and you'll
immediately see an endless string of videos in your For You feed. Take this new user, a 24-year-old from Henry County, Kentucky. (jumbled music plays) TikTok starts by serving
the account a selection of very popular videos vetted by app moderators. Is this person religious? - Because I still have a purpose. And you still hold a plan for my life. - [Narrator] Do they wanna
participate in viral dances? (upbeat music) ♪ Hey, hey, who ♪ (upbeat music) - [Narrator] Are they feeling down lately? - [TikToker] Just
remember, I loved you once. And that love goes for a friend, family, or any relationship. - [Narrator] What TikTok doesn't know is that the 24-year-old from Kentucky isn't a person at all. It's one of the bot accounts programmed by The Wall Street Journal. Let's call it kentucky_96. We set up these accounts to understand how TikTok figures out
your unexpressed interests. We assigned each bot a date
of birth and an IP address, which told TikTok their location. None were given a gender. We gave each bot or user interests but those interests were
never entered into the app. The only way our users
expressed their interests was by rewatching or pausing on videos with related hashtags or images. Some were into extreme sports. Others were interested in forestry. Or dance. Or astrology. - I'm not the babysitter,
I'm not the parent. - [Narrator] Or some other topic. - [TikToker] Keep scrolling
if you hate animals. - [Narrator] For all our accounts, we found that TikTok
draws users in at first by serving a wide variety of videos. Many with millions of views. Then as the algorithm
sees what you respond to, the selection of videos
and the view counts can get lower and lower, with fewer of them vetted by moderators to see if they violate
TikTok's terms of service. We reviewed our experiment and its results with a data scientist, an algorithm expert, Guillaume Chaslot, a former Google engineer who worked on YouTube's algorithm. He's now an advocate for
algorithm transparency. He says TikTok is different from other social media platforms. - The algorithm on TikTok
can get much more powerful and it can be able to learn your vulnerabilities much faster. - [Narrator] In fact,
TikTok fully learned many
of our accounts' interests
in less than two hours. Some it figured out in
less than 40 minutes. - [Guillaume] On YouTube, more
than 70% of the views come from the recommendation engine. So it's already huge. But on TikTok, it's even worse. It's probably like 90-95% of the content that is seen that comes from the recommendation engine. (dramatic music) - [Narrator] This is a visualization made from hashtags attached to
the videos our bots watched. Think of it as a partial view of the universe of
TikTok content. Here's where we found dance videos, over here are the cooking videos. The spindly arms stretching out of the center represent niche content areas. This arm starts with general videos of cute animals, but if we follow it out to the end, we find more specific videos for enthusiasts of French bulldogs. As kentucky_96 starts its journey, it starts moving around
within the mainstream where TikTok is trying to
puzzle out what it wants. We programmed kentucky_96 to be interested in
sadness and depression. Let's see how long it takes
TikTok to figure that out. - [TikToker] Life doesn't happen to you, life happens for you. So if life is taking
people away from your life, and putting new ones in. - [Narrator] Less than three
minutes into using TikTok, at its 15th video, kentucky_96 pauses on this. - [TikToker] And that
love goes for a friend, family or any relationship. - [Narrator] Kentucky_96 watches
the 35-second video twice. Here TikTok gets its first inkling that perhaps the new user
is feeling down lately. - [TikToker] Whoever comes, let them come. Whoever stays, let them stay. Whoever goes, let them go. - [Narrator] The information
contained in this single video provided the app with important clues. The author of the video, the audio track, the video description, the hashtags. After kentucky_96's first sad video, TikTok serves another one 23 videos later. Or after about four more
minutes of watching. - [TikToker] I'll leave
you alone from now on if that's what you
want. - [Narrator] This one is a breakup video with the hashtag #sad. - [TikToker] Do you know why I leave you alone? 'Cause I care about your
feelings more than mine. - [Narrator] TikTok's still
trying to suss out this new user with more high view count videos. Does the new user wanna watch
videos about friendship? (upbeat music) (playful music) Or to laugh at funny fail videos? ♪ Oh no, oh no, no, no, no, no ♪ - [TikToker] Nobody's gonna know. - [TikToker] They're gonna know. (dramatic music) - [Narrator] Or do they like
videos about home repairs? (dramatic music) (upbeat music) Other information from your phone, including location can impact the videos that are shown in a user's feed. - And as a Kentuckian, I never thought I'd lose my freedom over a virus. - [Narrator] For instance,
kentucky_96 saw lots about Kentucky but whether
or not it keeps showing you that type of video depends
on your response to it. A TikTok spokeswoman said the app does not listen to your microphone or read text messages to
serve you personalized videos. (pensive music) At video 57, kentucky_96
keeps watching a video about heartbreak and hurt feelings. (pensive music) And then at video 60, watches one about emotional pain. (dramatic music) Based on the videos we watched so far, TikTok thinks that maybe this user wants to see more about love,
breakups and dating. So at about 80 videos and 15 minutes in, the app starts serving
more about relationships. But kentucky_96 isn't interested. - [TikToker] Your
voice, your smile, your eyes, your laugh. - [Narrator] The user
instead pauses on one about mental health. Then quickly swipes past
videos about missing an ex. - [TikToker] I miss you and
I like having you around. - [Narrator] Advice about moving on and how to hold a lover's interest. - [TikToker] He spends
more time on his phone when he's around you. - [Narrator] Kentucky_96
lingers over this video containing the hashtag #depression. - [TikToker] Something's wrong with me. (sad music) And these videos about
suffering from anxiety. - [TikToker] It's like a reward. - [Narrator] After 224 videos
into the bot's overall journey or about 36 minutes of total watch time, TikTok's understanding of
kentucky_96 takes shape. Videos about depression
and mental health struggles outnumber those about
relationships and breakups. From here on, kentucky_96's
feed is a deluge of depressive content. 93% of videos shown to the account are about sadness or depression. - People been lookin' at me. I'm just
like what you looking at? - [Narrator] A TikTok spokeswoman said that some of the remaining 7% of videos are to help the user
discover different content. But for kentucky_96, such
videos were few and far between. The majority of videos
it was shown outside of its depressive rabbit hole were ads. A TikTok spokeswoman said
that the simulated activity generated by The Wall
Street Journal's bots is not representative
of real user behavior because humans have a
diverse set of interests. But even some of our accounts with diverse interests rabbit holed. We showed our data and
many of the videos seen by kentucky_96 to Chaslot. - What we see on TikTok is a bit the same as what we saw on YouTube. So basically, the algorithm is detecting that this depressing content is useful to create engagement and pushes depressing content. So the algorithm is pushing people towards more and more extreme content, so it can push them toward
more and more watch time. - [Narrator] TikTok
also says it allows you to see less of something by selecting the Not Interested button, but Chaslot says that's not enough. - The algorithm is able to find the piece of content that you're vulnerable to, that will make you click, that will make you watch but it doesn't mean you really like it and that it's the content
that you enjoy the most. It's just the content
that's the most likely to make you stay on the platform. - [Narrator] Our bots
only escaped rabbit holes when we changed their viewing interests. When we
told one bot
to stop watching videos about ADHD, the algorithm
cut back on that content. Still, many of The Journal's bots were rapidly pushed
deep into rabbit holes. TikTok learned our bots'
most far-flung interests, like astrology, but even bots with general mainstream
interests got pushed to the margin as the recommendations got more personalized and narrow. Bots with an interest in sexual content wound up way out here watching hashtag #kingtok videos about sexual power dynamics. And our bot with a general
interest in politics wound up being served videos about election conspiracies and QAnon. - Impeach Biden and Kamala. It goes from the House to the Senate. - [Narrator] Deep in the
niche worlds of TikTok, users are more likely to encounter potentially harmful content that is less vetted by moderators and violates the app's terms of service. - [TikToker] Make them angry and sad. They would be so much happier without you. - [Narrator] A TikTok spokeswoman said that the company catches a lot of banned content, which passes through
both computer analysis and human moderators. It also reviews videos reported by users. - [TikToker] Don't tell
them goodbye, just go. - [TikToker] It's not my time. Pew. (TikToker laughing) - [Narrator] There's a lot of fun, silly and life-affirming content on TikTok. (TikToker screaming) But while TikTok can draw
out what makes you laugh. (TikToker humming) It can also make you
wallow in your darkest moods. - Turning the pain from
mental to physical
works. - [Narrator] Without ever
needing to eavesdrop on you or collect any personal
information about you. - I've never attempted
suicide or anything. I've never let it get that far. - Whether it's on TikTok, on Facebook, on YouTube, we're interacting with algorithms in our everyday life more and more. We are training them
and they're training us. So we have to study this so we understand it better and we don't let it go in directions that are harmful to society or to certain groups of people.
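The bot methodology the video describes — interests assigned but never typed into the app, expressed only by rewatching or pausing on videos with matching hashtags or images — could be approximated with a viewing policy like the hypothetical sketch below. The Journal has not published its actual bot code, so every name and the hashtag-matching heuristic here are assumptions (and, as one commenter notes, matching on hashtags alone could bias which videos count as a hit).

```python
# Hypothetical sketch of a WSJ-style bot's viewing policy.
# Interests are never entered into the app; the bot only lingers on
# videos whose hashtags match its assigned interests. Illustrative only.

def viewing_policy(video_hashtags, assigned_interests):
    """Decide how the bot behaves on a video: 'linger' (watch fully,
    possibly rewatch) if any hashtag matches an assigned interest,
    otherwise 'swipe' past quickly."""
    if any(tag.lstrip("#").lower() in assigned_interests
           for tag in video_hashtags):
        return "linger"
    return "swipe"

# A bot like kentucky_96, assigned sadness and depression:
interests = {"sad", "sadness", "depression", "mentalhealth"}
feed = [
    ["#dance", "#fyp"],        # popular dance video
    ["#sad", "#breakup"],      # breakup video tagged #sad
    ["#homerepair"],           # home-repair video
]
actions = [viewing_policy(tags, interests) for tags in feed]
# → ['swipe', 'linger', 'swipe']
```

Under this kind of policy, the only signal the platform ever receives is dwell time — which is exactly the point of the experiment: that signal alone was enough for the feed to converge on the assigned interest.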
Comments
“We created 100 bots” is a very creative way to describe 10 summer interns.
"We're training them and they are training us" well said.
The youtube algorithm is as powerful as tiktok's in landing me on this WSJ report.
Apart from the content itself, the visual and graphics of this video are incredible
“We made a bot who is 24. He is depressed.” Haha yeah he’s still kinda gen-z
Man I feel depressed watching Kentucky swipe
I believe not enough people are discussing the dangers of algorithms that have been tasked with “increasing engagement”
This is super interesting but I would be curious to know how the “automated bots” were programmed to identify and interact with certain types of content. I’d imagine that if these bots were only looking at hashtags for example to identify a specific kind of content, this could bias the results.
Sounds like any other algorithm
YouTube’s algorithm didn’t work well. It always recommends me videos that I don’t want to watch.
One thing I like about youtube is that when I use a vpn, it changes a lot of recommended content. Having lived in different countries and speaking different languages, that is a way to stop it from pushing only one language material on me. Now as far as tiktok goes, it's algorithm is a sticky demon, if it learns something about you, it will almost never let it go
We definitely need to see a trend of having algorithms aid the users rather than algorithms dictating the users. YouTube and Instagram already has me addicted deeply in their platforms that it could be my fulltime job to use their platforms while rationally I would never choose that.
YouTube recommending this video like we don't know they're doing the exact same thing!!!!!!
In the past our personalities were shaped by the people we met, now personalities are shaped by algorithms
And by watching this video till the end, youtube will give more video from WSJ (and topic about tiktok)
The tik tok algorithm never seemed to figure out what I like... even after weeks. It would show me some basic things like "foraging videos" which are kind of interesting to me, but nothing that really hit me and made me feel "at home" which I wager is what it's trying to do. At some point it started showing me animals getting abused??? Which is what made me delete it and not want to go back.
I hate how the conversation has mostly been around TikTok when it comes to algorithms and the boxes social media puts us in, TikTok only has the best filters for finding interest but social media companies have influenced us way more than we realize by pushing everyone to the extremes of their interest. While they build these systems to make the most amount of money people are push more to content that sometimes has no basis in actual reality and when that is the only content you are seeing its not hard to see how that can seem exactly like reality to the user.
I'm kind of impressed by WSJ's technical capabilities. Right on.
I personally feel that these algorithms should be designed to challenge the views and opinions of people, not further confirm their biases.
editing level is on other level...🔥