Welcome to my For You page, where chaos reigns.
For You: The worst things that have ever happened to everyone else

TikTok's all-powerful, all-knowing algorithm appears to have decided that I want to see some of the most depressing and disturbing content the platform has to offer. My timeline has become an endless doomscroll. Despite TikTok's claims that its mission is to "bring joy," I am not getting much joy at all. What I am getting is a glimpse at just how aggressive TikTok is when it comes to deciding what content it thinks users want to see and pushing it on them. It's a bummer for me, but potentially harmful to users whose timelines become filled with triggering or extremist content or misinformation.

This is a problem with pretty much every social media platform, as well as YouTube. But with TikTok, it feels even worse. The platform's algorithm-centric design sucks users into that content in ways its rivals simply don't. And those users tend to skew younger and spend more time on TikTok than they do anywhere else.

To give you a sense of what I'm working with here, my For You page (that's TikTok's front door, a personalized stream of videos based on what its algorithm thinks you'll like) is full of people's stories about the worst thing that has ever happened to them. Sometimes they talk to the camera themselves, sometimes they rely on text overlays to tell the story for them while they dance, sometimes it's photos or videos of them or a loved one injured and in the hospital, and sometimes it's footage from Ring cameras that show people accidentally running over their own dog. Dead parents, dead children, dead pets, domestic violence, sexual assault, suicides, murders, electrocutions, illnesses, overdoses: if it's terrible and someone has a personal story to tell about it, it's probably in my For You feed. I have somehow fallen into a rabbit hole, and it is full of rabbits that died before their time.
The videos often have that distinctive TikTok style that adds a layer of surrealness to the whole thing, often with the latest music meme. Videos are edited so that Bailey Zimmerman sings "that's when I lost it" at the exact moment a woman reacts to finding out her mother is dead. Tears run down flawless, radiant, beauty-filtered cheeks. Liberal use of TikTok's text-to-speech feature means a cheerful robot-y woman's voice might be narrating the action. "Algospeak," code words meant to get around TikTok's moderation of certain topics or keywords, tells us that a boyfriend "unalived" himself or that a father "$eggsually a[B emoji]used" his daughter. Oh, I also get a lot of ads for mental health services, which makes sense considering the kind of person TikTok seems to think I am.

TikTok is designed to suck you in and keep you there, starting with its For You page. The app opens automatically to it, and the videos autoplay. There's no way to open to the feed of accounts you follow or to disable the autoplay. You have to opt out of watching what TikTok wants you to see. "The algorithm is taking advantage of a vulnerability of the human psyche, which is curiosity," Emily Dreyfuss, a journalist at the Harvard Kennedy School's Shorenstein Center and co-author of the book Meme Wars, told me.

Watch time is believed to be a major factor in what TikTok decides to show you more of. When you watch one of the videos it sends you, TikTok assumes you're curious enough about the subject to watch similar content, and feeds it to you. It's not about what you want to see; it's about what you'll watch. Those aren't always the same thing, but as long as it keeps you on the app, that doesn't really matter. That ability to figure out who its users are and then target content to them based on those assumptions is a major part of TikTok's appeal. The algorithm knows you better than you know yourself, some say.
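The watch-time feedback loop described above can be sketched as a toy model. Everything here is hypothetical: the topic names, the completion-rate signal, and the scoring scheme are invented for illustration, and TikTok's actual ranking system is proprietary and far more complex.

```python
import random


class ToyForYouFeed:
    """A toy watch-time-weighted feed. Purely illustrative; this is not
    TikTok's real algorithm, just the feedback loop it is reported to use."""

    def __init__(self, topics):
        self.topics = topics
        self.scores = {t: 1.0 for t in topics}  # every topic starts equal

    def next_video(self):
        # Pick a topic with probability proportional to its accumulated score.
        weights = [self.scores[t] for t in self.topics]
        return random.choices(self.topics, weights=weights, k=1)[0]

    def record_watch(self, topic, seconds_watched, video_length):
        # Completion rate is the only signal: finishing a video boosts its
        # topic whether or not the viewer "liked" it.
        self.scores[topic] += seconds_watched / video_length


random.seed(0)
feed = ToyForYouFeed(["dogs", "tragedy", "cooking"])
# Rubbernecking: watch sad videos to the end, swipe away from everything else.
for _ in range(50):
    topic = feed.next_video()
    watched = 30 if topic == "tragedy" else 2
    feed.record_watch(topic, seconds_watched=watched, video_length=30)
# "tragedy" now dominates the sampling weights, with no likes or follows needed.
```

Even with no likes, comments, or follows, the mere act of watching to the end reshapes the weights, which is the "it's not about what you want to see, it's about what you'll watch" dynamic in miniature.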
One reporter credited TikTok's algorithm with knowing she was bisexual before she did, and she's not the only person to do so. I thought I didn't like what TikTok was showing me, but I had to wonder if perhaps the algorithm picked up on something in my subconscious I didn't know was there, something that really wants to observe other people's misery. I don't think this is true, but I am a journalist, so ... maybe?

I'm not the only TikTok user who is concerned about what TikTok's algorithm thinks of them. According to a recent study of TikTok users and their relationship with the platform's algorithm, most TikTok users are very aware that the algorithm exists and of the significant role it plays in their experience on the platform. Some try to create a certain version of themselves for it, what the study's authors call an "algorithmized self." It's like how, on other social media sites, people try to present themselves in a certain way to the people who follow them. It's just that on TikTok, they're doing it for the algorithm. Aparajita Bhandari, the study's co-author, told me that many of the users she spoke to would like or comment on certain videos in order to tell the algorithm that they were interested in them and get more of the same. "They had these interesting theories about how they thought the algorithm worked and how they could influence it," Bhandari said. "There's this feeling that it's like you're interacting with yourself."

In fairness to TikTok and my algorithmized self, I haven't given the platform much to go on. My account is private, I have no followers, and I only follow a handful of accounts. I don't like or comment on videos, and I don't post my own. I have no idea how or why TikTok decided I wanted to spectate other people's tragedies, but I've definitely told it that I will continue to do so, because I've watched several of them. They're right there, after all, and I'm not above rubbernecking. I guess I rubbernecked too much.
I'll also say that there are valid reasons why some of this content is being uploaded and shared. In some of these videos, the intent is clearly to spread awareness and help others, or to share a story with a community the creator hopes will be understanding and supportive. And some people just want to meme tragedy, because I guess we all heal in our own way.

This made me wonder what this algorithm-centric platform is doing to people who may be harmed by falling down the rabbit holes their For You pages all but force them down. I'm talking about teens seeing eating disorder-related content, which the Wall Street Journal recently reported on. Or extremist videos, which aren't all that difficult to find and which we know can play a part in radicalizing viewers on platforms that are less addictive than TikTok. Or misinformation about Covid-19 vaccines. "The actual design choices of TikTok make it exceptionally intimate," Dreyfuss said. "People say they open TikTok, and they don't know what happens in their brain. And then they realize that they've been looking at TikTok for two hours."

TikTok is quickly becoming the app people turn to for more than just entertainment. Gen Z users are apparently using it as a search engine, though the accuracy of the results seems to be an open question. They're also using it as a news source, which is potentially problematic for the same reason. TikTok wasn't built to be fact-checked, and its design doesn't lend itself to adding context or accuracy to its users' uploads. You don't even get context as simple as the date the video was posted. You're often left to try to find additional information in the video's comments, which are under no obligation to be true. TikTok now says it's testing ways to ensure that people's For You pages have more diversified content.
I recently got a prompt after a video about someone's mother's death from gastric bypass surgery asking how I "felt" about what I just saw, which seems to be an opportunity to tell the platform that I don't want to see any more stuff like it. TikTok also has rules about sensitive content. Subjects like suicide and eating disorders can be shared as long as the videos don't glamorize them, and content that features violent extremism, for instance, is banned. There are also moderators hired to keep the really awful stuff from surfacing, sometimes at the expense of their own mental health.

There are a few things I can do to make my For You page more palatable to me, but they require far more effort than it took to get the content I'm trying to avoid in the first place. Tapping a video's share button and then "not interested" is supposed to help, though I haven't noticed much of a change after doing this many times. I can look for topics I am interested in and watch and engage with those videos or follow their creators, the way the people in Bhandari's study do.

I also uploaded a few videos to my account. That seems to have made a difference. My videos all feature my dog, and I soon began seeing dog-related videos in my feed. This being my feed, though, many of them were tragic, like a dying dachshund's last photoshoot, or a warning not to let your dogs eat corn cobs with a video of a man crying and kissing his dog as she prepares for a second surgery to remove the corn cob he fed her. Maybe, over time, the happy dog videos I'm starting to see creep onto my For You page will outnumber the sad ones. I just have to keep watching.

– Sara Morrison, senior reporter