A new book examines the "invisible rulers" who manipulate your attention online.
Why lying on the internet keeps working

About a month ago, I [wrote about]() a viral book of "lost" herbal remedies that had, at the time, sold 60,000 copies on the TikTok Shop despite appearing to violate some of the app's policies on health misinformation. The book's sales were boosted by popular videos from wellness influencers on the app, some with millions of views, who claimed inaccurately that the once-obscure 2019 book contained natural cures for cancer and other ailments. The influencers, along with TikTok, made money off the sale of this misleading book.

I brought all this to the attention of TikTok. The videos I flagged to a company spokesperson were removed after a review for violating TikTok's policies banning health misinformation. But the book remained for sale in the Shop, new influencers stepped in, and I haven't stopped seeing TikTok Shop promotions for the book, The Lost Book of Herbal Remedies, since.

"This right here is the reason they're trying to ban this book," said one TikTok Shop seller in a video, pointing to the book's list of herbal cancer treatments. Later, he urged his viewers to click through to the Shop listing and buy right away, because "it probably won't be around forever because of what's inside." The video got more than 2 million views in two days.

Click through the link as instructed and you'll see that sales of the book have doubled since my article came out. The Lost Book of Herbal Remedies has sold more than 125,000 copies through the TikTok Shop's e-commerce platform alone. And the book's popularity doesn't stop there: as of June 5, it is the No. 6 bestselling book on Amazon and has been on Amazon's bestseller list for seven weeks and counting.
The "invisible rulers" of online attention

I was thinking about my experience digging into The Lost Book of Herbal Remedies while reading the forthcoming book [Invisible Rulers](), by Stanford Internet Observatory researcher Renee DiResta. The book examines and contextualizes how bad information and "[bespoke realities]()" became so powerful and prominent online. She charts how the "collision of the rumor mill and the propaganda machine" on social media helped form a trinity of influencer, algorithm, and crowd, working symbiotically to catapult the pseudo-events, Twitter Main Characters, and conspiracy theories that have captured attention and shattered consensus and trust.

DiResta's book is part history, part analysis, and part memoir, spanning from pre-internet examinations of the psychology of rumor and propaganda to the biggest moments of online conspiracy and harassment of the social media era. In the end, DiResta applies what she's learned in a decade of closely researching online disinformation, manipulation, and abuse to her personal experience of being the target of a series of baseless accusations that, despite their lack of evidence, prompted Rep. Jim Jordan, as chair of the House subcommittee on the Weaponization of the Federal Government, to [launch an investigation]().

There's an understandable instinct that, I think, a lot of people have when they read about online misinformation or disinformation: They want to know why it's happening and who is to blame, and they want that answer to be easy. Hence, [meme-ified arguments]() about "Russian bots" causing Trump to win the presidential election in 2016. Or pushes to deplatform one person who went viral by saying something wrong and harmful. Or the belief that we can content-moderate our way out of online harms altogether. DiResta's book explains why these approaches will always fall short.
Blaming the "algorithm" for a dangerous viral trend might feel satisfying, but the algorithm has never worked without human choice. As DiResta writes, "virality is a collective behavior." Algorithms can surface and nudge and entangle, but they need user data to do it effectively.

Parables, panics, and prevention

Writing about individual viral rumors, conspiracy theories, and products can sometimes feel like telling parables: The Lost Book of Herbal Remedies becomes instructive on the ability of anything to become a TikTok Shop bestseller, so long as the influencers pushing the product are good enough at it. Most of these parables in the misinformation space do not have neat or happy endings.

Disinformation reporter Ali Breland, [in his final piece for Mother Jones](), wrote about how QAnon became "everything." To do so, Breland begins with the parable of Wayfair, the cheap furniture seller that became the center of a moral panic about pedophiles. This moment in online panic history, which also features heavily in DiResta's book, happened in the summer of 2020, after many QAnon influencers and activity hubs had been banned from mainstream social media (which, incidentally, I interviewed DiResta about at the time [for a piece questioning]() whether such a move happened too late to have any meaningful effect on QAnon's influence).

Here's what happened: Somebody online noticed that Wayfair was selling expensive cabinets. The cabinets had feminine names. The person drew some mental dots and connected them: Surely, these listings must be coded evidence of a child trafficking ring. The idea caught fire in QAnon spaces and quickly spread [beyond]() the paranoia enclaves. The wild and debunked idea co-opted a real hashtag used to raise awareness about actual human trafficking, which [interfered with real investigations](). Breland, in his Mother Jones piece, tracks how the central tenets of the QAnon conspiracy theory stretched way beyond its believers and stayed there.
Now, "[w]e are in an era of obsessive, odd, and sprawling fear of pedophilia, one where QAnon's paranoid thinking is no longer bound to the political fringes of middle-aged posters and boomers terminally lost in the cyber world," he wrote. The Wayfair moral panic didn't take off simply because of bad algorithms; it was evidence that the attention QAnon had grabbed previously had worked. Ban its hashtags and its influencers, but the crowd remained, and we were, to some degree, in it.

The Lost Book of Herbal Remedies became a bestseller by flowing through some well-worn grooves. The influencers promoting it knew what they could and couldn't say from a moderation standpoint, and when those who broke the rules were removed, new influencers stepped up to earn those commissions. My article, and my efforts to bring the trend to the attention of TikTok, did little to slow demand for this inaccurate book.

So, what would work? DiResta's ideas here echo conversations that have been happening among misinformation experts for some time. There are some things platforms absolutely should be doing from a moderation standpoint, like removing automated trending topics, introducing friction to engaging with some kinds of online content, and generally giving users more control over what they see in their feeds and from their communities. DiResta also notes the importance of education and prebunking, a more preventative way of addressing false information that focuses on the [tactics and tropes]() of online manipulation.

Also, transparency. Would people be more likely to believe that there's no vast conspiracy to censor conservatives on social media if there were a public database of platforms' moderation actions? Would people be less eager to buy a book of questionable natural cures if they knew more about the commissions earned by the influencers promoting it? I don't know. Maybe!
I do know this, though: After a decade of covering online culture and information manipulation, I don't think I've ever seen things as bad as they are now. It's worth trying something, at least.

[A.W. Ohlheiser](), technology writer