About a month ago, I wrote about a viral book of “Lost” herbal remedies that had, at the time, sold 60,000 copies on the TikTok Shop despite appearing to violate some of the app’s policies on health misinformation. The book’s sales were boosted by popular videos from wellness influencers, some with millions of views, who claimed inaccurately that the once obscure 2019 book contained natural cures for cancer and other ailments.
The influencers, along with TikTok, made money off the sale of this misleading book. I brought all this to the attention of TikTok. The videos I flagged to a company spokesperson were removed after a review for violating TikTok’s policies banning health misinformation.
But the book remained for sale in the shop, new influencers stepped in, and I haven’t stopped seeing TikTok Shop promotions for this book, The Lost Book of Herbal Remedies, since.
“This right here is the reason they’re trying to ban this book,” said one TikTok Shop seller in a video, as he pointed to the book’s list of herbal cancer treatments. Later, he urged his viewers to click through on a link to the Shop listing and buy right away because “it probably won’t be around forever because of what’s inside.”
The video got more than 2 million views in two days. Click through the link as instructed and you’ll see that sales for the book have doubled since my article came out. The Lost Book of Herbal Remedies has sold more than 125,000 copies through the TikTok Shop alone. The book’s popularity doesn’t stop there, though: as of June 5, it is the No. 6 bestselling book on Amazon and has been on Amazon’s bestseller list for seven weeks and counting.
The “Invisible Rulers” of online attention
I was thinking about my experience digging into The Lost Book of Herbal Remedies while reading the forthcoming book Invisible Rulers, by Stanford Internet Observatory researcher Renee DiResta. The book examines and contextualizes how bad information and “bespoke realities” became so powerful and prominent online. She charts how the “collision of the rumor mill and the propaganda machine” on social media helped form a trinity of influencer, algorithm, and crowd that works symbiotically to catapult pseudo-events, Twitter Main Characters, and conspiracy theories that have captured attention and shattered consensus and trust.
DiResta’s book is part history, part analysis, and part memoir, as it spans from pre-internet examinations of the psychology of rumor and propaganda to the biggest moments of online conspiracy and harassment from the social media era. In the end, DiResta applies what she’s learned in a decade of closely researching online disinformation, manipulation, and abuse to her personal experience of being the target of a series of baseless accusations that, despite their lack of evidence, prompted Rep. Jim Jordan, as chair of the House subcommittee on the Weaponization of the Federal Government, to launch an investigation.
There’s a really understandable instinct that, I think, a lot of people have when they read about online misinformation or disinformation: They want to know why it’s happening and who is to blame, and they want that answer to be easy. Hence, meme-ified arguments about “Russian bots” causing Trump to win the presidential election in 2016. Or, perhaps, pushes to deplatform one person who went viral by saying something wrong and harmful. Or the belief that we can content-moderate our way out of online harms altogether.
DiResta’s book explains why these approaches will always fall short. Blaming the “algorithm” for a dangerous viral trend might feel satisfying, but the algorithm has never worked without human choice. As DiResta writes, “virality is a collective behavior.” Algorithms can surface and nudge and entangle, but they need user data to do it effectively.
Parables, panics, and prevention
Writing about individual viral rumors, conspiracy theories, and products can sometimes feel like telling parables: The Lost Book of Herbal Remedies becomes instructive on the ability of anything to become a TikTok Shop bestseller, so long as the influencers pushing the product are good enough at it.
Most of these parables in the misinformation space do not have neat or happy endings. Disinformation reporter Ali Breland, in his final piece for Mother Jones, wrote about how QAnon became “everything.” Breland begins with the parable of Wayfair, the cheap furniture seller that became the center of a moral panic about pedophiles.
This moment in online panic history, which also features heavily in DiResta’s book, happened in the summer of 2020, after many QAnon influencers and activity hubs had been banned from mainstream social media (which, incidentally, I interviewed DiResta about at the time for a piece questioning whether such a move happened too late to have any meaningful effect on QAnon’s influence).
Here’s what happened: Somebody online noticed that Wayfair was selling expensive cabinets. The cabinets had feminine names. The person drew some mental dots and connected them: surely, these listings must be coded evidence of a child trafficking ring. The idea caught fire in QAnon spaces and quickly spread beyond the paranoia enclaves. The wild and debunked idea co-opted a real hashtag used to raise awareness about actual human trafficking, which interfered with real investigations.
Breland, in his Mother Jones piece, tracks how the central tenets of the QAnon conspiracy theory stretched way beyond its believers and stayed there. Now, “[W]e are in an era of obsessive, odd, and sprawling fear of pedophilia—one where QAnon’s paranoid thinking is no longer bound to the political fringes of middle-aged posters and boomers terminally lost in the cyber world,” he wrote.
The Wayfair moral panic didn’t become a trend simply because of bad algorithms; it was evidence that the attention QAnon had previously grabbed had worked. Its hashtags and influencers could be banned, but the crowd remained, and we were, to some degree, in it.
The Lost Book of Herbal Remedies became a bestseller by flowing through some well-worn grooves. The influencers promoting it knew what they could and couldn’t say from a moderation standpoint, and when those who broke the rules were removed, new influencers stepped up to earn those commissions. My article, and my efforts to bring this trend to the attention of TikTok, didn’t really do anything to slow the demand for this inaccurate book. So, what would work?
DiResta’s ideas for this echo conversations that have been happening among misinformation experts for some time. There are some things platforms absolutely should be doing from a moderation standpoint, like removing automated trending topics, introducing friction to engaging with some online content, and generally giving users more control over what they see in their feeds and from their communities. DiResta also notes the importance of education and of prebunking, a preventative approach to addressing false information that focuses on the tactics and tropes of online manipulation. And, finally, transparency.
Would people be more likely to believe that there’s not a vast conspiracy to censor conservatives on social media if there were a public database of moderation actions from platforms? Would people be less eager to buy a book of questionable natural cures if they knew more about the commissions earned by the influencers promoting it? I don’t know. Maybe!
I do know this, though: After a decade of covering online culture and information manipulation, I don’t think I’ve ever seen things as bad as they are now. It’s worth at least trying something.