Did The US TikTok Algorithm Just Change And Did It Add Censorship?

Edited on 1/21/25 to update based on the latest behavior and discourse: “smallest man who ever lived” is now autopopulating correctly in search (as are other man-related phrases). “Banned books” is still not autopopulating (but you can still search for it) and you still can’t search for phrases like “rigged election” (you get a message that it violates the community guidelines).

Content creators are also noticing other changes. @reannedupris said that the audio of her speaking in a post was edited by TikTok (at 2:28 in the video, the word “inauguration” is muted and then for the rest of the video, the audio is out of sync). @helenofnine said that one of her posts was removed for a community guidelines violation because of the word “autistic”. (She is autistic and creates content about autism.) A lot of people have noticed that some posts (about certain kinds of topics) can’t be shared to multiple people at once (try and you’ll get the message “sharing is limited to one chat at a time. This is to help limit the spread of potentially harmful content”) or display the message “Consider the accuracy of what you’re sharing“.

TikTok has always been a little aggressive with community guidelines violations, but a lot of this seems new to people. @dinabny speculated that maybe part of what happened during the downtime was the implementation of AI moderation tools. That would explain some of the suppressed search autopopulation, which is in some cases being adjusted (“smallest man”) and in other cases not (“banned books”).

So is this just a regular update with some overzealous moderation algorithm? Intentional censoring based on direction from the US government? It’s easy to cherry-pick individual happenstance events and turn them into a structured narrative (for instance, thinking something nefarious was happening with “the smallest man who ever lived” when actually any phrase with the word “man” was affected), so we’ll have to wait for more data to better understand what’s happening.

And now to the original post.

I know. No posts since 2017 and I’m back to talk algorithms again.

TikTok was restored Sunday after going dark Saturday night, and joy turned to suspicion when things didn’t seem to be the way everyone left them.

The most obvious differences people noticed were that their FYPs were suddenly filled with people they had never seen before and some searches and hashtags didn’t autopopulate as expected and instead seemed to transform into entirely different words.

Did TikTok actually shut down to add censorship (or to change ownership)? Or does the shutdown mean that the algorithm has to learn everything all over again?

I’m not a TikTok engineer but I did live through the Googlebombing era as a Google spokesperson and have watched Google’s handling of autocomplete evolve over the years to try to algorithmically prevent populating suggested searches with hate or violence-inciting speech. (Googlebombing is an old-timey activity in which lots of people would band together to do a high volume of searches for a person’s name + something unpleasant so that Google would start suggesting the name + the unpleasant thing together.)
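If you want a feel for the mechanics, here’s a toy sketch (in Python, with made-up queries) of a suggester that ranks completions purely by search volume. Nothing here is Google’s or TikTok’s actual code; it just shows why a coordinated burst of searches can drag an ugly phrase to the top of the suggestions.

```python
from collections import Counter

# Toy, hypothetical autocomplete: suggestions are ranked purely by how often
# a query has been searched. All queries below are invented.
query_log = Counter()

def record_search(query: str) -> None:
    query_log[query.lower()] += 1

def suggest(prefix: str, limit: int = 3) -> list[str]:
    prefix = prefix.lower()
    matches = [(q, n) for q, n in query_log.items() if q.startswith(prefix)]
    return [q for q, _ in sorted(matches, key=lambda x: -x[1])[:limit]]

# A coordinated burst of searches skews what gets suggested for the prefix.
for _ in range(500):
    record_search("some politician is a miserable failure")
for _ in range(40):
    record_search("some politician biography")

print(suggest("some politician"))
# ['some politician is a miserable failure', 'some politician biography']
```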

Even before the sus FYPs, I suspected that TikTok’s brief shutdown was mostly to do a server migration: that TikTok was waiting for the Supreme Court ruling and, once that was in, started to prepare for the possible sale. In order to sell just the US part of their code (and even to allow the deep dives needed for the due diligence process related to a possible sale), they’d need to further separate the US version of TikTok from the infrastructure the rest of the world uses.

They’ve previously said they won’t sell the TikTok algorithm and that could still be true (since that’s core intellectual property that they use for non-US users as well). They may have developed a separate algorithm to bundle with a US sale.

If all they did was shut things down and then turn them on again, it doesn’t really make sense that the algorithm would have to re-learn autocomplete and everyone’s FYPs, even if they migrated the data to different servers. But it would make sense if the algorithm is different.

As far as censorship or suggesting different searches, that could be due to some adjustments made that are similar to what Google has done to prevent Googlebombing and hate speech from autopopulating.

Take for instance one of the most popular suspect searches on SwiftTok, “the smallest man who ever lived”. That’s a Taylor Swift song that happens to be the trending audio today on SwiftTok. But it no longer autopopulates in search. When you try, TikTok suggests either “smallest woman” or “smallest men”. At first I thought the “smallest woman” suggestion could be due to TikTok’s version of Googlebombing. But once I dug in a little deeper, it seems instead that TikTok is not allowing any search of the form “any word” + “man” to autopopulate. It will allow “man” as the first word. But if you include any descriptor before “man”, it replaces it with “men” or suggests a different search entirely.

If I try to search for “fastest man”, that gets replaced with “fastest guy” or “fastest human”. If I search for “blue man”, the suggestions are “blue men” and “blue guy”.

[Screenshot: TikTok search suggestions replacing “the smallest man who ever lived”]

If I search for “the man” (also a Taylor Swift song), I get some suggestions that start with “man”, which also happens when I search for “the man in” or “the man in the mirror”. But when I search for “man” (as the first word), I see searches that populate with “man”.

[Screenshot: TikTok search suggestions for queries containing “man”]

So it seems like something is generally wonky with how TikTok is handling searches that include the word “man”. I’m not sure if this is on purpose to try to limit hate speech in some way and it’s way too broad or some synonym matching gone wrong or something else.
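To make the “synonym matching gone wrong” theory concrete: a single overly broad rewrite rule would be enough to produce every example above. This is purely a guess at the shape of the bug (a made-up Python sketch), not anything from TikTok’s code.

```python
import re

# Hypothetical rewrite rule: swap "man" for "men" whenever it follows another
# word, before suggestions are generated. A guess at the kind of rule or bug
# involved, not TikTok's actual implementation.
def rewrite_query(query: str) -> str:
    return re.sub(r"(?i)(?<=\w )man\b", "men", query)

for q in ["the smallest man who ever lived", "fastest man", "blue man", "man in the mirror"]:
    print(q, "->", rewrite_query(q))
# the smallest man who ever lived -> the smallest men who ever lived
# fastest man -> fastest men
# blue man -> blue men
# man in the mirror -> man in the mirror   (a leading "man" is untouched)
```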

It could be that something similar (rogue synonym matching) is happening with another search that’s a hot topic of discussion: “banned books”. Although that behavior is a little different, so it’s possible that the goal here was to temper the algorithmic signals around the word “banned” since that’s all anyone was talking about (posting/commenting/liking) on Saturday related to TikTok being banned. The idea may have been to ensure everyone’s FYPs weren’t full of content from Saturday. (It could be that I’m being too generous and something more nefarious is going on.)

In any case, it’s true that when you search for “banned books”, TikTok’s search suggestions replace the word “banned” with “banner”. However, searches for “book ban” are autopopulating, so the replacement seems to only be happening with “banned” and not “ban”.
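If that’s what’s going on, the “banned” vs. “ban” difference would fit a rule that matches whole words rather than stems. Again, a made-up sketch of the kind of rule I mean, not TikTok’s implementation:

```python
# Hypothetical whole-token replacement: a rule keyed on "banned" rewrites
# "banned books" but leaves "book ban" alone, consistent with what I'm seeing.
REPLACEMENTS = {"banned": "banner"}  # invented rule table

def filter_suggestion(query: str) -> str:
    tokens = query.lower().split()
    return " ".join(REPLACEMENTS.get(t, t) for t in tokens)

print(filter_suggestion("banned books"))  # banner books
print(filter_suggestion("book ban"))      # book ban (unchanged)
```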

Note that for these searches (and the “smallest man who ever lived” searches), you can manually type in the search and get relevant results. That’s different from a third example under discussion, “rigged election”. For that search, a message displays that “this phrase may be associated with behavior or content that violates our guidelines”. I don’t know if that is new or was happening before the shutdown.

Maybe these changes are requests from the government, maybe they’re pieces of code in this potentially different algorithm, or maybe they’re something else. I think we’re right to be suspicious. (Honestly, TikTok has always had a bit of censorship built into it.) But sometimes the answer is wonky code.
