In a Blow to the Disability Community, TikTok Suppresses Our Voices

Sara Hagen
3 min read · Jun 16, 2022


Think about “high-risk” behaviors. What comes to mind? I think about substance abuse, physical violence, and self-harm. No matter what side of the censorship debate you are on, all of us can agree that being disabled isn’t a “high-risk” behavior. It isn’t a behavior at all. It is a state of being, an identity.

In a move ostensibly meant to protect its users from bullying, TikTok suppressed disabled and neurodivergent voices, deeming them too “high risk”. Videos from creators with visible disfigurements, autism, or Down syndrome were flagged, and human moderators were given roughly 30 seconds (half the length of the then-maximum 60-second video) to decide whether the creator matched the narrow definitions provided to them.

In reality, this so-called protective decision is itself discriminatory. Disabled voices already face ableism and erasure daily, without the intervention of TikTok overlords.

I joined the app in mid-2020 and fell in love with it. My favorite part is discovering creators who share experiences about living with a disability, fatness, mental illness, etc. Where representation of these communities is nearly absent in television and film, TikTok has the most intersectional user base I’ve seen online. The problem is that once we are there, our reach is far more limited than that of the users who have risen to celebrity status by posting dance videos.

Over the past few months, I have seen more and more disabled creators on TikTok rightfully complaining about being “shadowbanned” (having their new content hidden from their followers and potential followers).

While sites like Facebook and Twitter publish content moderation guidelines, TikTok takes a different approach: it censors nearly everything and sometimes reverses course only after a public outcry. Its policy is to censor first and ask questions later.

Are there any laws to protect TikTok? Or hold it accountable?

Enter Section 230 of the Communications Decency Act. Signed into law in 1996, it states that websites aren’t liable for the content their users post, and it protects platforms’ freedom to moderate that content without legal liability.

For the most part, I support this. Section 230 has been instrumental in protecting platforms’ right to moderate their content as they see fit. It’s not a perfect law, though, and we need to be active in evolving the legislation.

A bipartisan bill called the Platform Accountability and Consumer Transparency (PACT) Act was introduced in Congress in June 2020. This piece of legislation could be the answer to TikTok’s censorship problem. Senator Brian Schatz (D-HI), one of the bill’s co-sponsors, described PACT’s approach as “…a scalpel, rather than a jackhammer”. The contrast is presumably with President Joe Biden’s proposal to repeal Section 230 entirely.

PACT aims to attach conditions to social media platforms’ liability protections, with transparency at the center. It would require companies to show users what content has been shadowbanned, demonetized, or removed, and why. It would also give users an option to appeal moderation decisions.

When it comes to TikTok, we are in uncharted waters. It is the fastest-growing social media platform in the United States, and it is owned by ByteDance, a Chinese company. The censorship built into the foundations of the app under the Chinese government’s rules is perhaps still a part of the algorithm and, in consequence, the moderation. TikTok has remained quiet on recent accusations of over-censoring content.

The internet is evolving faster than legislation can keep up. We all need to do our part: contact our representatives about the ethics of censorship and, if we support it, about moving PACT toward becoming law.

Sara Hagen

Writer advocating for marginalized people.