There are times when companies try to address problems without fully examining the broader implications of their moves. TikTok has joined that list.

A report from Netzpolitik surfaced this week revealing that TikTok hid videos from people with disabilities, those with "facial disfigurement," overweight users, and others from feeds. Netzpolitik obtained screenshots of TikTok's rules and the designations under which those users fell, which moderators used to decide whether videos could go anywhere and everywhere on the social network or would be limited in scope.

One particularly troubling category, called Auto R, placed what TikTok called "special users," including those with specific disabilities, facial disfigurement, and "fat and self-confident users," into one group. Moderators would then stop videos from Auto R users from entering the "For You" feed that allows many videos on TikTok to go viral.

TikTok was clear in its response to Netzpolitik's investigation and inquiries that it was trying to act in its users' best interests. ByteDance, TikTok's parent company, said the moves were aimed at limiting bullying on the platform and stopping certain users from being negatively targeted.

"Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy," ByteDance told Netzpolitik. "While the intention was good, the approach was wrong and we have long since changed the earlier policy in favor of more nuanced anti-bullying policies and in-app protections."

In a bid to do right by some of its users, TikTok actually hurt them. The service forced moderators to judge whether someone had a disability or might be subject to bullying, and then to decide whether that person's videos could be seen by all other users or would be limited in reach. It also meant that people who were using TikTok to express themselves and share their lives with the public couldn't do so.

I can appreciate that TikTok needs to do something to address bullying. It's undoubtedly a problem on any service, TikTok included. But in its bid to handle a problem endemic to social media, TikTok failed to consider what such censorship would mean to the very people it was trying to protect.

If that sounds familiar, it's because Twitter recently dealt with a similar problem after the company announced plans to remove unused accounts and let others claim previously taken usernames. Twitter ultimately backed off the policy after people said they wanted to keep the accounts of deceased loved ones to preserve their memories.

"We've heard you on the impact that this would have on the accounts of the deceased," Twitter said in a tweet. "This was a miss on our part. We will not be removing any inactive accounts until we create a new way for people to memorialize accounts."

For its part, TikTok said that the original policy was an early step and that it has since moved on to other policies to limit bullying. The company didn't elaborate, however, so it's difficult to say how those changes have affected its disabled users.

If nothing else, it's another example of how tech companies need to think more deeply, take a detailed look at the issues, and find real solutions that empower people who may be bullied for their disabilities rather than limiting their reach.

Published on: Dec 3, 2019