Black Listed News

Be Careful What You Wish For: TikTok Tries To Stop Bullying On Its Platforms... By Suppressing Those It Thought Might Get Bullied

Published: December 12, 2019


Be careful what you wish for when you demand that internet platforms police the internet for any and all bad stuff. There was a lot of fuss and cringing when this story broke that part of TikTok's content moderation strategies included suppressing videos by disabled, queer, and fat creators.

Leaked documents reveal how TikTok hid videos of people with disabilities. Queer and fat users were also pushed out of view.

No matter how you look at it, this looks bad. And for good reasons. But, as the company itself claims, it had good intentions behind this, even if the execution was atrocious. There have been tons of reports of bullying on the platform -- and, as with so many social problems that are becoming more widely visible thanks to technology, the first reaction of many is to blame the tech platforms and demand that they "fix it."

And, à la the infamous paperclip maximizer thought experiment, what's the most efficient way to stop bullying? Some figured it might be to hide the likely-to-be-bullied rather than the actual bullies:

The relevant section in the moderation rules is called "Imagery depicting a subject highly vulnerable to cyberbullying". In the explanations it says that this covers users who are "susceptible to harassment or cyberbullying based on their physical or mental condition".

According to the memo, mobbing has negative consequences for those affected. Therefore, videos of such users should always be considered as a risk and their reach on the platform should be limited.

TikTok uses its moderation toolbox to limit the visibility of such users. Moderators were instructed to mark people with disabilities as "Risk 4". This means that a video is only visible in the country where it was uploaded.
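To make the mechanism described in the leaked rules concrete, here is a minimal illustrative sketch of how a risk-tier rule like "Risk 4 means the video is only visible in its upload country" could be modeled. This is not TikTok's actual code; the names (RiskLevel, Video, is_visible_to) and the risk levels other than "Risk 4" are hypothetical assumptions for illustration only.

```python
from dataclasses import dataclass
from enum import IntEnum

# Hypothetical risk tiers; the leaked memo only describes the "Risk 4"
# behavior, so everything else here is an illustrative assumption.
class RiskLevel(IntEnum):
    NONE = 0
    RISK_4 = 4  # per the memo: video visible only in its upload country

@dataclass
class Video:
    creator_id: str
    upload_country: str
    risk_level: RiskLevel = RiskLevel.NONE

def is_visible_to(video: Video, viewer_country: str) -> bool:
    """Apply the geographic reach limit described in the leaked rules."""
    if video.risk_level >= RiskLevel.RISK_4:
        # "Risk 4" videos never leave the country they were uploaded in.
        return viewer_country == video.upload_country
    return True

# Example: a video flagged "Risk 4" in Germany is hidden from a US viewer.
clip = Video("creator123", "DE", RiskLevel.RISK_4)
assert is_visible_to(clip, "DE") is True
assert is_visible_to(clip, "US") is False
```

The point of the sketch is how blunt such a rule is: visibility is cut based on who the creator is judged to be, not on anything the bullies do.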

And, yes, there is a very reasonable argument that the content moderation team at TikTok/ByteDance should have recognized that this is a horrible way to deal with bullying. But you can see how those desperate to deal with "the bullying problem" might end up thinking that this is the simplest path to get people to stop screaming at them about bullying.

This is a key point that we keep trying to raise in the mad dash currently happening to put responsibility on platforms to "clean up" whatever mess politicians and the media see. There's this weird belief that the platforms can wave a magic wand and make bad stuff go away -- when the "easier" solution (if a morally questionable one) is to just figure out a way to hide the real problems or sweep them under the rug.

This is why I keep trying to argue that when societal problems manifest themselves on social media, expecting tech platforms to magically solve those problems is not just going to fail, it's going to fail in spectacular and awful ways. TikTok's "hide the people we think might get bullied" approach is just one example of sweeping a societal problem under the rug to avoid being blamed for it.

Unfortunately, I fear most people will just blame TikTok for it instead.
