Many popular tech platforms use recommender algorithms to capture people's attention. With these algorithms, platforms can promote content, but they can also quietly demote posters, or hide content entirely rather than deleting or blocking it. Because nothing is visibly removed, the poster may not realize for some time that anything has happened. This practice is called shadowbanning.
Most platforms deny, or at least decline to admit, that they practice shadowbanning, but there is substantial evidence that it exists. This form of content moderation operates with little oversight, and that is becoming a problem.
Shadowbanning reduces content's visibility without notifying the poster. The content can often still be accessed directly, but its circulation is restricted: it may be excluded from search results, news feeds, recommendations, or users' content queues, or a comment may be buried beneath many others.
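To make the mechanism concrete, here is a minimal, purely hypothetical sketch of how a feed-ranking layer could implement this kind of silent visibility reduction. None of the names (`Post`, `SHADOWBANNED`, `build_feed`) correspond to any real platform's code or API; this is only an illustration of the pattern described above.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    author: str
    score: float  # engagement score used for ranking

# Hypothetical set of quietly demoted authors; they are never notified.
SHADOWBANNED = {"spam_account"}

def is_visible_to(post: Post, viewer: str) -> bool:
    # The author still sees their own posts, which is what makes the
    # ban a "shadow" ban: nothing looks different from their side.
    return post.author not in SHADOWBANNED or viewer == post.author

def build_feed(posts: list[Post], viewer: str) -> list[Post]:
    # Shadowbanned content is silently dropped from ranking surfaces
    # (feeds, search, recommendations) but is never deleted from storage.
    visible = [p for p in posts if is_visible_to(p, viewer)]
    return sorted(visible, key=lambda p: p.score, reverse=True)

posts = [
    Post(1, "alice", 0.9),
    Post(2, "spam_account", 0.95),
    Post(3, "bob", 0.4),
]
# Other users never see the demoted account's posts in their feed:
print([p.author for p in build_feed(posts, "carol")])
# The shadowbanned author still sees their own post ranked normally:
print([p.author for p in build_feed(posts, "spam_account")])
```

The key property is the asymmetry: the same query returns different results depending on who is asking, so the demoted poster has no direct signal that anything changed.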
Major platforms such as Instagram, Facebook, and Twitter all deny practicing shadowbanning, often relying on a technicality to avoid admitting it. They do, however, tend to acknowledge demoting content and reducing its visibility.
Notably, in 2018, Facebook and Instagram (now owned by Meta) admitted that they had used algorithms to reduce user engagement with "borderline" content, which Mark Zuckerberg described as "sensationalist and provocative content."
In one survey of 1,006 people who post on social media, 9.2% reported that they believed they had been shadowbanned. By platform, 8.1% reported being shadowbanned on Facebook, 4.1% on Twitter, 3.8% on Instagram, 3.2% on TikTok, 1.3% on Discord, and 1% on Tumblr. Fewer than 1% reported it on YouTube, Reddit, Pinterest, Twitch, Nextdoor, and LinkedIn.
Platforms tend to shadowban as a way to moderate inappropriate content, but many critics argue that these companies mishandle content moderation. The practice is appealing to companies precisely because it lets them avoid accountability: content loses its reach without any explicit, contestable decision being announced.