TheVoiceOfJoyce Of all the Social Media Platforms, Facebook has the worst record of content moderation.

Unfortunately, unless Section 230 of the Communications Decency Act is revised, no Social Media Platform is legally responsible for content on its site. In fact, the Social Media business model is based on limited content moderation and the sale of our data and predicted behavior to third parties.

Unless we want to abrogate all control over our data, we must advocate for a change in the Regulation. The Legislation was written in the 1990s, before the advent of Facebook.

Why are social media platforms dangerous? They allow election disinformation, violence, and abuse, threatening the welfare of our kids and our Democracy.

Facebook says it does not allow content that threatens serious violence. But when researchers submitted ads threatening to “lynch,” “murder” and “execute” election workers around Election Day this year, the company’s largely automated moderation systems approved many of them.

Of the 20 ads containing violent content submitted by the researchers, 15 were approved by Facebook, according to a new test published by Global Witness, a watchdog group, and New York University’s Cybersecurity for Democracy. The researchers deleted the approved ads before they were published.

Ten of the test ads were submitted in Spanish. Facebook approved six of those ads, compared with nine of the 10 ads in English.

TikTok and YouTube rejected all the ads and suspended the accounts that attempted to submit them, the researchers said.
