TheVoiceOfJoyce Yes, TikTok deserves scrutiny and regulation for misinformation and disinformation. Yes, it's tied to China, and there is censorship of videos on "banned subjects" like Ukraine or the Uyghurs. But it's the algorithm that promotes content. The For You Page uses an algorithm that aggregates all of a user's personal data and pushes videos toward that individual, intensifying their experience and, in some cases, causing emotional harm and violence.

Don't single out TikTok: all the Social Media Platforms are aggregating our data, predicting our behavior and selling our behavioral patterns to third parties. We have no privacy. We may not want a corporation to decide what we see and what we choose. Nor do we want our data going to human resources departments, insurance companies or credit agencies.

How will the new law going through Congress curtail the use of our data and behavior? How will it change Section 230 of the Communications Decency Act, allowing us to sue the platforms if we're harmed? Who will regulate the algorithms to stop the hatred, the incitement to violence, the self-harm and the mental confusion they cause? What law will stop the harmful amplification of violent and abusive content? Either the algorithms must be transparent enough to be regulated to a standard, or there must be enough sampling of data to enforce changes in them.

Society is not safe with an unregulated technology system that is not understood. How many people realize all their news is curated for them, all the time? These silos of data create a wedge in our society, polarizing our thoughts through algorithm design. The Social Media Platforms are not neutral, and they're unequal. Voices are not heard or amplified in a fair and just manner. The truth is not seen by the many. Yet big money supporting and promoting violence, insurrection, anti-vaxxers and climate deniers is powered by money and an army of followers, making millions for themselves and the Platforms.
Who will create a more neutral system?


“The China question to me is almost a red herring because there’s so little being done to protect user privacy generally in the US,” said Sara Collins, a senior policy counsel at non-profit public interest group Public Knowledge. “The thing I would be concerned about is the same stuff that we’re concerned about with Facebook or with Google. It’s their data privacy practices, what they’re doing with that data, how they’re monetizing it and what adverse effects are there on users.”

One measure that could start addressing those concerns is a federal privacy bill that is making its way through Congress. The American Data Privacy and Protection Act (ADPPA) would “actually create a privacy framework for all these companies that would affect TikTok and its business model,” said Collins, whose employer Public Knowledge works on content moderation and regulation issues. (Public Knowledge has accepted donations from TikTok.)

In the meantime, states are taking matters into their own hands. California passed a landmark child online safety bill that would require platforms such as TikTok and Instagram to vet any products that are geared toward children before rolling them out, and to implement privacy protections for younger users by default.

Marc Faddoul, the co-director of Tracking Exposed, an organization that keeps tabs on how TikTok’s algorithm works, thinks congressional leaders’ focus on the platform’s China connections misses the mark on pushing for more answers about the app’s algorithm.
