TheVoiceOfJoyce Is an American company capable of making decisions about our children’s online safety? The British coroner and the new regulatory arm of the British Government have found against Instagram in the suicide of 14-year-old Molly Russell. The coroner decided that Meta was responsible for showing harmful content to a young teenager in the six months prior to her death. In America, there is no law regulating what content is appropriate for teenage viewing. Australia has determined that content provided on online platforms must be regulated as well. Not only is this content harmful to kids; it has separately been found that data about our kids is shared with third parties, violating their privacy. In America, changing Section 230 of the Communications Decency Act would make the internet safer for our kids and strengthen our democracy. Regulation can act as a megaphone for truth and science, not a megaphone for disinformation, misinformation and violence. Think: do you want a corporation deciding what you see? Shouldn’t corporations be accountable for their content? Wouldn’t you want control over who sees your data? Wouldn’t you want the ability to opt out of forced agreements? If you answered yes, then the social media platforms must be researched and their algorithms made transparent, too!

www.theguardian.com/technology/2022/oct/05/techscape-molly-russell-inquest

What content is safe for a child to view online? Who gets to decide what is OK and what is dangerous? And how much of a role should a government, a company or a parent play in that decision?

These questions were brought into focus by the inquest into the death of 14-year-old Molly Russell. And if there was one point during the two-week hearing when the case for tougher online regulation became overwhelming, it was during Meta executive Elizabeth Lagone’s testimony.

The head of health and wellbeing policy at Mark Zuckerberg’s company was taken through a selection of the Instagram posts the teenager had viewed in the six months before her death – and deemed many of them “safe” for children to view. It was not an opinion shared by many in the room at North London coroner’s court.

Molly, from north-west London, died in 2017 after viewing extensive amounts of online content related to suicide, depression, self-harm and anxiety. In what the NSPCC described as a global first, the senior coroner said social media had contributed to Molly’s death, ruling that Molly had died from “an act of self-harm while suffering from depression and the negative effects of online content”.
