In a letter to Facebook CEO Mark Zuckerberg, the group listed their main reasons for concern. They include research showing that social media can harm the physical, emotional and mental well-being of children; the cyberbullying already present on Instagram, which was experienced by 42% of young Instagram users; and the use of Instagram by predators to target children. In 2020 alone, Facebook and Instagram reported 20 million images of child sexual abuse.
Most importantly, children lack the maturity to navigate these complexities. They are too inexperienced to understand that some of the people they meet online have bad intentions.
Of course, many children use the Internet to communicate with friends before they can even spell correctly. Their online activity only increased during the pandemic, when they couldn’t meet friends in person.
But image sharing platforms are different from texting platforms. They have more potential to affect children’s self-esteem and mental health.
And Facebook doesn’t seem particularly trustworthy. “Facebook’s priority is not to protect children; it’s a for-profit company that seeks to monetize the time spent,” Titania Jordan, chief parent officer at the online-monitoring firm Bark, told the Washington Post.
Facebook also faces competition from platforms like Google-owned YouTube, which already offers a version for children under 13. It won’t want to simply leave that money on the table.