“Child Grooming and the Metaverse – Issues and Solutions”

I’ll admit that I haven’t joined in on the Metaverse experience yet. Someday maybe I’ll get there. Because it’s an online platform, I know there are going to be risks for everyone using it, including “certain populations of youth (who) are disproportionately susceptible to online grooming, such as those who suffer from emotional distress or mental health problems, low self-esteem, poor parental relationships and weak family cohesion.”

So what’s being done to protect kids? Several measures are being put in place, such as simple ways to report child sexual abuse material (CSAM), grooming and sexual ageplay. Artificial Intelligence (AI) strategies are being developed, but these will also need to rely on “manual content moderation by Trust and Safety team members who work for each respective platform.” Terms of Service and Community Guidelines will be established and will have to be enforced by each platform. “In-app or in-game tools to protect users like Meta’s Personal Boundary or Microsoft’s Space Bubble” will also be implemented to help combat online abuse.

One thing that stood out to me while reading this article from our friends at the Cyberbullying Research Center is that parents of children interacting in the Metaverse are going to have to be on top of their game to make sure their children are using all of the safety features and reporting functions. Please take a minute to read this article so you can feel educated about what is next for your kids, clients or students.