In recent weeks, social media giant Meta has been facing intense scrutiny over its plans to allow minors into its virtual world, the metaverse. Critics are raising concerns about the potential risks and harms to children in such a space, and are calling for Meta to halt its plans until more safeguards are put in place.
The term “metaverse” describes a fully immersive virtual world, where users can interact with each other and with digital objects in a way that feels like real life. Meta has been positioning itself as a leader in this space, with plans to build a metaverse that could eventually be used for everything from shopping to education.
However, Meta’s plans have been met with backlash from child safety advocates and lawmakers, who argue that allowing minors into the metaverse could pose serious risks. These concerns range from exposure to adult content and predators, to the potential for addiction and other negative effects on mental health.
One of the most vocal critics of Meta’s plans has been the nonprofit organization Common Sense Media, which is dedicated to promoting safe technology use for children. In a recent statement, the organization urged Meta to put the safety of children first, and called on regulators to step in if the company doesn’t take sufficient action.
“Meta’s plans to allow minors into the metaverse without adequate safeguards are deeply concerning,” said Jim Steyer, founder and CEO of Common Sense Media. “We know from experience that children are particularly vulnerable to the harms that can come from online spaces, and we can’t afford to take chances with their safety.”
Steyer’s comments echo the concerns of many child safety advocates, who say that Meta’s plans don’t take into account the unique risks that minors face in online environments. They argue that the company needs to put in place robust protections, such as age verification, parental controls, and moderation of user-generated content, before allowing children into the metaverse.
In response to the criticism, Meta has said that it is committed to creating a safe and secure metaverse for all users, including minors. The company has pointed to its existing safety measures, such as community standards and content moderation, as evidence of its commitment to user safety.
“We take the safety and security of our users very seriously, and we will be implementing a range of safeguards to protect minors in the metaverse,” a spokesperson for Meta said in a statement. “We will continue to work with experts and stakeholders to ensure that our metaverse is a positive and safe environment for everyone.”
However, critics say that Meta’s assurances aren’t enough, and that the company needs to do more to address the unique risks posed by the metaverse. They are calling on regulators to step in and impose stricter rules and standards on virtual worlds, in order to protect children and other vulnerable users.
Some lawmakers have already begun to take action. In the United States, a group of senators recently sent a letter to Meta CEO Mark Zuckerberg, urging him to reconsider the company’s plans to allow minors into the metaverse. The letter cited concerns about the potential for child exploitation and other harms, and called on Meta to put in place “robust and comprehensive” safeguards.
The debate over the safety of the metaverse is likely to continue in the coming months and years, as more companies enter the space and regulators grapple with how to oversee it. For now, though, it’s clear that Meta’s plans are facing significant pushback from those who believe the company needs to do more to protect the safety and well-being of children in online environments.