A charity has warned some metaverse apps are ‘dangerous by design’
Children are able to enter virtual strip clubs and witness simulated sex as part of Facebook’s metaverse.
Youngsters were able to “get naked and do unspeakable things”, join “erotic role-play” apps and mix freely with adults.
A BBC report found that grooming, sexual material, racist insults and rape threats were prevalent in one app.
As part of the report, a researcher posing as a 13-year-old was able to go into rooms where other avatars were pretending to have sex.
There were rooms full of sex toys, and other users directed rape threats at her avatar.
The NSPCC has said it was “shocked and angry” at the findings.
On an app with a minimum age rating of 13, the researcher was approached by a number of adult men and was shown condoms and sex toys.
The app, called VRChat, is an online virtual platform which users can explore with 3D avatars. It is not developed by Facebook but can be downloaded from an app store on Facebook’s Meta Quest headset, with no age verification checks.
Following the report, the Children’s Commissioner for England, Dame Rachel de Souza, said she was “really horrified” by the findings and called out Meta for not making these online spaces safe enough for children.
She said: “I’m really concerned that Meta hasn’t made their metaverse safe by design – we have an Age Appropriate Design Code – and I expected better of them.
“Are you telling me that Mark Zuckerberg, with all his fantastic engineers and ability to create this, can’t keep children safe?
“That’s my challenge to the social media companies and they should be stepping up now.”
Andy Burrows, head of online child safety policy at the NSPCC, said there was “a toxic combination of risks”.
He said: “This is a product that is dangerous by design, because of oversight and neglect.
“We are seeing products rolled out without any suggestion that safety has been considered.”
Meta’s product manager for VR integrity, Bill Stillwell, said in a statement: “We want everyone using our products to have a good experience and easily find the tools that can help in situations like these, so we can investigate and take action.”
He added: “For cross-platform apps…we provide tools that allow players to report and block users.
“We will continue to make improvements as we learn more about how people interact in these spaces.”