What ‘critical’ policies are needed to regulate the metaverse?
The digital world is now considered a “very real part of users’ lives,” Deloitte found.
As the divide between digital and real places blurs, companies and regulators face the tall order of protecting users and ensuring their safety, particularly the younger generation.
In Deloitte's 2023 Digital Media Trends Survey, digital “places” emerged as a “very real part of users’ lives.”
Deloitte also found that nearly half (48%) of the younger generations, comprising Gen Zs and millennials, spend more time interacting with others on social media than in the physical world.
“Regulators are still considering the challenges of the Web2 era: endless content, data collection, privacy, and increasingly complex digital economies,” a Deloitte Insights report read.
“But are current regulations prepared to address the challenges and harness the opportunities presented by hyperscale metaverse experiences?”
To protect users with regard to content, conduct, and safety, Deloitte recommended two critical considerations. The first is ensuring that providers enable protections by default through user controls and policies for content and conduct.
This may be done by restricting unsafe search results for younger users, blocking younger users from appearing in searches, and disabling direct messaging from unknown accounts.
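A minimal sketch of what such "safe by default" settings could look like in code, assuming a hypothetical platform API; the names (SafetySettings, default_settings) are illustrative and do not reflect any real platform's interface:

```python
# Hypothetical sketch of "safe by default" account settings for younger users.
# All names here are illustrative, not any real platform's API.
from dataclasses import dataclass

@dataclass
class SafetySettings:
    restrict_unsafe_search_results: bool = True   # filter mature content from search
    discoverable_in_search: bool = False          # hide the account from other users' searches
    allow_dms_from_unknown_accounts: bool = False # only existing contacts can message

def default_settings(age: int) -> SafetySettings:
    """Apply the most protective defaults to accounts belonging to minors."""
    if age < 18:
        return SafetySettings()  # protective defaults, enabled without user action
    # Adults may opt out of each protection individually.
    return SafetySettings(
        restrict_unsafe_search_results=False,
        discoverable_in_search=True,
        allow_dms_from_unknown_accounts=True,
    )

print(default_settings(15))  # minor accounts get every protection switched on
```

The point of the design is that the protective state is the zero-configuration state: a younger user who never opens a settings menu is still covered.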
The other is content moderation, which may be more difficult and costly in the metaverse.
“As potential harms become evident, regulators are able to impose greater punitive measures on service providers. Providers should be paying attention to AI and large language models (LLMs) that may be better able to moderate at scale.”
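As a rough illustration of that last point, the sketch below fans user messages out across workers so automated moderation can keep pace with volume. This is an assumption-laden toy: the classify function is a stub standing in for a real LLM or moderation model call, and every name in it is hypothetical.

```python
# Hypothetical sketch of moderation at scale. The classifier is a stub;
# a real deployment would swap in an actual LLM or moderation endpoint.
from concurrent.futures import ThreadPoolExecutor

BLOCKLIST = ("scam", "threat")  # stand-in for a learned model's judgment

def classify(message: str) -> str:
    """Placeholder for a model call returning 'allow' or 'review'."""
    return "review" if any(word in message.lower() for word in BLOCKLIST) else "allow"

def moderate(messages: list[str]) -> list[tuple[str, str]]:
    """Run the classifier over many messages in parallel worker threads."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        verdicts = list(pool.map(classify, messages))
    return list(zip(messages, verdicts))

for msg, verdict in moderate(["hello there", "this is a scam link"]):
    print(verdict, "->", msg)  # flagged items would be routed to human review
```

The appeal of LLM-based moderation, as the report suggests, is that the expensive human step is reserved for the small fraction of content the model flags rather than everything users produce.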