
The metaverse is the next venue for body dysmorphia online

That doesn’t bode well for the metaverse, where avatars are likely to be the primary way we communicate and interact with each other. Noelle Martin, a legal researcher at the University of Western Australia and coauthor of a forthcoming paper on Meta’s metaverse, is raising just such concerns. “If people are able to customize their 3D hyperrealistic virtual human avatars, or alter, filter, and manipulate their digital identities, [there is] a concerning potential to impact body dysmorphia, selfie dysmorphia, and eating disorders … [producing] ‘unrealistic and unattainable’ standards of beauty, particularly for young girls,” she said via email.

That fear is not unfounded. Facebook has been criticized for silencing internal research indicating that Instagram has a toxic effect on body image for teenage girls. A report in the Wall Street Journal found that the app’s focus on body and lifestyle content leaves users more susceptible to body dysmorphia. But in the metaverse, where avatars will be the main way to present oneself in many situations, vulnerable people could feel even more pressure to adjust the way they look. And Martin says that customizable avatars in the metaverse may be used to “inflame racial injustices and inequities” as well.

Meta spokesperson Eloise Quintanilla said that the company is aware of potential problems: “We’re asking ourselves important questions such as how much modification makes sense to ensure avatars are a positive and safe experience.” Microsoft, which recently announced its own metaverse plans, has also been studying avatar use, though its research has been heavily focused on workplace settings like meetings.

The prospect of metaverse avatars for kids raises a whole other set of legal and ethical questions. Roblox, the wildly successful gaming platform whose primary market is children, has long used avatars as the primary means by which players interact with each other. And the company announced its own plans for a metaverse last month; CEO and founder David Baszucki declared that Roblox’s metaverse would be a place “where you have to be whoever you want to be.” Thus far, Roblox avatars have been playful, but Baszucki said that the company is pursuing completely customizable ones: “Any body, any face, any hair, any clothing, any motion, any facial tracking, all coming together … We have a hunch that if we do this right, we will see an explosion of creativity, not just among our creators but also our users.”

Ultimately, avatars represent how we want to be seen. Yet there is no plan for what might happen if and when things inevitably go wrong. The technology has to walk a fine line, staying realistic enough to be true to people’s identities without threatening the mental health of the humans behind the avatars. As Park says: “We won’t be able to stop the … metaverse. So we should wisely prepare.” If the Facebook papers show anything, it’s that social media companies are well aware of the health effects of their technology, but governments and social safety nets are behind in protecting the most vulnerable.

Crane understands the risks of more realistic avatars for those who might have body dysmorphia, but he says the power of being able to see himself in the virtual world would be indescribable. “For me, the joy of seeing myself represented accurately would mean that I am not the only person who believes my existence is valid,” he says. “It means a team of developers also see the potential of me existing, as I look, as a man.”
