
Dangers of the Metaverse and VR for U.S. Youth Revealed in New Study


32.6% of U.S. youth own a virtual reality (VR) headset (41% of boys vs. 25.1% of girls).


By Gisele Galoustian | 10/22/2024

The metaverse, a space where the lines between physical and digital realities blur, is seeing rising adoption among younger populations. As of March, 33% of teens own a virtual reality (VR) device and 13% use it weekly.

With the metaverse offering richer and more immersive emotional experiences, youth may be particularly vulnerable to significant harm in these spaces, underscoring the need to explore potential risks.

Unfortunately, research on online victimization in the metaverse is sorely lacking. A new study by Florida Atlantic University, in collaboration with the University of Wisconsin-Eau Claire, is one of the first to examine experiences of harm in the metaverse among youth in the United States. Using a nationally representative sample of 5,005 13- to 17-year-olds in the U.S., researchers focused on their experiences with VR devices, including 12 specific types of harm experienced, protective strategies employed, and differences in experiences between boys and girls.

Results of the study, published in the journal , found that a significant percentage of youth reported experiencing various forms of harm in these spaces, including hate speech, bullying, harassment, sexual harassment, grooming behaviors (predators building trust with minors), and unwanted exposure to violent or sexual content. The study also revealed notable gender differences in these experiences.

Among the study findings:

  • 32.6% of youth own a VR headset (41% of boys vs. 25.1% of girls)
  • More than 44% received hate speech/slurs (8.9% many times); 37.6% experienced bullying; and 35% faced harassment
  • Almost 19% experienced sexual harassment; 43.3% dealt with trolling; 31.6% were maliciously obstructed; and 29.5% experienced threats
  • More than 18% were doxed (publicly revealing someone’s personal information without their consent); and 22.8% were catfished (creating a false identity online to deceive someone, typically for romantic purposes)
  • Nearly 21% faced unwanted violent or sexual content; 18.1% experienced grooming or predatory behavior; and 30% were targeted for factors like weight, sexual preference, sexual orientation or political affiliation
  • Boys and girls experienced similar patterns of mistreatment, but girls experienced sexual harassment and grooming/predatory behavior more frequently than boys. Boys and girls were equally likely to be targeted because of their voice, avatar, race, religion or disability.

“Certain populations of youth are disproportionately susceptible to harm such as grooming, especially those who suffer from emotional distress or mental health problems, low self-esteem, poor parental relationships and weak family cohesion,” said Sameer Hinduja, Ph.D., first author, a professor in the School of Criminology and Criminal Justice within FAU’s College of Social Work and Criminal Justice, co-director of the Cyberbullying Research Center, and a faculty associate at the Berkman Klein Center for Internet & Society at Harvard University. “Due to the unique characteristics of metaverse environments, young people may need extra attention and support. The immersive nature of these spaces can amplify experiences and emotions, highlighting the importance of tailored resources to ensure their safety and well-being.”

Findings also reveal that girls employed in-platform safety measures, such as “Space Bubble,” “Personal Boundary” and “Safe Zone,” significantly more often than boys.

“We found that girls are more likely to select avatars designed to reduce the risk of harassment and to use in-platform tools to maintain a safe distance from others. Additionally, both boys and girls feel comfortable leaving metaverse rooms or channels, like switching servers, in response to potential or actual victimization, although overall, youth tend to use these safety features infrequently,” said Hinduja.

Recommendations offered by the researchers include:

  • Using platform-provided safety features, including blocking, muting, and reporting functionalities, to restrict unwanted interactions and infringements upon personal space. It is essential that youth understand and take full advantage of these tools within metaverse experiences.
  • Continued research and development in these areas to determine how to meet the needs of users in potential or actual victimization contexts
  • Streamlining platform reporting mechanisms to ensure swift action is taken against perpetrators
  • Age-gating mechanisms for metaverse environments where mature content and interactions proliferate
  • Encouraging parents and guardians to familiarize themselves with available parental control features on VR devices and metaverse platforms so they can set boundaries, monitor activities, and restrict certain features as needed. An active mediation approach is ideal, in which they engage in open and supportive dialogue with children about their metaverse experiences.
  • The integration of updated, relevant, and accessible digital citizenship and media literacy modules into school curricula to provide youth with the necessary knowledge and skills to navigate VR and other emerging technologies safely and responsibly
  • Consideration by content creators of the ethical implications of their metaverse creations, ensuring that they promote inclusivity and respect and discourage any form of harassment. Creators should also strive to make their virtual experiences accessible to users from diverse backgrounds, languages, cultures and abilities.

“VR concerns of parents and guardians generally reflect and align with their historical anxieties about video games, excessive device use, its sedentary nature, cognitive development, and stranger danger,” said Hinduja. “There remains so much promise with these new technologies, but vigilance is required when it comes to the unique challenges they present as well as the unique vulnerabilities that certain youth users may have. As such, it’s ‘all hands on deck’ to build a safer and more inclusive metaverse as it continues to evolve.”

The study’s co-author is Justin Patchin, Ph.D., a professor of criminal justice at the University of Wisconsin-Eau Claire and co-director of the Cyberbullying Research Center.

-FAU-