One fundamental critique of capitalism is that the free market cares more about whether a product can be sold than it does about whether a product should be sold.
This very critique immediately came to mind when I read a Washington Post article detailing how big tech companies like Google, Apple, and Facebook have all invested in creating a so-called “metaverse” similar to the OASIS from the dystopian film Ready Player One.
This would allow consumers to enter an alternate reality depicted by visual, physical, and auditory stimuli that would be like a 360-degree, fully immersive form of the internet.
Such a product is bad news for consumers. And while I am not about to advocate for new laws to prohibit companies from creating this technology — such laws tend to backfire and often result in harmful central planning — I will say this:
Consumers should not buy into the metaverse. Period.
Perhaps my concerns seem undue — like I’m dwelling on a faraway fantasy and blowing it out of proportion, as many moral scares have over the course of history. But if the metaverse becomes a reality, it will threaten the existence of the very social interactions that make us human.
Yet just because a product is “ethically” produced doesn’t necessarily mean it’s right for the public to demand it in the first place.
And what if the damage to the user is less obvious than diabetes, cancer, or death? Immersion in the digital world is hardly harmless, but the damage is harder to quantify.
While it is still unclear what the metaverse would look like, we already know that there is a risk of addiction associated with technology use. Some of the worst side effects include “attention-deficit symptoms, impaired emotional and social intelligence, technology addiction, social isolation, impaired brain development, and disrupted sleep.”
All of this should be especially concerning in light of the fact that most of us have spent the last year zooming our way into work, school, and play.
All of these problems fall under the umbrella of brain health, and they don’t even account for the fact that technology use also encourages us to be more sedentary, which brings a whole list of attached health problems for our bodies.
How much worse would all of these problems become if the digital world fully surrounded our senses through virtual reality technology? While some versions of this product may foster physical activity, there is a significant chance entertainment-oriented virtual reality will tempt people to sink even further into their couches.
The essence of this concern is that, in many ways, the digital world is more appealing than the real one.
As a teenager, I can’t count the number of times I turned to the comfortable, hypnotizing effect of scrolling through my phone when I was upset or nervous. And I know I’m not alone: Deseret News reported in 2019 that 43 percent of teens admitted to using their phones to avoid social interaction. This kind of behavior can hardly be healthy!
It is deeply ironic that Apple CEO Tim Cook, who is part of the movement behind the metaverse, once said, “If you’re looking at a phone more than someone’s eyes, you’re doing the wrong thing.”
He’s right. It’s commonly known that face-to-face interaction is far more emotionally satisfying than digital interaction.
But if these companies successfully bring the metaverse to market, you can forget about eye contact. It’s likely many people will disappear into the digital world entirely.
But all is not yet lost. Joining the metaverse’s fake reality will only be appealing if enough people do it to socially normalize that form of digital interaction and make it rewarding.
At the risk of sounding anti-technology, I am begging all of you now: if or when it becomes available, do not participate in the adoption of the metaverse.
Laura Williamson is a political writer and a contributor for Young Voices. She studied consumer behavior while completing her undergraduate degree in business administration. A born-and-raised Coloradoan, Laura is temporarily residing in New Haven, Connecticut where she is studying ethics as part of her master’s degree at Yale University.