Last week, metaverse-happy Meta (formerly known as Facebook) revealed a change to its Horizon VR environment: an invisible "personal space" boundary around avatars that keeps others from getting too close.

That didn't take long, did it?

Recently, I wrote about Tim Cook's reluctance to dive headlong into the AR/VR space. His comments dealt primarily with security concerns. That's nothing new: websites and applications have long struggled to get data security right, and with differing national and regional requirements (like GDPR), it's an immense challenge.

But there's another potential problem with VR that Meta's recent change highlights: the behavioral danger of anonymity. This isn't a new phenomenon -- a study by Association for Psychological Science fellow Philip Zimbardo in 1969 showed a link between anonymity and abusive behavior. Joe Dawson surfaced the study in 2018, and it couldn't be more apt today.

Dawson notes subsequent studies, too, like one conducted in 2012 by Marek Palasinski at the University of Lancaster, which revealed a different alarming tendency: when placed in environments ruled by anonymity, people are less likely to help others who are facing abuse or harassment.

Granted, these behaviors are not givens -- chatrooms dominated the internet in the 1990s (and beyond), and while some unsavory behavior occurred, it wasn't enough to shut the internet down. Many people built positive, lasting relationships in those environments. Others were able to open up and be truly themselves, something they struggled with in a hyper-critical offline world.

But the opposite is also possible -- and given our past, probable, even. 

By all means, let's explore the possibilities of VR and AR. Let's see how they can improve our lives. But as we do, let's not forget two key elements of VR exploration: data security and human behavior.

As Meta has done, let's be quick to add safeguards, security measures, rules, and enforcement where necessary to make these new realities safe. If we don't do it now, while innovation is still firing, we may find ourselves in a very dark, unsettling place, litigating sexual harassment, abuse, and violence in digital spaces.