
Here’s how we can build a sustainable metaverse

We need a U.N.-like body to develop protocols that augment the trust and safety policies of every platform in the metaverse.

[Source photo: Ekaterina Bedoeva/Getty Images; fauxels/Pexels]

Every day, we read about another brand or technology company investing in the metaverse. Virtual worlds like Decentraland—3D digital societies free of physical impediments—will revolutionize how we socialize, shop, seek entertainment, and more, and they promise to be a major economic driver around the globe.

Less talked about is the metaverse’s digital sustainability. Web 2.0 taught us the devastating effects on people’s lives when platforms emphasize growth over safety. Fortunately, a growing number of smart brands and platforms realize that trust is a strategic growth driver and brand differentiator. In other words, safety, privacy, and inclusion, the three pillars of digital trust, are not only for good but also for growth.

A $30 TRILLION OPPORTUNITY

It’s safe to assume that the metaverse will be a major economic driver all over the world. Early predictions suggest that the metaverse will grow into a $10 trillion to $30 trillion industry over the next decade.

But if there’s one thing we’ve learned from Web 1.0 and Web 2.0, it’s that early predictions are vastly underestimated. Once human creativity and innovation are turned loose, it’s impossible to predict where they will take us. That’s why it is so important that we don’t squander this opportunity by repeating the mistakes made with Web 1.0 and Web 2.0.

Sadly, we are already seeing similar problems arise, including racist NFTs and abuse in the metaverse. Within a week of Meta, the parent company of Facebook, opening its virtual reality platform to beta testers, a woman reported being sexually harassed.

We’ve seen how abuse in social media can lead to life-altering consequences for its victims. In the metaverse, it will be worse. “[Virtual] reality plunges people into an all-encompassing digital environment where unwanted touches in the digital world can be made to feel real and the sensory experience is heightened,” The New York Times warned at the end of last year. It’s a warning we must heed: The more our identities and interactions move into this immersive environment, the more severe the impact of toxicity will be on our well-being.

NEEDED: A UNITED NATIONS FOR THE METAVERSE

The Second World War taught us the horrors of allowing terrible societal ills to go unchecked, including hate speech and othering. We vowed never again, and established the United Nations, which in turn developed protocols that promote societal and economic benefits and equality for all. These important protocols include the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights (1966), the International Convention on the Elimination of All Forms of Racial Discrimination, and the UN Strategy and Plan of Action on Hate Speech.

While adoption is voluntary, these protocols, written by experts who came together to share best practices, serve as guardrails that each nation can adapt to its own society.

We need a U.N.-like body to develop protocols that augment the trust and safety policies of every platform in the metaverse.

ASSETS IN OUR FAVOR

I am extremely optimistic that we can create a metaverse built on trust, privacy, and equality, because we have many advantages in our favor, chief among them the wisdom of hindsight. Web 2.0 showed us precisely where the dangers lie, and that gives us the opportunity to address them in a holistic manner.

We also have a relatively small, but extremely knowledgeable cadre of professionals with deep expertise in trust and safety, privacy, and ethical technology.

As I told Time, the January 6th insurrection was the result of not putting guardrails into place 15 years ago. We didn’t know in the early days of Web 2.0 that guardrails are crucial; we are clear on that today.

What’s more, we have a commitment to do better. In 2021, I co-founded the OASIS Consortium, which is a think tank made up of trust and safety experts from community platforms, industry organizations, academia and non-profits, government agencies, and advertisers. (Recently, the OASIS Consortium established the first-ever User Safety Standards. More than a dozen companies—including Agora, Dentsu, and Grindr—have committed to these standards.)

These individuals realize the entrepreneurs who will power the metaverse don’t necessarily have the expertise to build safety-by-design, fairness-by-design, and privacy-by-design into their platforms. That’s okay; we will provide that expertise to them.

GUARDRAILS FOR WEB3

The User Safety Framework is built around the 5Ps: priority, people, partnership, product, and process. All five will play a vital role in safeguarding the people who will enter the metaverse full of hope and trust, as well as the trillions of dollars that will be invested in it.

  • Priority. This recognizes that trust and safety is mission-critical to the organization and establishes it as a central tenet of the platform. This requires a C-suite level trust and safety officer with the authority and resources required to ensure all users are safe and welcome.
  • People. This refers to the teams who will develop and maintain clear community guidelines, as well as moderate user behavior. It’s important that the people who will develop all policies are representative of all users, regardless of who they are and where they’re located.
  • Partnership. As mentioned above, there is a growing community of trust and safety experts and professional organizations (like OASIS) who are willing to share their insights and develop safety standards for the industry to leverage. Partnerships also include technology providers that can augment human moderation efforts.
  • Product. Safety-by-design, fairness-by-design, and privacy-by-design are essential to digital sustainability. Platforms need to deploy up-to-date technology that proactively enforces safe and welcoming behavior outlined in the platform’s community guidelines.
  • Process. Platforms need to define comprehensive processes for an effective trust and safety operation. The trust and safety community has generously shared a tremendous amount of insight that platform providers can leverage to ensure their digital sustainability.

Our vision is an ethical internet built with safety, privacy, and inclusion at its core, where future generations trust they can interact, co-create, and exist free from online hate and toxicity. I believe we can make this a reality.


ABOUT THE AUTHOR

Tiffany Xingyu Wang is the president and co-founder of the Oasis Consortium, a nonprofit devoted to making the internet safer.
