We need somebody with a human touch
Ever considered going on a date in space? Perhaps you worry that your passion for accounting will not draw Cupid’s arrow. Or maybe you have a thing for topless bear wrestling … but are afraid of actual bears?
You may be in luck.
Flirtual is a virtual space that provides a diverse range of improbable dating experiences. This VR equivalent of Tinder lets users build dating profiles unlike any other. Freely choose your virtual dating identity, and enjoy romance whilst “swimming with sharks, chilling in a [virtual] cafe, or observing a black hole.”
This seems like harmless fun—playful settings, designed to realise equally playful fantasies. But these virtual identities merit taking seriously. Another platform, VRChat, made news over the impact virtual affairs have on offline relationships. What is cheating ... or even consent, when it happens online? We've been grappling with these nuances for quite some time: Second Life players encountered trouble when “cheating” on real-world partners with in-game ones. Some virtual affairs have led to divorce; others resulted in real-world marriage.
To understand what impact virtual identities can have on real ones, we need only point to two Second Life users, who famously set up a private eye agency to monitor other characters’ in-game activities for real-world clients. Methods involved setting a “honey trap” (using an attractive avatar as bait), following specific avatars incognito, and eavesdropping.
The expansion of such spaces, increasingly enabled by new immersive technologies, raises questions about how we should think about privacy online, both today and in the hypothetical “metaverse,” where different immersive spaces connect.
What do we expect in terms of privacy?
Let’s think deeper about what privacy actually means to us.
It’s easy to reduce privacy concerns to personal data sharing, but it’s a broader concept. It might be described as a basic shared belief that aspects of our lives should be discretionary: We decide who has access. Privacy in this sense is enforced through social convention: We hide parts of our bodies, choose not to disclose incomes, mourn in solitude, and consider the mundane details of family life to be personal.
There are exceptions. From nudist beaches and swingers’ clubs, to confessionals and support groups, certain spaces offer ways to selectively extend the generally personal to a wider group. Many social norms reflect double standards: Female nudity is more acceptable than male nudity in entertainment contexts, while the reverse is true in professional or neutral ones. Instagram’s controversial censoring of women’s nipples, but not men’s, exemplifies this. These norms are subjective, and it is often hard to say who sets them, or why.
But the word “subjective” is key. Helen Nissenbaum, a professor of information science at Cornell Tech, calls this “contextual integrity”—a concept in which privacy rights are dependent on “norms of specific contexts.” For example, we may consider it fine to be recorded on CCTV in a shop, but not in a changing room, at home, or in a confessional.
In online and fully virtual worlds, negotiation of space is more arbitrary, and surveillance tools are often invisible. To understand how privacy “works” for us online, we must wrestle with the fraught nature of anonymity, and the ability to disconnect information about our activities from our identities.
Privacy in Web 2.0
Anonymity has become central to the proliferation of user-generated content, through social media platforms, forums and online games—collectively referred to as Web 2.0, or the social web. This enables us to make a distinction between our “real-world” and online selves.
In principle, anonymity empowers people to express themselves freely online—by visiting websites that may be banned by local government, reading news beyond mainstream views, blogging about issues that could get them in trouble, exploring sexuality safely, or engaging in work that may compromise other work (to wit: in recent years, a few teachers—a historically underpaid group—have been sacked for having OnlyFans accounts).
Most technology platforms are built for anonymity, to an extent. “Anonymous identifiers,” like user ID numbers or the option to name accounts as you please, ensure we feel protected from unwanted identification. On an everyday level, I am satisfied knowing that other users do not know that I am “LordElrond123,” or that my Facebook profile photo is a cute octopus (though I am neither Lord Elrond nor a cute octopus).
Most technology providers assign various types of information (from demographics to behavioural data) to anonymous identifiers, which can be tracked between accounts and linked to real persons. For obvious reasons, tech platforms are wary of disclosing the true identities of users to third parties. In the US, however, they can be required to by subpoena.
Tech platforms often fight such demands, but in many cases, they comply. In 2022, after the United States Supreme Court overturned Roe v. Wade, Facebook notably released the private messages that enabled prosecutors to charge a woman with helping her teenage daughter secure an abortion.
The metaverse of it all
While excitement about the metaverse has ceded space in recent months to fervour over AI, the concept remains an attractive ideal toward which many tech companies are still working. A more broadly connected digital world—offering freedoms once reserved for physical space—is an exciting idea, in great part because it makes it easier to gather data and charge for access.
But new technologies and online spaces raise questions about privacy, not least because of the vast amounts of data that AR and VR hardware can collect about us.
It’s hard to conceptualise a metaverse. For now, let’s imagine the metaverse as a project to build an interconnected immersive world through a combination of augmented (mixed) and virtual reality. It’s also a set of technologies, legal frameworks and social norms (think NFTs, or digital asset ownership) that allow different online environments to connect across platforms, providing a continuous experience.
We remain distant from this ideal. AR and VR technologies are not as immersive, easy to use or affordable as they must be for popular adoption, and existing pre-metaverse virtual worlds are far from interoperable. It doesn’t help that the money thrown into the metaverse over the last handful of years has led to a proliferation of projects aimed at addressing imagined future customer needs, versus real ones.
Yet as attempts proliferate to build multi-functional virtual worlds, with complex internal economies, the nature of privacy concerns is changing. As our digital lives become more “real,” so do the risks.
Unlike less immersive digital spaces, like Facebook or Snapchat, pre-metaverse spaces like VRChat and Roblox offer more opportunities for illicit or abusive activities on-platform. Since the identity of online users is not checked by most providers, safeguards are almost non-existent. One journalistic investigation revealed that VRChat allows underage users into virtual strip clubs.
Other potential dangers include users mimicking virtual personas; stalkers who can track our virtual selves to learn more about who and where we actually are; cyberbullying; and cybersquatting—registering an account or domain that appears linked to a real person or entity before they can claim it, usually in order to ransom it back to them.
Such incidents will make anonymity increasingly problematic. While avatars were initially hailed as a mode of freeing people from social injustice, sometimes they exacerbate those same issues. As with social media, where users can fake their age, gender or location, inhabitants of virtual worlds are not always what they seem.
My fragile relationship with my avatar
The nature of privacy is already changing as digital experiences become more interlinked and immersive. In a world populated by avatars performing different functions—from virtual dating and gaming, to virtual work—my avatar herself will have privacy concerns.
The extent to which those concerns will be meaningful depends on the value we attach to our virtual lives. Today, online privacy is mostly about protecting our “real” offline selves from potential damage made to them online.
Take Metaphysic, a company that produces ultra-convincing deepfake videos of Tom Cruise, posted on a TikTok account that many mistook for the actor’s personal one. The technology now underpins a partnership with Creative Artists Agency to develop generative AI capabilities for the film industry.
These practices are as dangerous to brands as to individuals. A passionate former Kmart employee created a lifelike replica of a Kmart store on VRChat. The recreated environment proved so successful that many people, including the creator, work shifts at the virtual Kmart so other users can “shop” there.
While neither example shows malign intent, both signal how easily real identities and brands can lose control over their narratives (and even their capacity to "play" themselves), thanks to rogue activity that impressively mimics them. Yet anonymity remains central to encouraging users to experiment with their own personas—experimentation that adds value to our virtual selves.
One company built for that purpose is The Fabricant, a digital fashion house whose mission is, in the words of founder & CEO Kerry Murphy, to “build the base level infrastructure for a more democratic fashion industry.” It uses smart contracts—self-executing pieces of code, supported by the blockchain—to offer participants equitable ownership, splitting the value of each garment between the artist, the person who minted the item (for example as an NFT), and the buyer.
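The value split The Fabricant describes can be sketched in a few lines. The parties and percentages below are hypothetical: the source does not disclose actual shares, and a production version would live in the smart contract itself rather than in off-chain Python.

```python
def split_sale(price: float, shares: dict[str, float]) -> dict[str, float]:
    """Divide a garment's sale price among stakeholders.

    `shares` maps each party to its fraction of the proceeds;
    the fractions must sum to 1.
    """
    if abs(sum(shares.values()) - 1.0) > 1e-9:
        raise ValueError("shares must sum to 1")
    return {party: round(price * fraction, 2) for party, fraction in shares.items()}

# Hypothetical 50/30/20 split of a 100-unit sale (not The Fabricant's real terms)
payouts = split_sale(100.0, {"artist": 0.5, "minter": 0.3, "buyer": 0.2})
print(payouts)  # {'artist': 50.0, 'minter': 30.0, 'buyer': 20.0}
```

Encoded on a blockchain, the same logic runs automatically at every sale, so no party can withhold another’s share.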
At the World of Women NFT fashion fest—a Decentraland event, where members can showcase their collections—garments from The Fabricant were airdropped into “crypto purses” held by people in the virtual world. This language is designed to empower women in Web3, by providing an alternative to male-oriented language like “crypto wallets” (a way of holding, tracking and exchanging cryptocurrency). Interestingly, many men joined the event, and enjoyed playing with their identity in ways they may perhaps not have considered in physical space.
The Fabricant is not worried about privacy; instead, it encourages a direct link between those who mint virtual garments, and designers and owners, by including all three on the blockchain on which each NFT sits.
Still, the psychological vulnerability of online actors will rise as the connection between my avatar and me grows stronger. If a large part of life is spent on any virtual platform, or linked series of them, we are likely to see issues around virtual reputation management, and the springing-up of services designed for emotional support.
Consider the “evil digital twins” problem—the idea that there may be copies, either of ourselves or our avatars, running around to bring us disrepute, or simply committing critical errors without human oversight, for example in manufacturing. The recent combination of AI and deep fake technology has created a rash of fake videos, featuring well-known people endorsing scams or conspiracy theories, that are increasingly indistinguishable from real recordings or endorsements.
In such an environment, how do we viably prove we are ourselves? Do we tacitly own our personas, or could more "productive" users of them win the battle against us?
Reputational risk increases as companies, such as Ready Player Me, allow users to create avatars that are fully interoperable between different VR gaming and metaverse environments, essentially allowing the user to maintain the same identity in different applications and settings.
The promised land of self-sovereignty
The solution seems to be to develop a more robust digital identity framework that somehow also helps us retain anonymity online. This has become the holy grail for regulators, tech platforms and decentralised applications alike. Implemented well, this could underpin a crucial sense of trust in new digital environments—key for the mass adoption of any new technology.
There are three types of solutions attempting to tackle this space: centralised private identity wallets, decentralised self-sovereign identity (SSI) projects, and a range of semi-decentralised government digital identity projects that apply some of the same principles as SSI.
CENTRALISED PRIVATE IDENTITY WALLETS
The first category is offered by providers like Facebook and Google, whose own accounts act as surrogate identities that can be carried into other online environments—like logging into Tinder through Facebook. These solutions have the advantage of being familiar and easy, but fall short of the privacy expectations to come.
DECENTRALISED SELF-SOVEREIGN IDENTITY (SSI) PROJECTS
SSI solutions, like PolygonID, allow users to confirm their identity without divulging personal information to third parties, reducing the risk of data leaks and their exploitation.
The Web3 community hopes such decentralised (mostly blockchain-based) projects will move us to a promised land, where we can trust the authenticity of others without compromising anyone’s privacy. Yet as Brian Trunzo, Metaverse Lead at Polygon, cautions, “even though we’re moving from Web2 to Web3, an open metaverse means that identity fraud, data breaches, and theft are bound to continue and possibly even worsen.”
Governments are also looking to regulate this space. Trunzo admits, “While privacy and anonymity are two deeply held values in Web3, users will eventually have to identify themselves to use certain services or comply with Know-Your-Customer (KYC) regulations. However, KYC can present a severe data risk depending on how this information is stored.”
Murphy provides a practical example. “When someone wants to take a loan or a mortgage, they must meet certain annual income requirements to be eligible. However, unlike traditional references, SSIs allow people to just confirm the fact of their eligibility—without disclosing their actual income or employer, for example.”
This could be meaningful in countries like France or Germany, where, regardless of financial eligibility, a freelance worker may not be considered for a rental lease or mortgage, simply because their professional lives appear less stable than contracted workers.
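Murphy’s scenario, proving eligibility without revealing income, can be sketched as a toy credential. Real SSI stacks use public-key signatures or zero-knowledge proofs; the shared HMAC key, issuer key name, and claim format here are illustrative stand-ins.

```python
import hashlib
import hmac

# Stand-in for the issuer's signing key; a real SSI system would use
# public-key signatures or zero-knowledge proofs, not a shared secret.
ISSUER_KEY = b"demo-issuer-secret"

def issue_eligibility_credential(income: int, threshold: int) -> dict:
    """The issuer attests only to the predicate, never to the raw income."""
    claim = f"income>={threshold}:{income >= threshold}"
    sig = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": sig}

def verify(credential: dict) -> bool:
    """The lender checks the attestation; it sees the predicate, not the income."""
    expected = hmac.new(ISSUER_KEY, credential["claim"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

cred = issue_eligibility_credential(income=52_000, threshold=40_000)
print(cred["claim"])  # income>=40000:True  (the figure 52,000 never leaves the issuer)
```

The lender can verify the issuer vouched for the claim, yet the applicant’s actual income and employer stay private.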
Whether decentralised self-sovereign identity succeeds depends on regulators. Murphy admits that SSIs must balance users’ capacity to control their data against public administration requirements; this is one of the main areas still in need of evolution. He also believes SSI solutions could “provide a suitable compromise for EU regulators who intend to enforce anti-money laundering laws regardless of their impact on privacy and anonymity.”
SEMI-DECENTRALISED GOVERNMENT DIGITAL IDENTITY PROJECTS
Simultaneously, digital identity projects are being pursued by regulators themselves—notably in the EU, where the European Commission launched a technical toolbox for citizens’ future digital identity wallets.
Murphy dismisses the idea that the EU’s digital identity project could replace private SSIs. “The principle of persistence, which implies that a user can freely dispose of their identity, can’t be fulfilled within the EU Digital Identity project when it comes to taxation, for example, because tax authorities need to know who exactly made each specific tax declaration.”
The jury is out on whether the public will value full personal data sovereignty enough to adopt it. In the end, we perceive privacy contextually. Regulatory uncertainty may limit the development of a unified metaverse. The latter is itself arriving in a fragmented way, as legacy Web2 social media giants attempt to secure positions in Web3 by designing “walled gardens,” where they control all the action ... thus reproducing all the issues Web3 seeks to address. Consider Facebook’s transition to Meta.
The lack of a reliable digital identity solution will push companies to make new virtual spaces more restrictive by design. Meta limits how closely avatars can approach each other by default, following a series of virtual harassment scandals and concerns voiced by Oculus VR users.
It remains to be seen whether freedom of self-expression, so keenly heralded by the communities currently involved in building and imagining the metaverse, will survive the compromises that must be made in order to deliver one.
Header illustration by Debarpan Das.
Michal Rozynek is a contributing society and technology author at L'Atelier. He spends his time thinking about the interplay between social change, technology, and policy, and holds a PhD in Political Theory from the University of Edinburgh, as well as an MBA from the University of Oxford—having published on topics ranging from nationalism to financial inclusion. He is also a veteran of financial technology, and founder of his own growth strategy and innovation consulting company, focused on fintech and payments.