Hundreds of people around the world are lining up at orbs to have their irises scanned for WorldCoin, Sam Altman's global identity project. But when proof of personhood also nets you roughly $50 in tokens and the promise of an anonymized digital ID, there may be reasons to accept.
Hundreds of people the world over are lining up to get their irises scanned by looking into a seemingly bottomless steel orb. It's not that they're trying to peer into the abyss and have it stare back; rather, most are hoping to collect 25 units (~$50) of WorldCoin, the divisive cryptocurrency that's the brainchild of Sam Altman and co. (yes, that same Sam Altman behind OpenAI and ChatGPT).
But while free money, even if it's digital, is always an enticing prospect, questions surrounding the technology have given detractors more than ample reason to urge caution. It's not every day you have to give up your biometrics to enter the party; dystopian fiction has been written on far less impactful premises.
Carrefour de Vicente Lopez right now: all those little heads are queuing to get their irises scanned for WorldCoin. How many of those Orbs are there in Buenos Aires? I've already seen loads. pic.twitter.com/yREdziCIsI (August 3, 2023)
Of particular concern to detractors is the fact that WorldCoin aims to build a "real human" database that is both an identity and a financial network: identity through iris scans; a financial network through the WorldCoin app, which doubles as a crypto wallet compatible with the project's own WorldCoin token (as well as the big two, Bitcoin and Ethereum).
As Altman sees it, the world's future hinges on being able to accurately separate humans from non-human entities (bots, AI agents, and the like), something known as "proof of personhood." Human relationships depend on an unspoken trust: you are human, and so am I. When you can't identify what's on the other side, how do you openly interact with it? What are the rules of engagement? Is it a person with a history, pain and experience, or a chatbot trained to spread misinformation? In a way, the uncertainty of whether you're talking to a human or a bot is a problem unto itself.
Of course, for AI companies such as Altman's OpenAI, there's also the "small" issue of data provenance. Until research conclusively shows that AI models can be safely trained on their own outputs without going MAD (Model Autophagy Disorder, the degradation seen when models keep feeding on their own output), AI companies have every interest in being able to separate data that came from a human being (emergent data, the kind naturally produced as a record of lived life) from data produced by an AI model (referred to as synthetic data).
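To make the link between proof of personhood and data provenance concrete, here's a hypothetical Python sketch of how a training pipeline could filter a corpus down to human-attested, "emergent" data. None of the names below correspond to any real WorldCoin or OpenAI API; they are purely illustrative.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: a training pipeline using a proof-of-personhood
# attestation to separate "emergent" (human-made) data from synthetic data.
# The field names and is_attestation_valid() are illustrative only; they
# are not part of any real WorldCoin or OpenAI API.

@dataclass
class Sample:
    text: str
    personhood_attestation: Optional[str]  # e.g. some personhood proof, or None

def is_attestation_valid(attestation: Optional[str]) -> bool:
    # Stand-in for verifying a proof of personhood against an identity registry.
    return attestation is not None and attestation.startswith("proof:")

corpus = [
    Sample("a forum post written by a person", "proof:abc123"),
    Sample("bot-generated filler text", None),
]

# Keep only data that can be traced back to a verified human author.
emergent_only = [s for s in corpus if is_attestation_valid(s.personhood_attestation)]
print(len(emergent_only), "human-attested samples retained")
```

The point of the sketch: if human-made data carries a verifiable attestation of personhood, separating it from synthetic data becomes a filtering problem rather than a detection problem.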
So there's also a vested economic interest behind WorldCoin: one that seems to be built on a positive feedback loop between collecting profits from AI products and distributing those AI-generated profits among the world's humans. It's good, then, that most of the project is open source: public scrutiny seems both necessary and desirable. And there are reasons why WorldCoin still isn't available in the U.S.
That database, and the where and how of the data being collected through WorldCoin orbs, is the most often cited complaint against the project. On the darker side of the story are reports of deception, exploited workers, and cash handouts, with forgeries and stolen identities already circulating on a growing black market. On the brighter side, perhaps, are the two-million-plus users on whom WorldCoin has seemingly built most of its database to date.
Digital literacy is also an issue: it's likely that some of the people scraping by below the $2-a-day poverty line were scammed out of their return, since you still have to be able to sell the WorldCoin you receive in your wallet and convert it into the currency you actually need, however many steps that takes. For some, that's too many steps. For others, it will mean receiving the equivalent of a month's wages, which can relieve a lot of pressure and perhaps even offer a genuine turnaround.
But then again, at what cost?
WorldCoin assures us that our identities are anonymized unless we expressly wish otherwise. The iris scan is reduced to a unique code at the orb, and identity checks run on a Layer-2 (L2) blockchain built atop Ethereum using zero-knowledge proofs, so proving you're a verified human doesn't have to reveal the underlying iris data. If correctly implemented, this approach should be "flawless" at anonymizing the scan, "scrambling" it into a data pattern that's unique (and because you did it in person, staring into the guts of a vaguely Oblivion-looking orb, there's your proof of personhood).
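For a rough feel of how an identity can be "scrambled" into something unique without exposing the biometric itself, here's a toy Python sketch of a commitment-and-nullifier scheme, the general family that zero-knowledge identity protocols of this kind draw on. Every value and function here is illustrative; a real system replaces the direct membership check with an actual zero-knowledge proof, and this is not WorldCoin's protocol.

```python
import hashlib
import secrets

# Toy sketch of a commitment-and-nullifier scheme for anonymous
# proof-of-membership. Illustrative only: in a real deployment a
# zero-knowledge proof replaces the direct membership check below,
# so the verifier never learns which commitment belongs to the user.

def h(*parts: bytes) -> bytes:
    digest = hashlib.sha256()
    for part in parts:
        digest.update(part)
    return digest.digest()

# --- Enrolment (conceptually, at the orb) ---
# The scan is reduced to a fixed-length "iris code"; only a derived
# commitment goes into the registry, never the raw biometric.
iris_code = b"placeholder-iris-code"     # stand-in for the real biometric template
user_secret = secrets.token_bytes(32)    # stays with the user
identity_commitment = h(iris_code, user_secret)
registry = {identity_commitment}         # public set of verified humans

# --- Verification (signing in to some app) ---
# The nullifier is unique per (secret, app) pair, so an app can reject
# duplicate sign-ups without learning who the person actually is.
app_id = b"example-app"
nullifier = h(user_secret, app_id)

# Toy check; the real scheme proves "I know the secret behind SOME
# commitment in the registry" without revealing which one.
assert identity_commitment in registry
print("verified as a unique human; nullifier:", nullifier.hex())
```

In designs like this, the app only ever sees the nullifier, which can't be linked back to the iris code or to the same person's activity in other apps; that is the sense in which the identity is anonymized.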
But it's easy to see why people could be wary: the possibility of an immutable, personal, non-transferable digital ID based on your (encrypted) iris scan being leaked, stolen, or misused carries far greater risks than losing a regular government-issued ID card; you can replace a card, but not your iris.
The iris scanning becomes even more of a concern when you consider that WorldCoin wants to become the world's leading (and, it would seem, only) digital identity system, which invariably means that governments and other third parties will also be able to verify your identity using WorldCoin's system and infrastructure.
WorldCoin, like OpenAI, is one of those companies whose projects are so tremendously impactful that they're almost certain to succeed, somehow, somewhere. For every person who dislikes the idea of AI or of a blockchain-based digital identity system, another sees potential in it. WorldCoin hopes (and expects) to hit one billion sign-ups by the end of 2023.