Yesterday, the Sam Altman-fronted Worldcoin officially announced that it had raised $115 million in venture capital. The raise looks like an atavistic last gasp for the kind of prestige-driven, slot-machine-structured Silicon Valley fundraising fostered by a decade of cheap money. That's because, whether on ethical or financial grounds, there is little rational case for supporting the project.
To review, Worldcoin’s pitch is essentially twofold. At its core is The Orb, a device that scans users’ irises so they can later confirm their identity online. The Worldcoin token, in turn, is intended to be distributed as a form of “universal basic income” (UBI) and is currently being offered as an incentive for early eyeball-scan volunteers.
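To make the pitch concrete, here is a minimal sketch of the “proof of personhood” idea as publicly described: a unique biometric scan gates a one-time token claim. The class, function and field names below are illustrative assumptions for this column, not Worldcoin’s actual implementation, which involves the Orb’s hardware and on-chain components.

```python
import hashlib

# Illustrative toy registry in the spirit of the Worldcoin pitch:
# one set of eyes -> one identity -> one token airdrop.
# Names and amounts here are hypothetical, not Worldcoin's real API.
class PersonhoodRegistry:
    def __init__(self, airdrop_amount: float = 25.0):
        self.seen_iris_hashes: set[str] = set()   # hashes of scans already enrolled
        self.balances: dict[str, float] = {}       # wallet address -> token balance
        self.airdrop_amount = airdrop_amount

    def enroll(self, iris_code: bytes, wallet_address: str) -> bool:
        """Register a scan; grant the one-time airdrop only if the iris is new."""
        iris_hash = hashlib.sha256(iris_code).hexdigest()
        if iris_hash in self.seen_iris_hashes:
            return False  # same person again: no second claim (the uniqueness check)
        self.seen_iris_hashes.add(iris_hash)
        self.balances[wallet_address] = (
            self.balances.get(wallet_address, 0.0) + self.airdrop_amount
        )
        return True

registry = PersonhoodRegistry()
print(registry.enroll(b"fake-iris-template-001", "0xabc"))  # True: first claim
print(registry.enroll(b"fake-iris-template-001", "0xdef"))  # False: duplicate eyes
```

Even in this stripped-down form, the design question is obvious: someone has to hold and protect the biometric-derived data that makes the uniqueness check possible.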
In just one of Worldcoin’s many missing premises, though, it is unclear how the Worldcoin token can be expected to hold any value for recipients once it circulates. It is extremely difficult to imagine how what amounts to an Ethereum-based meme coin with no apparent tokenomic model will be exchangeable for essentials like food and shelter over the long term.
That makes it easy to deduce that the UBI element of the project is simply window dressing for its real goal: solving the problem of digital identity. But in fact, Worldcoin’s approach to that problem is equally terrible, presenting a dazzling array of privacy risks and moral entanglements.
This duality is just one example of the deviously incoherent mess of motte-and-bailey rhetoric being used to pitch Worldcoin. The company’s messaging goes to great lengths to depict both a charitable project and an opportunity for immense profits (a deeply troubling two-step Altman also pursued with OpenAI).
It is the apotheosis of Silicon Valley’s dangerous delusion that it can both get rich and make the world a better place through the mass harvesting of data.
Exploitation is generosity
The danger of that self-aggrandizing mindset has become more and more clear as Worldcoin goes from proposal to practice. Even in this early stage, it is planting the seeds of global havoc and mass exploitation, under the guise of Western generosity.
MIT Technology Review interviewed dozens of participants in the early Worldcoin onboarding process now underway in 24 countries, including 14 developing nations. Its findings were damning.
“Our investigation revealed wide gaps between Worldcoin’s public messaging, which focused on protecting privacy, and what users experienced. We found that the company’s representatives used deceptive marketing practices, collected more personal data than it acknowledged, and failed to obtain meaningful informed consent. These practices may violate the European Union’s General Data Protection Regulations (GDPR) – a likelihood that the company’s own data consent policy acknowledged and asked users to accept – as well as local laws.”
Meanwhile in China, a black market for biometric iris data has reportedly emerged among users hoping to join Worldcoin’s wallet app, and, it seems likely, collect Worldcoin rewards. According to sellers, the data comes from developing countries like Cambodia and Kenya. In other words, Worldcoin’s fundamental model is already incentivizing privacy harms.
This isn’t just a moral question, either: the GDPR in particular is a serious regulation, with immense fines attached to violations. And while Worldcoin has downplayed the risks, its reliance on an army of Orb Handlers to onboard customers means manipulations will inevitably continue. That completely undermines Worldcoin’s promise to solve digital identity.
I’m reminded of a ‘70s-era cartoon from a Playboy Magazine clandestinely acquired in my antediluvian teenage years. The one-panel gag shows two lovers awkwardly entangled in hotel room bedsheets. Wedding rings on the nightstand imply they’re having an affair. The woman, whose exaggerated face conveys deep ennui, gets the punchline:
“Sam, darling – not only is this immoral, you’re doing it badly.”
The game is to be sold, not to be told
The $115 million fundraising round was led by a firm called Blockchain Capital. In conjunction with the announcement, Blockchain Capital general partner Spencer Bogart posted a short Twitter thread explaining the rationale for the investment.
The thread is cringingly vacuous and, intentionally or not, quite deceptive. Bogart opens by saying that he has “completely changed my mind” about his prior belief that “Worldcoin was some dystopian Orwellian nightmare” and a “noxious combination of hardware, biometrics, crypto and AI.”
But in the subsequent thread, Bogart offers absolutely zero rebuttal of those concerns. Instead, he simply argues that Worldcoin is “the most compelling solution we’ve seen to the decades-old [S]ybil problem” – that is, the digital world’s vulnerability to a single actor masquerading as many distinct identities.
Given that he offers no reassurances about the downsides of this “compelling solution,” Bogart’s implicit argument is that putting the biometric information of disempowered people in the developing world at immense and fundamental risk is a worthwhile tradeoff for solving digital identity.
This is particularly regrettable because it seems to overlook a vastly superior set of identity solutions being pursued across the crypto ecosystem, by people far more genuinely focused on getting it right than Sam Altman appears to be. They include decentralized, privacy-preserving and user-controlled solutions that would lead to vastly better outcomes in the long run.
But they’re hard to explain, while Worldcoin’s pitch goes down easy – as long as you don’t think too hard about it.
– David Z. Morris
@davidzmorris
david.morris@coindesk.com