World — a project involving giant orbs that scan your iris from the mind of everyone's favorite, trustworthy, "I promise this isn't for profit" tech guy Sam Altman — is launching today in the US.
The goal is to build a verified human network by turning scans of our eyeballs into a World ID, which will act as anonymous proof of human identity in the age of AI.
To borrow from Chandler Bing, could that be any more ironic?
Sam Altman, who once said about AI that "the bad case—and I think this is important to say—is, like, lights-out for all of us," and who also signed a statement saying "Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war," is now also the guy who wants to protect humanity from AI?
Take this quote from the World website:
"In the age of AI, World is a network of real humans built on an anonymous proof of human and a globally inclusive financial network."
Yes, Mr Altman, in the age of AI that you are creating! The goal of the project makes no fucking sense when it's spearheaded by the main guy behind the rise of generative AI, the tidal wave of slop it's producing, and its impact on jobs and creative output.
Isn't the conflict of interest blindingly obvious? I can't take much more of this.
Today, I'm sharing a slightly abridged version of a post I wrote back in the summer of 2023, when this project first launched (outside of the US, because it obviously raises all kinds of privacy and regulatory concerns — and rightfully so). It explains the premise and the concerns. I've added some more ranting at the end.
The crypto industry is rife with scandals, scams, rug pulls, shady characters and money-grabbing celebrities. The technology became the fundamental building block for "world-changing" concepts like NFTs and Web3, and for less-savory practices like helping organized crime move money and pay for illegal activities. Even the cases where it did something "well" have proven tainted — just look at golden boy Sam Bankman-Fried and the collapse of FTX, or Binance, now under investigation for alleged illegal practices, including fraud.
Everything remotely crypto-connected should be met with skepticism. Every new character who enters the picture claiming to "change the world" should be met with distrust until proven otherwise. None more so than the newest project on the scene, one that wants to scan your eyeballs to prove your "humanness" in return for crypto.
What could go wrong?
Worldcoin is a project co-founded by Sam Altman, who was then, and remains, the head honcho of ChatGPT developer OpenAI. The ambitions are lofty — how else will you suck up that sweet VC money, of which Worldcoin has already raised $115 million — with the moonshot being to "enable global democratic processes" and eventually fund a universal basic income. The pitch includes the whole Buzzword Bingo card. The first line of the whitepaper reads, "Worldcoin was founded with the mission of creating a globally inclusive identity and financial network, owned by the majority of humanity." Across the homepage, the rest of the card is covered: community, ownership and blockchain protocol. Check, check, check.
The project works as follows. Go to a location where an "orb" is present (the orb is the bowling ball-esque device in the feature image, without the devil horns and tail, obviously) and have it scan your eyeballs. This creates a "World ID," which Worldcoin describes as a digital passport proving you are unique and real while remaining anonymous. In exchange, you're granted 25 Worldcoin tokens, equivalent to approximately £40, which you can access in your crypto wallet through your smartphone.
Author note: I've no idea if you still get tokens for signing up during this next launch phase. If you do, CoinMarketCap puts the worth of one Worldcoin token at… wait for it… $0.002800.
On the one hand, there is some logic behind what the project aims to achieve. With AI continuing to blur the lines between what is real and what is generated, a system that could identify and validate actual human beings has merit. In theory, this "proof of personhood" could give individuals the ability to assert they are a real person, distinct from any other person, without revealing their real-world identity. If applied to other systems and platforms — banking or social media — it could help counter the AI models that make it harder to separate the humans from the bots. That's a big if; it's rare that these projects ever find a tangible use case.
On the other hand, you've just given your iris scan, something unique to you, to a privately owned crypto company for the paltry sum of £40. Let me drive it home: the crypto landscape is full of bad characters and bad actors. Do we think handing billions of people's biometric data to a company operating in an almost unregulated industry sounds like a good idea? No, it's a fucking terrible idea and certainly a recipe for a dystopian nightmare.
There are numerous reasons to distrust the project:
Tools for Humanity (the company behind Worldcoin) is another crypto firm pivoting to AI to propel digital currencies back to relevance after a miserable 18 months of market crashes and bankruptcies.
Sam Altman's baby, OpenAI, was trained on data from people and companies without permission.
Despite the promise of equitably distributed currency, Tools for Humanity has said about a quarter of its new digital coins are already earmarked for venture investors and other company insiders.
The terms and conditions say it all: there will be no refund or compensation if digital tokens are stolen by "hackers or other malicious groups" or if there is an "intentional or unintentional bug" in the open-source software they use. In other words, if your eyeball data is stolen, well, we hope your Worldcoin tokens provide some comfort, even though they'll almost certainly become worthless.
In a glimmer of hope, some agencies are at least concerned about the population giving up its identity so easily. The UK data watchdog, the Information Commissioner's Office, is "making further inquiries." Privacy campaign group Big Brother Watch has flagged the risk of biometric data being hacked or exploited. The Electronic Privacy Information Center (EPIC) argues that, free tokens or not, users are paying a high price by exposing their biometric data to the risk of hacking and exploitation. EPIC counsel Jake Wiener warned of the consequences: "Mass collections of biometrics like Worldcoin threaten people's privacy on a grand scale, both if the company misuses the information it collects, and if that data is stolen." Madeleine Stone, a senior advocacy officer at Big Brother Watch, told Reuters: "Digital ID systems increase state and corporate control over individuals' lives and rarely live up to the extraordinary benefits technocrats tend to attribute to them."
It's mental that this project even got off the ground; it's wild that it already has over 2 million "unique humans" in its network. The past few years of crypto are the biggest red flag you could ask for. Even before that, we learned the hard way from social media that our data shouldn't be a free-for-all for companies to exploit. We should do everything we can to protect it and be more selective about who and how we share it.
And yet, the moment "free money" is on offer, people are giving up their identity without so much as a second thought, hoping those 25 tokens will be worth what, $5,000? $10,000? If Worldcoin goes tits up or gets hacked — and you bet it will be on the radar now — the consequences don't bear thinking about, and no amount of money will make up for them.
I'm going nowhere near that orb.
Alas, we know how this goes. Millions of people will sign up and give away more of their identity to a few tech overlords who "promise" to only do good with it.
He won't. With rumors of a super app on the way, this World network, and of course OpenAI and its product suite, that's too much power, influence and control in the hands of one very untrustworthy person — so snake-like his own board tried to fire him. His constant overpromising about what AI will and won't be, his lack of transparency, and his ridiculous predictions about how much money it will make when, in reality, it burns billions of dollars just to lose even more billions, all paint a picture of a person you should not trust with any form of future ID.
Giving this man and his company scans of your iris — keys to your identity — and taking his word on privacy and protection, all while he relentlessly tries to build the very technology this project is designed to distinguish us from?
You’re braver — or dumber — than I am.
No way. No way in Hell. Just look at the 23andMe mess, as well as all the bottom-feeder tech recruiters who ask for my birth date and the last five digits of my SSN "because the client needs it for their system." I'd sooner trust my urethra to a pack of candiru than trust Sam Altman with my iris.
I hope all those rich venture capitalists lose all their money to snake oil manufacturing. AI is already billions of dollars in the hole and probably won't ever make any money. Meanwhile, the Musks of the world are slowly being exposed for what they are: spruikers, not inventors. And here's AI, selling us stuff we don't need or want, and wrecking the bits that might have driven innovation or made new stories: art-making. Just a bunch of American corporate clowns who would have passed on Star Wars to show The Other Side of Midnight — or, more likely, real-life Patrick Batemans.