Wearables have long been touted as the next big tech front.
But to this point, results have been mixed, with the scale tipping towards bust rather than boom. The only real successes have been the Fitbit and the Apple Watch. Google Glass was perhaps the most infamous fail, though Meta refuses to let the concept die with its Ray-Ban smart glasses. Lol. Over the last year or so, we've seen our tech overlords push into a new frontier of giant, neck-breaking headsets, and the result has been a very expensive disaster. The Metaverse, the digital environment these headsets were meant to grant us access to, failed to take off, and its failure killed their most practical use case. Apple is entering the sector early next year with a focus on Augmented Reality instead, so you can… do your spreadsheets while standing in a room ignoring those around you?
The outcome will likely be the same. Why? Because wearable tech is an industry full of self-hype with very little to show for it.
Now, enter the A.I. grifters, who believe they can crack the wearable conundrum by stuffing devices with A.I. capabilities. Over the last few days and weeks, we've seen glimpses of upcoming wearable devices, all cut from the same cloth: smaller, screenless, and positioned as fashion accessories (a move straight from the Apple playbook).
The most notable was revealed last week by a company called Humane. Formed by ex-Apple executives, the company appears to be following the Apple playbook to a tee. Just as the Apple Watch debuted at Paris Fashion Week, Humane revealed the "Ai Pin" (a not-so-subtle nod to Apple with the lowercase 'i') at this year's show, pinning it on supermodel Naomi Campbell as she strutted down the catwalk.
Humane's description of the device almost scores the entire buzzword bingo card:
The Humane Ai Pin is the screenless, standalone device and software platform built from the ground up for AI. The intelligent clothing-based wearable uses a range of sensors that enable natural and intuitive compute interactions and is designed to weave seamlessly into users' day-to-day lives. The device is privacy-first, with aspects such as no wake word and therefore no 'always on' listening, reflecting Humane's vision of building products which place trust at the center.
What that means in reality isn't entirely obvious. Until we see the Pin functioning in person, it's impossible to untangle what's real and what's self-generated hype. We saw snippets of what it does in a tech demo held earlier this year, but, as with all tech demos, it was hard not to be cynical. How much was staged, and how much was actually happening? In that demo, the device translated some speech and projected some text onto the user's hand. It could answer calls and access information, and Humane claims it doesn't need to be synced to a phone. And, of course, it has a camera, which will undoubtedly lead to some bad outcomes. A lot of unaddressed concerns.
We then saw a glimpse of the Pendant, which went up for preorders yesterday. Created by Rewind, it's described as a "wearable that captures what you say and hear in the real world and then transcribes, encrypts, and stores it entirely locally on your phone." Danger alert. It claims to offer features to ensure no one is recorded without their consent, but aside from the ability to pause recording, it remains unclear how it will do this.
Finally, we got to see the Tab. Released as a rushed prototype after news broke of the two devices mentioned above (only 100 were available for purchase), the Tab is a wearable A.I. device that "ingests the contents of daily life by listening to all your conversations." It appears that the Tab works through an app interface, and a user communicates with it by speaking aloud, with the device listening in. It's being pitched as an extension of your memory. Again, the "always on" listening is a huge red flag. During the reveal, the creator, Avi Schiffmann, claimed, "It's, like, pretty secure."
How very reassuring.
Devices like these clash with the way society actually functions.
Yes, they all claim to be data safe, storing captured data locally on your phone. But it's still being captured. It's still picking up surrounding noise, imagery, conversations, and potentially very private information. When Google Glass and even Meta's Ray-Bans were revealed, there was considerable backlash fuelled by privacy concerns. People were spooked by the notion that they could be interacting with someone who was filming them. The same applies to these technologies that record audio. In a way, it's almost worse, as they are far more discreet.
It creates obvious potential problems and obvious potential for misuse. Seriously, was the Rewind Pendant created in collaboration with the FBI? As one Twitter user wrote, devices like these will bring back sauna business meetings for fear of being wiretapped. The devices pose a serious question: how do they comply with privacy laws? Are they even legal? In many places, a person must consent to being recorded, so how will these devices ensure that? These details remain unknown, but we could hazard a few guesses. Perhaps they tap into A.I. to record only the wearer's voice. Or maybe they will rely on voice commands (though, presumably, these would be needed from all parties being recorded?) to dictate how and when they function. It all requires trust, something the A.I. world has yet to build.
Thanks to smartphones, we already live in a world where far too much is captured on screens, and plenty of what you've been up to has likely made its way into the background of someone's pictures or videos. But this feels like another step up. By masquerading as fashion accessories, smaller and less obviously tech devices, these wearables are more covert and more devious. It's one thing to see someone holding a phone; it's another not to notice the pendant around their neck or the tiny camera pinned onto their jacket.
Worryingly, I've already seen many comments and posts saying, "it's easier if we just accept that everything we do will be recorded." To me, that's fundamentally flawed thinking. Most of us aren't happy with the concept of Big Brother, especially when it's controlled by governments and private entities. So why is it different when the devices are in the hands of civilians? Why should we accept that and not challenge the handful of people who insist on pushing it on us? For too long, the policy has been "opt out," not "opt in." Perhaps, if anything good can come from a slew of A.I. devices, it will be meaningful changes to laws that force these companies to be 100% transparent about what data they capture, why they capture it, and what they do with it. But that seems like nothing but a pipe dream. And without it, with these devices running wild, the next wave of wearable tech comes with serious societal ramifications.
A.I. is pushing us to confront some serious, society-defining questions. The problem is that most of those building in the space, driven by greed, venture capital and a giant dollop of FOMO, aren't stopping to consider the answers.
If Google Glass users were dubbed “Glassholes,” what will the users of these devices be called? Let’s hear your best ones.