The Rise of CreepTech
In simpler times, we never had to consider a loved one's privacy when gift buying.
But as technology has continued its often-thoughtless advance, many companies and the products they create have failed to deliver adequate data and privacy protection. Now, the tech gifts we buy each other are turning our homes into data-mining facilities. What was once an innocent speaker now listens to your mother's conversations. That cute AI robot you buy your children to hone their emotional regulation and self-confidence also records their conversations and shares them with Google and OpenAI. The exercise bike you get for your partner — if the gesture doesn't backfire spectacularly — will help them stay in shape while collecting and selling their data without permission.
The egregious practices of these companies are made worse by their attempts to bury the leaky privacy policies they poorly enforce. It's rare to see a company publish documentation on privacy best practices for its products; it's rarer still for a company to make that information easy to find. For most, it's intentionally hidden in fine print and spread across multiple pages and sites — if it exists at all.
Super creepy products
Mozilla's *privacy not included is a holiday shopping guide that puts users' privacy at the forefront. It's one of the best things about the holiday season because it names and shames the biggest offenders (my favorite part). Unfortunately, it's also a tool that will only grow more important year by year as consumers realize they must fight for data protection. As lead researcher Jen Caltrider summarized, "While gadgets may be getting smarter, they are also getting creepier and way more prone to security lapses and data leaks — even among leading companies like Microsoft, Amazon, and Facebook."
The guide uses a Creep-O-Meter to rate products from Not Creepy to Super Creepy. While a handful of products handle privacy well (or at least meet Mozilla's minimum requirements), an increasing number don't.
It will come as no surprise that Meta, whose track record for privacy is in the gutter, finds itself in the latter category. The researchers found all of the company's products — Oculus, Quest, Portal, Messenger, Messenger Kids (yuck), Dating and the Ray-Ban smart glasses — to be super creepy and slapped them with a Privacy Not Included warning. The worst offender on the list was the Facebook Portal, an AI-powered device with a smart camera and an Alexa-powered, always-listening microphone. A company that has stumbled from one scandal to the next and was once fined a record $5 billion for privacy failures, mixed with a monopoly that has its fingers in almost every bit of tech around? That's a bad combination.
Amazon's Alexa isn't tied only to Facebook's products; nearly 30% of the devices reviewed are powered by the assistant. That's a worrying sign, given that Amazon has admitted it collects some data even when a user tells it not to, and holds on to that data even after a user deletes it. And these devices collect a lot of data. Creepiest of all, they are always on, eavesdropping as they wait for the wake word.
Last year's biggest surprise came from home exercise technology. Peloton, NordicTrack and SoulCycle were all labeled with a warning for collecting and selling user data.
This year, the surprise to me was the increase in privacy-iffy tech aimed at children. Just as the tech world is busy AI-stuffing its products, kids' toy manufacturers are doing the same. Toys with AI chatbots or other AI features are becoming more popular, and many of them collect and share data to train AI models. Aside from the robot mentioned earlier, another area of concern is smartwatches, which are being marketed to parents of children too young for a first phone. The downside? An abundance of privacy concerns, or, in the case of the Angel Watch, the absence of any privacy policy covering the smartwatch or its app at all.
It's gross that even children are the targets of tech consumerism and, worse still, the victims of targeted data and privacy theft.
The consumer's choice: Privacy or product?
With AI being integrated into everything, and with the Metaverse, smart glasses and VR headsets on the rise, what does the future hold for our privacy? I asked Caltrider for her thoughts on where this is all headed. Her response was bleak.
"What are the prospects for our privacy long term if we stay on our current path? Not good. Our cars track our every movement, our headphones sense how our heads move, and our smartwatches can tell if we're pregnant or drunk. And with the growth of data-hungry AI and ever increasing surveillance, it will only get worse. In the United States (and everywhere around the world) we need strong, consumer-focused privacy laws to protect us from a dystopian future where privacy is a thing of the past."
It's a fair concern.
As we continue integrating the internet — and now AI — into more of our devices and cover our homes and lives with cameras, microphones, trackers, voice assistants and machine learning, the lack of data protection and privacy is becoming dangerous. The losers are the very individuals who use these products. While lawmakers seem to be waking up to the stark realization that many tech companies are too big and have too much influence and control, little will improve until they force change and hold companies accountable for our privacy and data rights. Why would companies take the initiative themselves? The devices and gadgets listed in Mozilla's guide will sell by the bucketload, just like they do every year.
But that, right there, is the one chance consumers have to turn the tide.
The change starts with us. Buying a gadget isn't simply buying a product anymore; it's a conscious decision about whether you trust the company behind it to work in your best interests and keep you safe. Your responsibility as a consumer is to be aware of this choice and act on it. In an ideal future, it won't require hours of research — whether done by the buyer or by organizations like Mozilla — to buy safe, responsible products. Just as we label alcohol and cigarettes with health warnings, perhaps one day we will label products and services with privacy warnings. At least then, it would genuinely be the consumer's choice to weigh the pros and cons.
But until then, we have to take a stand.
We can demand more.
We need to demand more.
We must demand that the companies building the products we buy take privacy and security seriously, and we must refuse to buy from those that blatantly disregard them. Yes, technology has vastly improved our lives, but that shouldn't come at the cost of our safety and our privacy.
So this year, give someone the gift of privacy.
It's the only way to make the internet — and our lives — a little safer in this digital world.