I'm going to start with a disclaimer: I don't give two shits about the Monarchy and the Royal Family. If I had the choice, I'd move to abolish it and put those deeply unhappy individuals trapped inside it out of their misery.
I also never assumed I would write a single word on the subject. But this week's "Kate Middleton saga" — has anyone claimed KateGate yet? — has transcended politics and royalty and entered the tech sphere.
After announcing a "planned abdominal surgery," Middleton has been absent from public view. Without any real information, the internet has turned to doing what it does best — inventing its own. Some believe she's dead. Some believe she ran away. Others think she is trapped inside the now infamous Willy Wonka experience. As the Monarchy is all about public image, the head honchos sensed the time was right to release a picture proving Middleton is alive and well. A nice Mother's Day family snap would calm the situation, right?
Holy crown-wearing shit, did it backfire spectacularly.
No sooner had the picture appeared online than a swathe of online sleuths began attempting to debunk it as fake, altered, or possibly even A.I.-generated. Look at the child's arm! Look at Kate's face — it's the same as, checks notes, her own face, which was photographed for another magazine! Look at the blurry edging of the window frame! Why is the boy's finger so twisted? (Okay, I admit this one is weird.) The list went on. Some photo detectives went overboard, claiming to have spotted countless discrepancies. The Associated Press even issued a "kill order" on the picture — an industry term for retracting an image.
It all but confirmed the image had been manipulated. But by whom?
Worry not, sleuths. The answer was simple and in front of you the whole time: Middleton herself edited it. In a short apology posted later, she wrote, "Like many amateur photographers, I do occasionally experiment with editing. I wanted to express my apologies for any confusion the family photograph we shared yesterday caused."
Drama over.
Well, not exactly.
The erosion of reality
As Charlie Warzel wrote, this is "an image that perfectly showcases the confusion and chaos of a choose-your-own-reality information dystopia."
Despite being given a picture and then a straightforward explanation for its oddities, many still refuse to believe it. Some people believe it was edited by a nefarious third party to cover up something dark; others are adamant the Royal Palace has a secret room filled with tech nerds producing A.I. images.
KateGate is a real-time example of a problem society is approaching at rapid speed and with little thought — once A.I.-generated images become impossible to separate from real ones, trust in visual media will be completely eroded.
With the rise of A.I. and the increasing speed at which it's been stuffed into every app and device in existence, this only goes one way — we're going to get to a point where nobody can believe anything. And even when evidence is provided, naysayers and disbelievers can chalk it up to "no, that's A.I."
What's real and what's not? Who is to say? How can we tell?
It's only going to get worse. A.I. imagery is already good enough to fool the untrained eye, and those pushing it forward will do so until it can create content that is impossible to tell apart from the real thing. The consequences are worrying: if you can't trust a single image, how does society function around that? If we can't rely on photojournalism, can we trust the news? If we can't use an image to prove something, does it exist?
There's only one option — some kind of baked-in "this is a real image" verification. A timestamp, or a hard-coded signature that records when the image was created, who created it, and what its original version was. A digital fingerprint, so to speak. Some companies are already moving on this. Several high-end camera manufacturers are working with coalitions of publishers and other stakeholders to build digital signing technology into their cameras that "signs" an image the moment the shutter is pressed.
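To make the idea concrete, here's a toy sketch of what capture-time signing could look like. This is purely illustrative — real provenance systems (such as the C2PA standard those camera coalitions are building on) use per-device private keys and certificate chains, not the shared secret used below, and the field names are my own invention.

```python
import hashlib
import hmac
import json
import time

# Hypothetical device secret for illustration only. Real cameras would hold
# a private key in secure hardware and sign with public-key cryptography.
DEVICE_KEY = b"example-camera-secret"

def sign_capture(image_bytes: bytes, author: str) -> dict:
    """Produce a provenance record the moment the shutter is pressed."""
    record = {
        # Fingerprint of the original pixels as captured.
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "author": author,
        "captured_at": int(time.time()),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    """Check that the record is untampered AND the pixels match the original."""
    claimed = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, record["signature"])
        and claimed["sha256"] == hashlib.sha256(image_bytes).hexdigest()
    )
```

The point of the design: editing the image changes its hash, and editing the record (say, the author name) breaks the signature — so any downstream platform holding the record can tell whether what it's displaying is the original capture.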
It's a good start, but things need to move faster. Big Tech has a huge responsibility to help, especially those controlling the social media platforms that amplify most of these images. I can envision an automatic community-notes-style pop-up saying, "this image is an original," or perhaps flagging it as potentially doctored. Do they really care, though? Social platforms survive off the traffic spikes that result from viral stories. Perhaps lawmakers could push A.I. companies to code a "made by A.I." watermark into the output their products create. I don't know what the solution is, but we need something to help us navigate the murky-as-hell waters ahead.
In a way, the KateGate scandal is more a source of amusement for many, an embarrassment for the Monarchy, and fuel for conspiracists. It's been surprising, perhaps shocking for some, but it's relatively harmless (providing she is indeed still alive...)
But with other photos, there's every chance that won't be the case.
On the Trend Mill this week
TikTok is banned! (maybe) — Today, the US passed a bill that will force TikTok owner ByteDance to sell the social media platform or face a total ban in the United States. It has 165 days to divest from TikTok; if it does not, app stores will be legally barred from hosting it. Wowzer. With TikTok so huge, it's a tempting acquisition for someone. The question is, who is buying it, and how many billions will it cost?
The power is in Altman's hands — In last week's edition of Trend Mill, I wrote about how OpenAI's response to the lawsuit filed by Elon Musk showed that openness was nothing more than marketing schtick. Altman has since been officially restored as CEO following an "investigation," and is now at the helm of a for-profit giant looking to gobble up money and power. The reason for his firing? Still unclear.
The numbers do lie — Despite what Elon Musk and Linda Yaccarino keep telling us, it seems X's traffic could be down more than 30% in the last year. According to new research, the social media platform also experienced declines in monthly and weekly usage. That's why we always take "official" figures with a pinch of salt; platforms only pick the metrics that paint the picture they want.