“Man bites robot dog.”

At least, that was the headline that stuck in my head after reading a freakish Sacramento Bee story about an AI-equipped mechanical mutt being credited for solving a crime that never occurred.

True then, true now: “All I know is what I read in the newspapers.” 

Maybe less so thanks to declining newspaper readership.

But no need to go there, and, while we’re at it, no need to serve up yet another tediously predictable AI rant.

After all, LLM technology, entirely based on predictive pattern recognition, could likely deliver a reasonably serviceable level of bitter all by its own damn self.

But here’s the question that hallucinating robotic canines bring to the fore—

If the companies pushing AI are going to be responsible for changing everything, shouldn’t they change it responsibly?

To put this in the narrow frame of my chosen profession: if you're going to fuck with advertising, shouldn't you fucking make it better?

Sorry for the spicy, but really.

We see the hazards all too clearly: decimated creativity and declining originality. 

Frictionless production symbiotically linked to programmatic least-cost/lowest-value media purchasing. 

Vastly amplified, ergo even more universally ignored, message overload.

The same shite, only doubled down, that’s resulted in close to zero advertising-attributable growth among major brands over the last 16 all-digital dog years.

The middle finger of blame for all of this, of course, doesn’t rest solely, or even disproportionately, on either the soulless digital platforms or, more recently, their AI-inventing counterparts.

The fault, dear clients and agency colleagues, lies not only in our magnificent seven superstars, but in ourselves.

So instead of backing up over roadkill, maybe we start asking ourselves, “what means better?”  

How about this for a spitball: we start by forever swearing off the addictive crack of the hype cycle and settle on a quality-centric understanding of what present-level AI can and should (and can’t and shouldn’t) be asked to do.

Then we, as an industry, engage with the developers and talk about how we address the mile-wide, inch-deep crap content in way too much of the output.

Finally, we recognize this is both a “learnable” and “teachable” moment and that, just as Apple, in its early glory days, did a whole lot of educating about whizzy new technology, the same need and same opportunity exists today.

Fantasy?  Maybe, maybe not.  

But if you listen closely, what the robot dog in the Bee story is saying is pretty freaking obvious.

If we allow ourselves to believe that good enough is good enough — in AI, or anything else — we’re just barking at ourselves.
