If it quacks like content

I do use AI, mostly as a research aid, but I have lots of issues with it. Issues with the forced and unheeding acquisition and exploitation of others’ IP, issues with the carbon impact, and issues with inequality and inequity — even before we get to the potential disruption to creatives’ livelihoods. But I got particularly and acutely angry last night.

I saw an ad for a website-building platform that now features generative AI. In it, copy flowed magically and effortlessly into a box promising what people would learn in the class being advertised.

But, you know… would they? This wasn’t the careful summary and positioning of a dedicated and passionate trainer who had developed a course to deliver key benefits to a customer and was now laying it all out to attract them; it was spicy autocomplete guessing at the kind of things one could learn at a class like this.

Deployed well, of course, generative AI (never mind AI designed to sift insight from large amounts of data) can be a powerful and useful tool. But I worry so much about this tsunami of vaguely content-flavoured slop rushing towards us, and about how much it will smash in its path when it makes landfall.