The merest mention of "AI" and people seem unable to locate their usual critical thought processes.
As any remotely sentient consumer of information knows, it is essential to consult a number of different sources in order to obtain the widest possible grasp of the subject at hand.
It's the same process we follow in selecting the links of interest we put in our newsletter. We aim to cover publishing in all its manifestations, and to find things which affect our industry both broadly and precisely. For that, all kinds of sources are read and considered, and proper journalistic rigour is applied in trying to get a good mix of information and perspectives.
What is apparent from this process at the moment is the extent to which AI developments, in all varieties, are dominating the publishing news agenda. This domination is of course true of many sectors of human activity at the current time, yet in our role as largely involuntary donors to the river of data required for LLMs to work, we are particularly exposed to it.
Yet the volume of coverage is out of all proportion to the real-world effect. If anticipation could be bottled and sold, the AI hype machine would cause a global glass shortage. Politicians, anxious to be associated with the current thing, offer us the promise of AI solutions with roughly the same relationship to daily reality as a Cro-Magnon tribal leader showing her top spearmen a mud sketch of the Burj Khalifa as a future housing option.
We've found ourselves in a situation where the Fear Of Missing Out has become a guiding principle, not a rational consideration. Incremental adoption of a new technology is how it's actually happening out there, in the places where it makes sense, and where it can increase productivity.
An interesting study from Anthropic is illustrative of what is actually occurring in the working world. Looking at real-world usage of Claude, they found that around 57% of categorised professions use AI for 10% of tasks. That sounds about right, as for the non-expert user such systems have an augmentation use case. That is, they can perform some functions well, most likely those at the more tedious end of work. They augment the human. All credit to Anthropic for actually allowing real data to be used, not numbers tampered with to excite stock markets.
The same study also illustrates the unreliability of predictions around AI use. It has been assumed that healthcare applications would be a fertile area for such tech. However, as the study notes: "prediction of higher usage in industries like healthcare has not yet materialized in our data". That obviously isn't to say such systems won't find further use, but, like any other tool, they have to justify that use. No one uses a tool simply because it exists.
Likewise this week, and closer to home for publishers, a BBC study has shown that four leading AI chatbots (ChatGPT, Copilot, Gemini, and Perplexity) are unable to reliably summarise BBC-sourced news stories.
This lack of accuracy isn't the stuff of the fact-checking wars that dominate a large part of media discourse, those being ideological and about opinions. The inaccuracy illustrated by the BBC concerns good old-fashioned facts, such as saying that particular UK politicians were still in power when they weren't.
According to Deborah Turness, the BBC's CEO of News and Current Affairs: "The AI assistants introduced clear factual errors into around a fifth of answers they said had come from BBC material. And where AI assistants included 'quotations' from BBC articles, more than one in ten had either been altered, or didn’t exist in the article."
The full research is here.
This isn't the future we were promised, is it? In fact, it's not even the future we are being promised every day right now, and sadly many lawmakers have bought into it.
A leading politician in the UK, well, the Prime Minister no less, recently dangled the tantalising technological possibility of AI being used to fix the nation's pothole crisis. What he was actually talking about was using data from roadside cameras to identify what needs to be fixed. In other words, an unnecessarily high-tech way of identifying a rather obvious problem, rather than providing the resources to fix it. The proverbial technological magic bullet that will somehow fill in potholes in place of non-existent workmen.
Technology used as a distraction from issues that are simple to solve but annoyingly require real application, effort, and focus. Sound bites rather than cement mixers.
There's a common theme of detachment from reality in this. How it's supposed to be isn't how it is. A tool will come into common use in a profession if it's the right tool for the task, not because it's new and shiny. A pothole still requires humans to fill it in or resurface the road, and, as it turns out, a BBC-sourced story is still best read on the BBC site.
Publishers, then, need to be sure they're not opting in to the Fear Of Missing Out. On the strength of the above, the industry's prime function as a provider of original content is still our defining feature, whatever technological noise there is, and playing to that strength is the route to success. Just don't drive into an AI pothole.
How does Glide Publishing Platform work for you?
No matter where you are on your CMS journey, we're here to help. Want more info or to see Glide Publishing Platform in action? We got you.
Book a demo