New AI features introduced by Apple aim to bring the technology to wider consumer use. But it's in danger of treating human creativity as a problem to be solved.
Reading Apple CEO Tim Cook's recent Wired interview about the new Apple Intelligence AI feature being rolled out on devices, the conclusion I reached was that writers shouldn't write about AIs which do writing.
Here, of course, I am not writing about AIs which do writing. I am writing about writers writing about AIs which do writing. Big difference. With that in mind, let's do it anyway.
The possibilities of Machine Learning (ML) systems are vast, and we're only just getting to grips with their use. In the week that Google announced its breakthrough quantum computing chip, I was reminded that no less a figure than Demis Hassabis of the company's DeepMind division recently suggested ML systems could be used for some applications previously thought to be possible only in the quantum computing realm, such as sub-atomic modelling, and more cheaply than quantum systems. That's technological promise.
Yet on the consumer level, while uses have been found for AI, one can't help but feel that for most people they fall into the novelty category, or relatively unglamorous data housekeeping such as emails or spreadsheets.
Apple are obviously trying hard to make their AI offering as consumer-friendly as possible, something Apple have typically excelled at. They're rarely first with a technology - the iPhone was by no means the first smartphone or phone with a touchscreen - but they aim to be a leader and standard setter from a product standpoint, and all power to them. A successful Apple idea can turn a struggling concept into something globally adopted.
Some of the things Cook said in the interview make me uneasy though. Writing is a pretty mysterious process, and in evolutionary terms is a relatively new one for our brains. For many people, and many of them smarter than me, it's a difficult task.
Is the answer what Apple are offering though? In the interview, Cook was questioned about an example usage Apple showed during a demo of the product in which a job application letter was improved by use of Apple Intelligence. As the interviewer put it to Cook, "If I’m a recruiter who hired that person, maybe I will feel tricked if they don’t live up to the professionalism of that letter?"
"I don’t think so. By using the tool, it comes across as more polished. It’s still your decision to use the tool. It’s like [us] collaborating on something - one plus one can equal more than two, right?" said Cook, presumably not an AI-suggested answer rehearsed in interview prep.
Well Tim, no. Many of us will have sweated over such a cover letter at one time or another. Swapping one word for another, trying to pitch the tone somewhere between "I am a born conductor of the symphony of human endeavour" and "I will be a solid team player and do exactly what I'm told".
When an AI system is employed to smooth the edges of such a piece of writing, I fear a terrible uniformity will be apparent if you're the recipient of hundreds of such tinkered applications, unless the job being applied for is using AI to write cover letters.
Asked if this means no one will have to learn to write a professional letter now, Cook's response was interesting: "These worries have been around for years. I remember when people felt like the calculator would fundamentally erode people’s math ability. Did it really, or did it make something more efficient?"
Well Tim, no. The thing with mathematics in general use is that there is a right answer, and only a right answer. A calculator saves time in getting there. That isn't the case with writing, even if the writers of the Communist Manifesto, for example, thought differently.
There isn't a single thing I've written that I would write the same way again. There's not even a single piece I've edited that I would edit the same way again. That's the creative process, flawed but human, and capable of staggering achievements, or even minor ones.
Likewise, this week saw the first commercially streaming AI movies revealed to the press. All the expense associated with actually filming something (sets, crew, cameras, actors) can be done away with using the technology, or that's the aim.
The reality is something different. Watching the content, it becomes apparent that it's not very good, as 404 Media's writer explains: "The emotion that I feel most strongly is 'guilt', because I know there is no way to write about what I am watching without explaining that these are bad films ... and the people who made them are all sitting around me."
If you are in media and publishing, it's almost certain you have tried some sort of creative writing use of AI tools, or images, video, audio, etc. As handy as AI tools can be in a busy work day, I cannot say soul and art are ever apparent in the outputs. And isn't that what movies are supposed to have, at the very least? I know plenty of human-produced movies lack those attributes, but, still.
In 1922, during what can be considered the nascent era of filmmaking, Nosferatu was released. The original remains a creepy watch, accentuated now by the age of the material and its methods of conveying terror. Next year, a remake of Nosferatu is being released, and it looks like it might be good.
If we agree that we are in the nascent era of AI, I get the feeling that no one in 100 years is going to be remaking any of the AI-created films and creative offerings we are currently seeing.
I will be interested to see when, if ever, that landmark is reached, but I don't yearn for it.
How does Glide Publishing Platform work for you?
No matter where you are on your CMS journey, we're here to help. Want more info or to see Glide Publishing Platform in action? We got you.
Book a demo