Artificial intelligence isn’t intelligent, and artificial is an understatement. Adrian Michaels says it’s time for journalists to go on the attack instead of feebly defending their trade against computer invasion
Have you seen the latest thing on the internet with artificial intelligence doing something amazing? There’s a bloke speaking German, only he can’t speak German. There’s a poem about global warming written in the style of Harry Styles. There’s a picture of the Mona Lisa, only she’s doing farming, or working behind a bar, or something.
Amazing, yes, in a circus freak kind of way. And it’ll just be another day or so before someone else stares at me seriously in a meeting and asks if I, and all other journalists, will soon be out of work because machines will be doing it all. They’ve seen ChatGPT produce 800 machine-generated words on why cricketers drop catches. Look! You ask a question and it all just comes scrolling in so fast. “Sometimes, fielders might underestimate the difficulty of a catch.” Amazing.
Some artificial intelligence programs are being used to do desk research quickly. Some of journalists’ traditional tasks will certainly be accomplished more efficiently as humans and machines combine their efforts. ChatGPT and its successors will get better, and some of their more obvious flaws – making stuff up and rendering the entire exercise dangerous, for example – will be ironed out. But for what the best journalists do – what they actually really do for their readers, their editors, their clients – the machines will be coming up short for a long time.
Drawing out the new
Recently we sat down with a client. They wanted us to concoct an article about something deep and complicated related to the future of global infrastructure, mobility, cities and energy usage. It could have been, and has been, the digitalisation of tax services in sub-Saharan Africa. Or a beautiful infographic about the diversification of pension-fund holdings in Mexico. Or a podcast about marine preservation and the sustainable technologies that are attracting investors.
The client wanted us to listen to in-house experts and help them come to a hitherto unexpressed opinion, a new storyline that wasn’t easily discoverable online. These engineers and consultants and strategists knew they were good, and did great work, but they didn’t quite know where to focus their story, or what their audiences would most appreciate.
To tease that out, we had to be alive to the nuances in their speech and facial expressions, to pick up on half-expressed thoughts that might give rise to better insights. For that, it’s better if you’re human and in the room, interviewing and listening.
We are experienced journalists here at FirstWord. We help our clients find their voices. We ask for anecdotes to help colour in the gaps and bring stories to life; we ask for personal stories and tie experiences into the tales we are trying to tell together. We theme-monger, we expert-whisper. Crucially, we also steer clients away from topics and angles that don’t fit, that aren’t consistent with what the organisation has said before. We help to enhance rather than damage reputations, we stick with the right tone, style and length. The Economist doesn’t want to sound like The Sun and The Sun doesn’t want to sound like The Economist. Haribo doesn’t want to sound like Goldman Sachs.
Agreeing all that comes before a word is written, or a picture drawn, or a podcast recording device activated. Then we write well, with rhythm, with style, with linguistic flair, with humour if appropriate, with flourish and tempo and resolution. It’s not just good writing or an arresting turn of phrase: it’s a new lens on a topic; in horrible marketing jargon, it’s got cut-through, it will engage.
Human expertise remains critical
When we ghostwrite for strategy consultants, we don’t just analyse a trend in an industry, we help to find the consultants’ must-have Dalek line: “You, reader, if you don’t do this in the face of these changes, will be exterminated.” When we write op-eds, we know that the best are frequently contrarian. By definition, the views expressed will be different from what is already out there, from what an AI program can find.
There are many things that ChatGPT and other similar models can do to assist the process of journalism, and the trade will evolve. Thinking about all the thoughts and steps and tasks in finding, researching, writing, checking, publishing and distributing a story, there are programs out there to help. But they will always need expert wrangling to get the best out of them. The wranglers will be journalists, not rushing into phone booths with loose coins and dictating their copy on deadline, not changing the ribbon on their typewriters, but still very recognisable in spite of the digital enhancements.
If you want something bland, something that everyone else has already said, then use ChatGPT. You’ll get a ready meal, not a Michelin-starred banquet. And even with the ready meal you’ll need to check all the ingredients are actually edible as opposed to, oh I don’t know, paper clips.
AI does have its uses – up to a point
While ChatGPT can’t by definition have original thoughts – it simply puts words in front of other words, copying patterns that already exist – it really can improve search efficiency with clever prompting. Ask for 10 restaurants in Japan and you’ll get 10 answers. Tell the AI you’re planning to go to Japan, where you’ll be, when you’ll be there and what food you like, and then ask for restaurants, and you’ll get better recommendations.
An AI program can get a journalist started with ideas. Ask it to write 10 headlines or social media posts for an article and a list arrives in less than 10 seconds. You probably won’t use any of them, but you will have ideas that will lead to the headlines you do use.
And if a journalist needs to summarise a document, AI will help by making it shorter and pulling out some of the main points. But it might miss the most important ones, so you still need to read the big original document.
It’s commonly held that AI programs can only get better, and that they will do so very fast. But continuous and swift improvement is not guaranteed. Content producers are already taking steps to protect copyright and shield their work from the clutches of AI.
Massive worldwide uptake of ChatGPT and other tools can, almost by definition, lead to a worsening of output. When AI answers a question, it can only use material that’s already out there, and it has a bias towards the stuff that’s most mentioned. As more and more of the ingredients for its ready meals come from material that AI has itself generated, rather than from humans, the material becomes increasingly repetitive and less insightful.
There is no doubt that for some ordinary non-journalism tasks – writing job descriptions, emails to colleagues, minutes of meetings – AI will save time and be a great relief to people who hate writing. And AI is there for people who just want to swamp the internet with poor, cheap content.
But ChatGPT can’t do what we do and there’s no need to be defensive about it.