AI will transform aspects of content creation, but human expertise remains essential
Generative AI tools such as ChatGPT are set to revolutionise some content tasks, but the guidance and direction of human experts will remain critical to keeping content original and of high quality
The sound of jaws hitting floors accompanied the mid-March launch of OpenAI’s artificial intelligence (AI) model, GPT-4. The new ‘generative AI’ demonstrated its ability to do everything from writing sonnets to turning a napkin sketch into a working website and passing the US Bar Exam. It was a huge leap from GPT-3.5, which had itself been amazing ChatGPT’s 100 million-plus users with its extraordinary ability to answer questions and generate text.
In the same week came two announcements that are perhaps even more significant. Microsoft unveiled Copilot, an AI assistant that Microsoft 365 customers can use to summarise meetings and draft documents, and Google released a similar tool for its Workspace apps. Adding AI to apps that millions of us use every day at work could make us more productive while accelerating AI development. As Charlie Beckett, Professor of Media and Communications at the LSE, puts it: “If your mind is not boggled by the potential of this, then you haven’t been paying attention.”
However, advances in AI are not about to make a slew of office activities and outputs redundant. Think of generative AI as a smart intern. It will save time on lots of simple tasks. But without careful management the result will be sloppy work, potential reputational damage and even legal trouble. Below, we dig into AI’s benefits for content development and planning – as well as its limitations and the areas in which it will not be able to imitate human specialists.
What generative AI does
The first step is understanding the technology. The term ‘artificial intelligence’ covers so many applications that it’s unhelpful. Plus, it conjures images of HAL from 2001: A Space Odyssey or The Avengers’ Ultron, depending on the era from which you take your cinematic references. Either way, those AIs turn against their human masters with catastrophic consequences.
Generative AI models are dull by comparison. GPT, which stands for Generative Pre-trained Transformer, is a large language model (LLM). It, and models like it, generate text, images, video and sound based on the huge datasets on which they have been pre-trained. Though OpenAI is cagey about GPT-4’s training, GPT-3 drew on a 45-terabyte database – roughly 2,000 times larger than the entire English-language Wikipedia. The final part of GPT, transformer, refers to the neural-network architecture that underpins these models, which excels at tracking the relationships between words across long stretches of text.
But generative AI is no more truly intelligent than an artificial leg is truly a leg. Both imitate some functions of their flesh-and-blood archetype, but neither does as much, as easily, or with as little preparation.
How generative AI works
At its simplest, generative AI resembles autocomplete on your phone. Type “I put the cornflakes in the bowl and poured on the…” and the AI will probably add “milk”. Rarely, it might suggest “orange juice” or “sympathy”. A simple trick, but AI can repeat it with sentences, paragraphs and entire articles. Its uncanny accuracy comes from its understanding of context – how word choice changes based on who is writing, the intended audience and the type of publication.
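To make the autocomplete analogy concrete, here is a toy sketch in Python. The phrase and the probabilities are invented for illustration; a real model learns distributions like these from billions of examples and can condition on any context, not just one hard-coded phrase.

```python
import random

# Invented next-word probabilities for a single context. A real LLM
# learns these from its training data rather than a hand-written table.
next_word_probs = {
    "I put the cornflakes in the bowl and poured on the": {
        "milk": 0.90,
        "orange juice": 0.07,
        "sympathy": 0.03,
    },
}

def complete(context: str) -> str:
    """Sample the next word from the learned probability distribution."""
    probs = next_word_probs[context]
    words = list(probs)
    weights = list(probs.values())
    return random.choices(words, weights=weights, k=1)[0]

print(complete("I put the cornflakes in the bowl and poured on the"))
# Usually prints "milk"; rarely "orange juice" or "sympathy".
```

Because the model samples from the distribution rather than always taking the top word, the same prompt can produce different completions – which is why ‘orange juice’ turns up only rarely.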
This capability could transform everything from how we shop to supply-chain management. Before long, children might have AI tutors and – don’t think about this too much – people could have AI relationships. “Generative AI,” say venture capitalists Andreessen Horowitz, “will be the next major platform upon which founders build category-defining products.”
Content production is no exception. For example, if you need information from a heap of documents, ChatGPT can summarise each one so you can decide whether the original is worth reading in full. There are already AI tools that apply a chat interface to PDFs, which is handy when you need to find the key points in a large document.
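For a rough idea of what that looks like in practice, here is a minimal sketch using the openai Python library; the file name, model choice and prompt wording are illustrative rather than prescriptive, and very long documents may need splitting to fit the model’s input limit.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # replace with your own key

# Illustrative file name standing in for one document from the heap.
document = open("quarterly_report.txt").read()

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": "Summarise this document in five bullet points so "
                   "I can decide whether to read it in full:\n\n" + document,
    }],
)

print(response["choices"][0]["message"]["content"])
```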
It is also already better than search engines for some queries. If you want to understand something specific, such as what an “f-stop” is in photography, ChatGPT will answer in plain English, saving you clicks on numerous Google results. Then you can ask clarifying questions, such as “What is focal length?” or “What is depth of field?” Keep in mind that, for now, ChatGPT’s training data cuts off in 2021, so it can’t provide topical information.
Finally, AI can help with your writing. You can give it some research and tell it to turn it into an outline, provide an outline and ask for a draft, or offer a draft and ask for improvements. Try asking it to write about your new product in the style of a leading marketer, an experienced journalist or Batman. Each change nudges the algorithms towards some words and phrases and away from others. However, it can also get facts wrong and misjudge the tone badly, so its output must be sense-checked.
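The style instruction is simply part of the prompt. The hypothetical helper below (the personas and product blurb are invented examples) shows how little needs to change to nudge the model towards different words and phrases:

```python
# Hypothetical helper: builds a restyling prompt for whichever chat
# model you use. The personas and draft text are invented examples.
def restyle_prompt(draft: str, persona: str) -> str:
    return (
        f"Rewrite the following product announcement in the voice of "
        f"{persona}. Keep every factual claim exactly as stated:\n\n{draft}"
    )

draft = "Our new widget cuts report-writing time in half."
for persona in ("a leading marketer", "an experienced journalist", "Batman"):
    print(restyle_prompt(draft, persona))
    print("---")
```

Whichever voice you pick, the output still needs the sense-check described above.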
A boring, hallucinating robot
Whatever you get back will probably surprise you with its quality, but it will also need more work. For a start, the writing is often bland compared to the work of a professional. It might be good enough for something formulaic, like a press release, but it falls short for a content marketing article that needs to fire the imagination. AI writing is “boring”, according to Tony Ho Tran of The Daily Beast. He writes: “The syntax is simple. There’s no style or flair.”
That’s because it is emulating its training data, much of which is poorly written and ineffective. Producing better-quality articles is why most people hire content marketers in the first place. At FirstWord our background is in newspapers, so we think of it this way: the intern might be given a short news piece to write, under supervision, for tomorrow’s paper, but there’s no situation in which they would be asked to write a big interview or think piece.
Furthermore, like that intern, generative AI has no meaningful knowledge of your business. It can’t say anything original about it because all it has to go on is its database. Nor can it ask you for the information it needs, because it doesn’t know what it’s missing. It can’t build human relationships or interview experts. For example, an important part of our process is interviewing company leaders to uncover the story hooks that insiders take for granted but which are compelling for readers. There is no AI version of the probing questions that draw out headline-grabbing thought leadership or find an exciting new angle on a familiar topic – and there might never be.
A trickier concern is that generative AI “hallucinates”, which is a computer scientist’s way of saying it makes things up. In Microsoft’s words, “Sometimes Copilot will be right, other times usefully wrong”. An answer is “usefully wrong” if it gives you an idea for a better example to use in your article, but if you don’t notice it’s wrong then it’s not useful at all. Then you’re in trouble.
You also need to check that it hasn’t swiped its material from elsewhere. US website CNET published numerous AI-written articles that turned out to be not just full of errors, but frequently plagiarised. This is another training data issue. The AI doesn’t know it shouldn’t lift a strong paragraph from an article in its database. Any human writer worth their salt does.
Why AI needs supervision
That doesn’t mean you shouldn’t use generative AI, just that you need to be careful what you use it for. Finding an original angle for a story, matching the article’s tone to the client and understanding appropriate use of sources are all things that an experienced writer can do faster and better than a non-expert using AI. Likewise, it takes an experienced editor to reanimate lifeless copy.
As with the intern, in some instances it’s a time-saver and in others you’d be better off doing it yourself or calling a specialist. As Santiago Valdarrama, a computer scientist, writes: “ChatGPT is flawed. I find it makes mistakes when dealing with code, but that is why I’m here: to supervise it.”
LSE’s Beckett agrees: “AI is not about the total automation of content production from start to finish: it is about augmentation to give professionals and creatives the tools to work faster, freeing them up to spend more time on what humans do best.”
Generative AI will be transformative for many tasks in every business. But unlike some areas that can be fully automated, content marketing is still a business of people talking to people. A new intern is a welcome addition, but the experts are still vital.
AI’s core strengths and weaknesses for content marketing
At FirstWord, we’ve never been asked so often whether a new technology is going to put us out of business. The answer is no. AI does many things well and will get better, but much of what we do is beyond its capabilities – and might always be. Below we summarise AI’s strengths and weaknesses from a content point of view.
What AI can do for content
- Research: You can be far more specific than with a search engine because AI understands detailed instructions. You can tell it what you already know and ask only for the missing data. But double-check your results because AI sometimes invents facts.
- Summarising: Give AI large text files and it can identify the main points or check for specific information. Tools like ChatGPT limit how much text you can paste in at once, but newer tools such as ChatPDF are built to handle much longer documents.
- Organising material: AI produces ideas tirelessly. Many will be useless, but some will spark inspiration. Try pasting in your notes and asking the AI for three possible article outlines (see the sketch after this list).
- Basic drafting: Press releases, meeting summaries, emails welcoming a new hire and so on are usually formulaic. Give AI the facts and it can produce a rough draft for you to polish.
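As an example of the outline trick mentioned under ‘Organising material’ above, this sketch assembles the kind of prompt you might paste into a chat tool; the notes are invented placeholders for your own research.

```python
# Invented notes standing in for your own research material.
notes = """\
- Key survey finding about how readers skim long articles on mobile
- Quote from our head of product on the release timeline
- A competitor launched a similar feature last month
"""

prompt = (
    "Here are my raw notes for a content marketing article. "
    "Suggest three possible article outlines, each with a working "
    "headline and three to five section headings.\n\n" + notes
)

print(prompt)  # paste the result into your chat tool of choice
```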
What AI can’t do
- Generating original ideas: Ask an AI for content ideas and you’ll mostly get familiar ones because it can only draw on what is in its dataset. True creativity still requires experienced people.
- Interviewing to draw out insights: Turning ideas into articles means interviewing experts to discover the details and examples that bring a story to life. AI can’t emulate even a novice interviewer’s skills, and despite the rise of Siri and Alexa, people still prefer talking to people.
- Bespoke strategy: No two companies are alike, and it’s in the differences that interesting stories are often found. AI finds commonalities, not differences, so it isn’t good at finding the niche stories that set a client apart from the rest of its industry.
- High-quality writing: Our articles can sit alongside the highest quality journalism, placing them streets ahead of the mean. The sheer volume of average – and worse – content means it dominates AI datasets and therefore it’s what AI tends to reproduce.
AI will help content marketing, just as it will other businesses. And as with anything mass-produced by machines, the quality will be sufficient for some uses.
However, those who want to stand out will always seek material that is handmade by experts at their craft – and that means keeping skilled humans in the loop.