One of our favourite plain language mottos adorns the wall in massive text at Write’s Wellington office. It’s from Sir Ernest Gowers’ book, Plain Words: ‘Be short, be simple, be human’.
It’s a motto that follows its own advice.
In the burgeoning age of ‘AI’ text generation, human writing for human readers is more important than ever. And that makes the WriteMark an even more valuable symbol of people-centric plainness.
Here’s why a quality mark for clear communication matters even more in the age of AI.
The WriteMark has always been a way to show your readers you care.
The heart-shaped symbol demonstrates your commitment to being clear, open, and customer-focused. It signals to your audience that you’ve gone the extra mile to ensure they understand what you’re telling them, which builds trust and confidence.
We think readers will particularly appreciate the WriteMark’s quality promise as AI writing proliferates. AI-generated text risks ‘infecting’ AI training data — the library of information that AI tools use to create their responses. This may degrade the quality of AI outputs over time, as they reinforce and amplify their own distortions and biases. Commentators have called this an ‘AI ouroboros’, after the mythical serpent that eats its own tail.
In this uncertain future of AI writing, the WriteMark will signify people-centric writing that gives readers confidence and helps to form human connections between author and audience.
In a WriteMark assessment, qualified experts read documents, assess them against 25 carefully selected criteria, and produce a report packed with insights and recommendations. They apply a critical eye, drawing on their experience and understanding — as both writers and readers — to identify what works and what needs work. This experience and insight helps to shape documents that serve their writers — and their readers.
AI can do some incredible things, if you know how best to use it. By drawing from untold libraries of human writing and thought, it can generate convincing text and images in the blink of an eye. It can educate and entertain, adapting its tone and language for any conceivable audience. But AI is not critical, creative, or insightful — not yet.
AI can provide lots of helpful advice for some of the more mechanical aspects of plain language, like sentence structure and word choice. But humans can still do a few things better — like thinking.
‘Artificial intelligence’ is a bit of a misnomer, because tools like ChatGPT and DALL·E 3 are not thinking or creating. They draw on vast sets of training data from the web and use predictive patterns to spit out realistic answers to prompts.
This means AI would struggle to meet or assess some WriteMark criteria, especially the big-picture elements. It takes critical thought to determine whether a document truly works for its purpose and its readers.
AI is improving constantly, and quickly. But making these judgements requires critical analysis and holding the ‘big picture’ in mind — skills that today’s AI tools can only imitate.
Our assessors have another advantage over AI tools — their Kiwi cultural context and sensitivity.
AI tools draw on training data from all corners of the internet. This means they tend to replicate and reinforce existing biases in that data. Aotearoa New Zealand represents a tiny corner of the internet, so our cultural differences are easily overwhelmed by American and European norms in AI’s predictive patterns.
Why does this matter? One element we assess for the WriteMark is whether the document has an appropriate style and tone for its audience. Aotearoa’s cultural context is different from the rest of the world in lots of small ways — as well as the big ones, like the role of te reo and te ao Māori. The words we use and the way we express ourselves are distinct, as are our history, economy, politics, and culture.
AI tools are liable to get these small things wrong, because they draw from the wilderness of the World Wide Web. As well as setting the wrong ‘style and tone’ for our specific cultural context, relying on AI can lead to embarrassing and even offensive errors.
As well as using human experts to assess documents for the WriteMark, we use human non-experts for the WriteMark Plus, testing how well a document serves its real readers.
User-testing with real readers always uncovers unforeseen sticking points: the kinds of issues that writers, experts, and AI tools alike fail to predict.
AI is clever, and convincing. But there’s simply no substitute for testing a document with its target audience.
While human expertise can’t be beaten when it comes to the high standard of the WriteMark, we still recognise the value of AI as a powerful tool.
That’s why Write has added an AI Writing Insights workshop to our roster, and why we’re keeping up to date with advances in the field.
Check out our workshop, AI Writing Insights: Balancing Opportunity and Risk.