
Artificial intelligence is no longer confined to the realm of tech; it is increasingly embedded in the world of books. From writing assistants to AI-generated cover art and machine-narrated audiobooks, the influence of AI in publishing is impossible to ignore. But as this trend accelerates, so too does the criticism. While some hail AI as a tool for creative empowerment, others see it as a shortcut that dilutes authenticity, exploits artists, and poses serious ethical and environmental questions.
Writing with Machines: A Creative Shortcut?
Tools like ChatGPT, Sudowrite, and Jasper are now commonplace among indie authors. Some use them to overcome writer’s block or quickly produce outlines and blurbs. A smaller but growing number rely on these tools for bulk content generation, sometimes pushing out entire novels in a matter of weeks.
But this convenience comes with a cost. Stories written, or co-written, by AI often lack emotional nuance and thematic depth, and readers can spot the difference. “It just feels empty,” one popular Threads user wrote about a Kindle-published novella they suspected was AI-generated. The prose may be grammatically perfect, but it is missing the spark that makes writing feel human.
AI Cover Art: Judging a Book Before the First Page
One of the first signs that a book might have been created with AI? The cover. AI-generated book covers have flooded self-publishing platforms, often featuring uncanny facial expressions, distorted hands, or oddly generic landscapes. While some look polished at a glance, readers are increasingly trained to spot the tell-tale signs, and it’s putting them off.
For many readers, an AI-generated cover signals that the book inside might be equally synthetic. This perception creates a credibility gap that can hurt genuine authors who use AI minimally or ethically. Meanwhile, artists and designers are losing commissions as publishers and authors turn to cheap, instant solutions.
The Environmental Cost of “Effortless” Creation
Behind the illusion of AI’s speed and simplicity lies an inconvenient truth: training and operating large language models like GPT consumes vast amounts of energy. A study from the University of Massachusetts Amherst estimated that training a single large AI model can emit more than 626,000 pounds of CO₂ equivalent, roughly the lifetime emissions of five cars (Strubell et al., 2019).
This environmental impact rarely enters the conversation in the writing community, but it should, especially when AI is being used to mass-produce content that may never be meaningfully read. Are we creating more stories than the world has time to absorb, at the cost of the planet?
Authenticity, Ethics, and the Reader’s Trust
The heart of the criticism is this: AI disrupts the emotional contract between writer and reader. Readers invest time and money in the promise that someone has laboured over a story with care and intention. When that connection is broken, when the words feel algorithmically assembled, trust erodes.
Additionally, AI-generated content often relies on data scraped from human-created works, raising serious questions about originality and copyright. Laws are still catching up, but organisations like the Authors Guild and Society of Authors have issued statements urging creators to disclose AI use and calling for legislative protection (Authors Guild).
Conclusion: Useful Tool, Not a Creative Replacement
There’s no denying that AI has some valuable use cases in publishing. It can assist with brainstorming, accessibility (especially via audiobook narration), and even help neurodivergent writers organise their ideas. But it should not, and cannot, replace the soul of storytelling.
The challenge moving forward is not to banish AI entirely, but to use it responsibly. That means crediting it, limiting its scope, and staying honest with readers. In an age where books can be generated at the click of a button, perhaps the greatest act of creativity is slowing down, and writing something only a human could write.