According to recent reports, Hachette, one of the world’s largest publishing houses, will discontinue selling Mia Ballard’s horror novel Shy Girl because the author has been accused of using AI to help write the book.
A Hachette spokesperson said that the publisher “remains committed to protecting original creative expression and storytelling,” adding that Hachette requires all submissions to be original.
Authors must disclose any use of AI during the writing process; failing to do so is a contractual violation.
Ballard has denied using AI but has mentioned that an acquaintance she hired to edit an earlier version of the book might have run the manuscript through AI tools without her knowledge.
Fiction authors have faced similar accusations before. For example, an author writing under the pen name Coral Hart confessed to self-publishing more than 200 romance novels in one year with the help of AI prompts. Hart, now an AI evangelist, says somewhat brashly: “If I can generate a book in a day, and you need six months to write a book, who’s going to win the race?”
Why do publishers and readers feel violated in such cases? Children’s author Malorie Blackman sums it up when she says: “Surely part of the pleasure of reading, listening to songs, watching films and dramas, looking at an artwork and, in fact, sharing any creative endeavour is that sense of connection with the content creator, that feeling that they are speaking to you on some deep, emotional level that is entirely absent when the work has been produced by AI.”
For most of us, the bond that Blackman extols comes from originality, a quality also stressed by publishers such as Hachette. Especially with fiction, we like to think that the work is born of an individual consciousness, not merely something that’s machine-generated.
However, there’s another way of looking at it. Is originality really what makes prose and poetry stand out? Consider two obvious examples: Shakespeare’s plays and T.S. Eliot’s The Waste Land.
Shakespeare famously recycled already-existing plots and characters in almost all his work, and The Waste Land is a vast assemblage of quotes, lyrics, and other fragments.
Of course, it would be absurd to claim that this is all that these authors, and many others, did to make their work memorable. The quality of their prose and verse apart, it’s about the way they combined and transformed the material to make it resonant.
When we admire such works, then, we respond to the shaping intelligence behind them: a mind that has chosen, arranged and reworked elements to intended effect.
However, juxtaposing and transforming earlier material is exactly what generative AI, and large language models in particular, sets out to do. So why should we prefer one to the other?
For a start, AI does not possess intention in the human sense. Its words do not arise from memory or feeling in the way a singular human voice does, weighted with experience and idiosyncrasy.
Further, AI draws on a vast scale of resources and produces output almost instantaneously. Together, these factors create text in which the human effort of shaping language is minimised.
The results feel like outright appropriation, even theft. (The copyright issues of where AI gets its data from are another debate.) What we want is writing that takes material and transforms it to make it deeper, stranger or even newer.
Not something that reads as though it has been merely reproduced, smoothed out or statistically optimised. The question, then, is not simply whether a work is original, or even whether the author has used AI. It is whether the work creates its own unique meaning, with choices that make it authored, not generated.
The challenge is not to reject AI outright, but to ask what kinds of writing still preserve those qualities. And whether authors and publishers are willing to defend them.





