When it comes to writing novels and short stories, most people know the landscape is extremely competitive. For many, the dream is to write the next fiction bestseller. But even those who persevere and complete a full-length novel soon discover the challenges that follow. Publishers screen countless manuscripts each month, and the vast majority never see the printed page. Recent innovations in artificial intelligence, however, have given would-be writers a shortcut, and many are turning to AI-written content and AI-generated stories as a less cumbersome way to navigate the publishing process.
(This Bold Business story was not written by an artificial intelligence.)
While aspiring authors happily experiment with AI stories, publishers cringe at the idea. Perhaps, if the AI-generated stories represented high-quality work, publishers would feel differently. But for most, the trend means higher manuscript screening volumes and lower yields. Even though identifying AI-written content is often easy, culling through a flood of poorly written manuscripts soaks up valuable time, time that would be better spent finding the true fiction “diamonds in the rough.” Publishing companies are doing their best to come up with solutions, but an ideal strategy has yet to be found.
“We are committed to publishing stories written and edited by humans. We reserve the right to reject any submission that we suspect to be primarily generated or created by language modeling software, ChatGPT, chat bots, or any other AI apps, bots, or software.” – Statement by Flash Fiction Online
The Recent Boom in AI-Generated Content
Over the last few months, the latest generation of AI chatbots has made major headlines, with platforms like ChatGPT and Jasper being used to produce a variety of AI-generated materials. In some instances, this has been welcomed, especially for writing programming code and marketing copy. In others, such as educational settings, the use of AI-written content is more controversial. And now these AI chatbots have reached fiction publishing. Many companies are being inundated with AI-generated stories, which are bogging down their screening processes. Some report a 15 to 20 percent increase in submission volumes since these chatbots were introduced.
(Dig into a comparative breakdown of the various AI-driven content creation apps in this Bold story.)
It’s worth noting that, as a general rule, these AI-generated stories aren’t that great. In fact, most editors and publishers can tell which submissions were written by a person and which by a bot. On one hand, most AI-written content is technically competent, with serviceable character development, setting, and conflict. But the storylines usually lack depth and a unique point of view, and the stories tend toward overly neat endings and melodramatic emotion. These limitations, along with a tendency toward repetition, usually alert publishers to the non-human source. Even so, the time spent making that determination adds up.
“I don’t want writers to be worried that I’m going to miss their work because I’m inundated with junk. The good stories are obvious very early on. The mind that crafts the interesting story is not in any danger.” – Sheila Williams, Editor of Asimov’s Science Fiction magazine
Industry Impacts and Concerns
For the publishing industry, the concerns run deeper than bad AI-generated stories. One of the more notable worries involves how the volume of AI-written content might crowd out real submissions. Smaller publishing firms have limits on how many submissions they can accept, and if too much of that quota is consumed by AI-written material, genuine authors may be deterred. Publishers fear that the overall quality of available content may decline, and in time that means fewer sales and fewer readers. Naturally, this could undermine reputations for quality.
At the same time, a flood of AI-generated stories bogs down editors and staff with non-productive work. When more time goes to screening out AI-written content instead of reviewing real stories, publishing costs rise: more time and energy are spent to surface the same amount of quality material. That squeezes profits and pushes toward higher sales prices, which is not ideal for an industry that has only just weathered the disruption of the digital age and knows consumers are unlikely to respond well to higher-priced content. It is another reason publishers want to rid themselves of the burden.
“Allowing authors to self-affirm if the work is AI-generated is a good first step. It provides more transparency to the whole thing, because right now there’s a lot of uncertainties.” – Matthew Kressel, Writer and Creator of Moksha online submission platform
Strategies and Potential Solutions Ahead
If an easy solution existed, publishers would obviously not be worried about the recent boom in AI-generated stories. One thing is clear, however: simply letting submissions of AI-written content grow cannot be tolerated long-term. Editors may be able to distinguish AI material from human work, but doing so manually is hardly cost-effective. Publishers would rather block these submissions on the front end than catch them later in the process. That is where the real challenge lies.
As a start, some publishers are requiring would-be authors to affirm that their submissions are not AI-generated. Naturally, these affirmations run on the honor system and are hardly foolproof. Because of this, some publishers have temporarily stopped taking submissions until they can work out a better process. Eventually, technology may be able to rapidly detect AI-written content, much as academic institutions now screen for ChatGPT, and that may prove a more practical solution. Until then, publishers will likely have to rely on their own ability to spot AI-generated stories efficiently.
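For illustration only, here is a minimal Python sketch of the kind of cheap pre-screen a publisher might run on an inbox before any human reads a word. It flags submissions with heavy phrase repetition, one of the telltale signs editors describe above. The function names, the trigram statistic, and the 0.08 threshold are hypothetical choices made for this sketch, not tools any publisher is confirmed to use; a real detector would be far more sophisticated.

```python
import re
from collections import Counter


def repeated_trigram_ratio(text: str) -> float:
    """Share of word trigrams that occur more than once in the text.

    Heavily repeated phrasing is one telltale sign editors cite, so a
    high ratio can route a submission to closer human review.
    """
    words = re.findall(r"[a-z']+", text.lower())
    trigrams = [tuple(words[i:i + 3]) for i in range(len(words) - 2)]
    if not trigrams:
        return 0.0
    counts = Counter(trigrams)
    repeated = sum(c for c in counts.values() if c > 1)
    return repeated / len(trigrams)


def flag_for_review(manuscript: str, threshold: float = 0.08) -> bool:
    """Return True if the manuscript should get extra screening.

    The 0.08 threshold is an illustrative placeholder, not a calibrated
    value; a real workflow would tune it against known human and AI samples.
    """
    return repeated_trigram_ratio(manuscript) >= threshold


if __name__ == "__main__":
    sample = (
        "The old house stood silent. The old house stood silent as the "
        "storm rolled in. The old house stood silent."
    )
    print(flag_for_review(sample))  # True for this deliberately repetitive sample
```

A crude heuristic like this would only triage the queue; it cannot replace the editorial judgment described above, and it would misfire on both stylized human prose and more polished machine output.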