Since AI-generated content first became accessible to the masses, many concerns have surfaced. Initially, some of the greatest worries related to job displacement and worker obsolescence. Others involved risks to educational systems as a result of AI-supported plagiarism. While these worries persist to a degree, there is now a growing fear of AI being used with harmful intent. Such uses range from the creation of malicious code to fake reviews that undermine the trust economy. One current example involves AI-generated travel books. From fake authors to fake accounts to fake travel reviews, consumers are being scammed. And there appears to be little anyone is doing about it.
The problem with AI-generated books and content isn’t new. Essentially, these risks and issues have existed since ChatGPT and other AI platforms were released. But the key problem isn’t the fact that AI is being used as a content-generating tool. Instead, it involves AI’s tendency to create false information or to be manipulated for fraudulent purposes. In recent months, this has become a growing problem with online, self-published books in particular. And while it involves many different genres, AI-generated travel books represent the latest example. Guided by fake travel reviews of these books, tourists are being conned into buying worthless resources. Examining this particular subset of books can therefore be enlightening when it comes to the major problems with AI-generated content.
A Flood of AI-Generated Travel Books
If you’ve ever searched Amazon or another online book platform, any topic can yield hundreds if not thousands of options. In many cases, consumers filter these results according to their interests and other factors. One of the most common filters naturally involves reviews and their star rankings. Prior to AI platforms, such strategies were rather effective in narrowing search options. But this is no longer the case for many items, including travel guides. In the past few months, a flood of AI-generated travel books has come to occupy search lists. And thanks to dozens of fake 5-star travel reviews, these books rank near the top. For the unwary purchaser, this often leads to buying a travel guide of little use.
The availability and accessibility of AI-generated content is the major reason this is occurring. However, it’s not the only reason. Certainly, anyone who wants to create their own AI-generated travel books can do so. But what makes this simple is the coexistence of self-publishing sites, easy-to-create fake reviews, and ample stock images. All the tools necessary are at everyone’s fingertips should they choose to use them. Not only does this pave the way for a growing wave of self-published authors, but it also invites others to create fake AI authors with zero accountability. This is how the current onslaught of fake travel books, fake travel reviews, and fake writers emerged. It might even add momentum to the return of brick-and-mortar bookstores.
(Physical bookstores are making a comeback! Read all about it in this Bold story.)
Strategies for Detecting AI-Generated Travel Books
In other sectors, the need for detecting AI-generated content has already been noted. Legal and judicial briefs generated by AI have been identified, with the attorneys involved sanctioned. Likewise, companies like Turnitin are actively developing detection software for universities. Interestingly, other platforms are now doing the same in a broader fashion. For example, the Accurate AI Content Detector and Plagiarism Checker by Open.ai takes any content and assesses it for AI-generated phrasing. Overall, it boasts a 99% success rate in detecting such content, including phrases in AI-generated travel books. At the same time, its false positive rate in labeling human-written text as AI is only 1.6%. These are the kinds of systems needed across the board to enhance accountability.
Of course, most consumers aren’t going to sign up for such platforms, especially when subscription costs are involved. However, there are ways to detect AI-generated travel books and the like based on certain features. Fake travel reviews and comments are typically very general in nature. They may also contain nonsensical words or go off topic. In author descriptions, past published works of AI-generated authors are usually lacking or hard to find. Fake AI author portraits also commonly have distorted attire, blurry or abstract backgrounds, and unnatural elements. This is particularly true of the region around the ears of AI-generated author images, where partial earrings or altered anatomy are not unusual. Understandably, this kind of scrutiny is more time-consuming for book buyers. But it could prevent someone from purchasing an AI-generated travel book that is useless.
A Call for Accountability
As likely appreciated, Amazon represents the largest online travel book seller in the world. It is also the biggest culprit when it comes to fake travel reviews of the books it sells. This occurs despite having an anti-manipulation policy for book reviews in place. In 2022, the company reportedly blocked 200 million fake reviews, with some cases resulting in book suspensions or bans. But based on the latest debacle involving AI-generated travel books, it’s clear the policy or its enforcement is lacking. Not only are fake travel reviews being posted in large numbers, but fake AI-generated travel books are as well. This is because Amazon has yet to develop an AI content policy concerning the books it sells. As such, Amazon fails to hold self-published authors or fake authors accountable.
Ultimately, it’s the consumer who is paying the price, both literally and figuratively. With AI-generated travel books, a customer will certainly lose the money spent should the book prove to be of little value. But even more importantly, some of these travel books could lead tourists into dangerous or unsafe environments. At a minimum, some verification process needs to be in place to let consumers know if content is AI-generated. This same process should also inform them whether authors are real or AI figments. And fake travel reviews, along with those for other books, need to be eliminated. From a business perspective, this falls on the shoulders of Amazon and similar platforms. Being proactive now will not only protect customers but also head off the government oversight that inaction would invite.
Did you catch Bold Business’ series on AI deepfake technology? Dig into one of the stories here!