How AI Will Cripple Authors’ Livelihood versus the Uncanny Valley
Something lost in all the talk of ChatGPT, Bard, and the newer programs focused specifically on writing fiction is that they all violate copyright law. AI isn’t generating original work; it’s taking work that is already done, tossing it in a blender, and spitting it back out.
There are thousands of writers doing exactly that right now. Either they don’t care that they are stealing, or it hasn’t occurred to them. And this article, and all the other articles, and all the complaining on social media, isn’t going to stop them. It is a reality.
It reminds me, on a much larger scale, of the author who bought hundreds or thousands of paid fake reviews in the early days of Kindle, before Amazon caught on and cracked down on the practice. But this is far, far worse. Magazines have already had to close submissions after being inundated with AI-generated stories.
I submit that Amazon KDP and other platforms are already receiving AI-generated books and will soon be inundated with them. Worse will be the people who actively and deliberately violate copyright by feeding books they didn’t write into an AI program, hitting the “regenerate” button, and voila, they have their new book. This is happening and will increase exponentially. No matter how bad these books are, the flood will hurt the earnings of authors who produce their own work by diluting the market.
The same applies to all written material. The WGA is on strike because its members’ very existence is threatened by AI. Book authors have no such power.
It’s the Wild West now.
However, it is key to remember something about AI. It has no soul. That might sound philosophical, but it’s very pertinent. AI is not conscious. It is an LLM: a large language model. It takes input, prompts, and then combs through what it already has in its databanks (almost all of it illegally copied or, worse, willingly uploaded by writers) and, if attached to the internet, through that great repository of knowledge and dreck. If you think about it, even if it steals someone else’s unique story, it is still a copy of that story. An echo. What does that mean?
There’s a term called the “uncanny valley”.
The term was coined by a robotics professor, Masahiro Mori, in 1970 to describe an innate human response to a robot that appears human but isn’t.
In this graph the x-axis represents human likeness and the y-axis our emotional response. As you can see, initially, as a robot becomes more human-like, we do fall for it a bit with a positive response. But there comes a point where the response turns abruptly the other way, and we might even be revolted; at the very least we feel unease.
(Uncanny valley graph courtesy of Wikipedia.)
That dip is the uncanny valley.
I’ve looked into various AI and AI writing software and experienced the same thing. The writing is, overall, generic and bland. Occasionally there are intriguing phrases, even some unusual twists. But overall, something is off. And that’s just for a scene. For an entire book?
Another aspect is garbage in, garbage out. Since the internet is now a vast wasteland of garbage, conspiracy theories, and flat-out bad information, the AI is scraping that in along with valid material. Indeed, it’s ingesting bad writing too. When I asked ChatGPT about myself, it informed me there was a television series based on my Area 51 books. Which was news to me.
Indeed, it constantly comes back with wrong answers, because there are so many wrong answers floating around on the internet.
One caveat, though, is that we also have an overall dumbing down of the population because of all the misinformation on the internet. Flat-earthers can find other flat-earthers and no longer feel like the fringe. Look at the anti-mask, anti-vaccine movement. If one googles “what can go wrong with the vaccine,” you get a ton of answers, none of which, of course, mention the immense benefit vaccines have had in saving lives. And thus people dive down a deep rabbit hole of disinformation and ignorance. So perhaps many people are ready for AI-driven content?
Worse, down the line, everyone can generate their own AI-driven content and no longer need anyone else’s. Why not be the hero in your own story?
Regardless, AI is here. It’s a reality. And we need to face that reality.
From Strebecklaw.com:
“One sticking point that often confuses non-lawyers is the question of what is protected by copyright and what isn’t. According to Section 102(b) of the Copyright Act of 1976, no “idea, procedure, process, system, method of operation, concept, principle, or discovery” is eligible for copyright protection.
Copyright law generally protects the fixation of an idea in a “tangible medium of expression,” not the idea itself, or any processes or principles associated with it. The nuances of this distinction are sometimes difficult to grasp, and the reality of the situation is that the facts of each case have to be looked at individually. The concept works as more of a continuum than a dichotomy.”
So I’m not so sure an AI-generated document violates copyright. Plagiarism may be a more accurate description of what AI does, depending on how much of the AI’s text is pulled word-for-word from other people’s work. (Yes, ideas can also be plagiarized, but not copyrighted.)
And who owns the copyright to an AI-generated document?
Interesting ideas to contemplate.
All good points. Plagiarism is the more appropriate issue. And much harder to find and prove.