My day job, as many of you know, is writing web content. Basically just writing informational articles for a company to put on their blog. It's a great job because I can do it at home, I can do it part time, and the money is pretty good for the time I spend on it. And, of course, I've been blogging regularly for 18 years, so you could say I have some experience.
But I realized recently that a lot of people don't understand what the job of writing web content is, or why it's a bad thing for this job to be automated.
Web content started to be popular when companies realized their websites weren't ranked as highly in search results unless they had recent updates on them. It made sense to run a small blog. At first, this was just something anybody at the company did, perhaps mentioning recent updates to their services. But more and more, they realized it could be helpful to hire someone to produce quality content and post it often. People searching for the answer to a specific question, like, "Why is my toilet overflowing?" would land on the page of Gold Star Plumbing, and as often as not, they'd go, "Huh. These guys seem to know what they're talking about. I'll give them a call."
As search engines developed more and more methods of finding good results, content writers had to keep up. This process is called search engine optimization (SEO) and involves stuff like including keywords, subheadings, and pictures to convince Google and its competitors that your page is high quality and has the answers people are looking for.
This led to a golden moment in the history of the internet. Companies with no real interest in educating the public for free still created lots of informative content for you to find and read, just on the off chance you might need their services at some point. And they employed armies of writers at decent wages to provide this service. Any big company tends to run at a certain surplus, and some of that got rolled into providing a real good for humanity. I love it when that happens.
So, my job consists of four things:
- researching a topic so that I can provide accurate facts for the readers
- putting the information in my own words--plagiarism is theft
- explaining it clearly, in a way that's easy to read
- optimizing the content so search engines can find it (and, by extension, my company)
To see what I mean, here's a sample of AI-generated content on the question of whether AI can write a novel:

> We are living in a world where artificial intelligence (AI) is taking over jobs that were once reserved for humans. The question of whether an AI can write a novel has been asked many times. And the answer is yes, an AI can write a novel.
>
> The question of whether or not artificial intelligence can write a novel is a difficult one to answer. It’s not just about the ability to generate text, but also about the ability to generate compelling and engaging plots.
>
> AI-novel writing software is still in its infancy stage and it is hard for them to create engaging plots, which is what makes writing novels so challenging. However, there are some use cases where AI writing software has been used successfully such as generating content for an online course.
>
> Some people might think that the quality of the novel would be lower than if it was written by a human, but this isn’t always true. There are many examples of novels written by AI that have received praise from readers. One such example is “The Day A Computer Writes A Novel” by Patrick Malone and David Levy, which was published in 1973.
It starts out okay. Not the world's best writing, a little cliché and redundant, but we're used to that already. (This is because a lot of web content already is of very poor quality, produced for low pay and in a hurry. The problem, you see, starts before we introduce the machine--a pattern we can see with industrialization everywhere.)
But then you start to notice contradictions. First the answer is yes, then it's difficult to answer, then it's still in its infancy, and then suddenly there's the wild claim that AI has written lots of novels that readers loved, even back in the seventies!
I can tell this is wrong, because a) I know that AI can't yet write novels, and b) this wasn't edited even enough to look convincing. But at times, some of this content may reach the level of being believable, and then it becomes truly dangerous.
Because AI can't think. It can't consider the truth or falsity of anything. You could probably train it to seek out legitimate sources, but it still wouldn't be able to tell what's fact, what's fiction, and what's speculation. It could go looking for sources about what to do about headaches, find a passage about Paleolithic treatment for headaches, and recommend drilling a hole in your skull. Because hey, the encyclopedia said that's a thing!
Ah, you might say, but a human could oversee it. That's true. Very often we think of machines doing things, when really, machines are tools allowing humans to do more things in the same amount of time. Like the self-checkout. It's not checking out my groceries for me. It's allowing one cashier to supervise the checkout for 10 people at once. This cashier is way more harried, doesn't have time to trade pleasantries with me, and I'm doing most of the work myself and having a terrible time, but--efficiency is improved, which is what capitalism cares about.
So, okay, the content writer has been fired and the boss is using Jasper. But the results are pretty wonky, so the boss calls the writer back up. "I don't need to hire you as a writer at 10 cents a word," says the boss. "But it says here in your ad that you'll edit for 2 cents a word." "Okay, sure," says the freelancer. It takes her two hours to write an article, but only ten minutes or so to edit one, so she figures 2 cents a word is fair.
Unfortunately, the job is way more than editing. She's having to fact-check every claim, and she ends up rewriting most of it. The job takes an hour and a half, and she only makes $20, dropping her effective hourly rate from $50 to about $13.
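The math behind those figures can be sketched in a quick back-of-the-envelope calculation. (The 1,000-word article length is my assumption; it's what the $20 fee at 2 cents a word implies.)

```python
# Illustrative numbers only, inferred from the rates in the anecdote above.
WORDS = 1000            # assumed article length (implied by $20 at $0.02/word)
WRITE_RATE = 0.10       # dollars per word for writing from scratch
EDIT_RATE = 0.02        # dollars per word for "editing" the AI draft

write_hours = 2.0       # time to research and write an article herself
edit_hours = 1.5        # actual time spent fixing the AI draft

write_pay = WORDS * WRITE_RATE   # $100 for writing
edit_pay = WORDS * EDIT_RATE     # $20 for "editing"

print(f"Writing: ${write_pay:.0f} -> ${write_pay / write_hours:.2f}/hr")
print(f"Editing: ${edit_pay:.0f} -> ${edit_pay / edit_hours:.2f}/hr")
# Writing: $100 -> $50.00/hr
# Editing: $20 -> $13.33/hr
```

The per-word rate dropped by a factor of five, but the time only dropped by a quarter, so the "editing" gig pays roughly a quarter of her old hourly rate.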
So she complains to the boss. "This work is a lot harder than editing a human," she says. "It basically only did a quarter of the job, the SEO stuff. I want 8 cents a word at least."
The boss looks at the text Jasper wrote, and then at the text the writer edited. Sure, the latter is clearer, more accurate, and the only one a human would ever want to read. But then he thinks, "It's not my company's job to inform the public for free. I really only want to boost my site in the search results." He calls back the editor and fires her. Every day, a thousand more words of misinformation go up on the internet, and it doesn't hurt his bottom line much at all.
Already search engines aren't as good as they used to be at finding good material. Remember when you used to search for something and get sites run by hobbyists in the field, collecting every single fact available on the topic? Now all the top ten results are the exact same six facts, in the same obnoxious contentese ("Most of us have wondered about this question. To find out the answer, read on!"), covered in so many animated ads it's hard to find the text. AI will only accelerate the decline. And it will become even less possible to hold anybody accountable for plagiarism and misinformation, because nobody actually wrote the text at all. You can hardly blame the boss for something he didn't write.
Now I can tell you this is a bad thing that will take jobs from people and make the internet worse. But that won't stop it from happening. I'm not sure, at this point, that anything will. An informative internet was always a bit of a lucky break. We all benefit from it, but nobody makes money creating it except on the edges here and there.
I hope that many businesses will realize this stuff hurts their reputation. My company, at any rate, is counting on attracting human eyeballs and educating real people so that they're in a good financial position to eventually hire its services. Nobody shares an AI-written article with their friends except as a novelty. You can't go viral with that stuff.
I also hope search engines evolve a bit to reward good sites, whether it's by measuring which sites people click on, allowing us to report bad content, or learning to detect the markers of AI. They, at least, have everything to gain here, because their service is not usable unless they do. However, I worry that their real customers are the businesses they point you to, and they will be more worried about upsetting those businesses than about serving you machine-produced word salad.
We will see what's to come. I find myself depressed that we finally got artificial intelligence, and instead of a kindly robot who can teach me the meaning of human existence, I got jumbles of words and surreal pictures with distorted hands. This was not the future science fiction promised me.