Wednesday, December 14, 2022

A web content writer's take on AI

My day job, as many of you know, is writing web content: informational articles for a company to put on their blog. It's a great job because I can do it at home, I can do it part time, and the money is pretty good for the time I spend on it. And, of course, I've been blogging regularly for 18 years, so you could say I have some experience.

But I realized recently that a lot of people don't understand what the job of writing web content is, or why it's a bad thing for this job to be automated.

Web content started to become popular when companies realized their websites wouldn't rank as highly in search results unless they were updated regularly. It made sense to run a small blog. At first, this was just something anybody at the company did, perhaps mentioning recent updates to their services. But more and more, companies realized it could be helpful to hire someone to produce quality content and post it often. People searching for the answer to a specific question, like, "Why is my toilet overflowing?" would land on the page of Gold Star Plumbing, and as often as not, they'd go, "Huh. These guys seem to know what they're talking about. I'll give them a call."

As search engines developed more and more methods of finding good results, content writers had to keep up. This process is called search engine optimization (SEO) and involves stuff like including keywords, subheadings, and pictures to convince Google and its competitors that your page is high quality and has the answers people are looking for.
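
For the curious, here's a toy sketch in Python of the kind of pre-publish checklist this optimization boils down to. To be clear, the specific checks and the word-count threshold are my own invention for illustration; no search engine publishes its actual rules.

    # A toy pre-publish SEO checklist. Illustrative only: real search engines
    # use far more (and far more opaque) signals than these simple checks.
    def seo_checklist(title, body, keyword, subheadings, image_alt_texts):
        """Run a few yes/no checks a content writer might eyeball before posting."""
        keyword = keyword.lower()
        return {
            "keyword in title": keyword in title.lower(),
            "keyword in body": keyword in body.lower(),
            "has at least two subheadings": len(subheadings) >= 2,
            "every image has alt text": all(alt.strip() for alt in image_alt_texts),
            "long enough to compete": len(body.split()) >= 600,  # made-up threshold
        }

    checks = seo_checklist(
        title="What to Do About an Overflowing Toilet",
        body="An overflowing toilet usually means a clog or a faulty float...",
        keyword="overflowing toilet",
        subheadings=["Check the flapper", "Clear the clog", "When to call a plumber"],
        image_alt_texts=["Plumber adjusting a toilet's float arm"],
    )
    for check, passed in checks.items():
        print(f"{check}: {'yes' if passed else 'no'}")

The real skill, of course, is in everything a checklist like this can't capture--accuracy and clarity, which is where the rest of this post is going.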

This led to a golden moment in the history of the internet. Companies with no real interest in educating the public for free still created lots of informative content for you to find and read, just on the off chance you might need their services at some point. And they employed armies of writers at decent wages to provide this service. Any big company tends to run at a certain surplus, and some of that got rolled into providing a real good for humanity. I love it when that happens.

So, my job consists of four things:

  • researching a topic so that I can provide accurate facts for the readers
  • putting the information in my own words--plagiarism is theft
  • explaining it clearly, in a way that's easy to read
  • optimizing the content so search engines can find it (and, by extension, my company)

If I failed at any of these, I wouldn't be doing a very good job. My main audience is robots--the little crawler bots that index things for search engines. And I feed them what they want (keywords). But I also do have real people reading my work, and hopefully they learn something from it. My boss particularly wants to provide financial education for free for those who can't afford to use our actual product (appointments with financial advisors). So all this time I've been working for them, I've built a knowledge base you can go into and learn about budgeting, investing, picking a good health insurance plan, that kind of thing. All in terms I hoped were easy to follow and understand.

Enter artificial intelligence. A bit of a misnomer: hardly anyone claims current AI is actually intelligent. It's a remixing machine. It takes in text that we feed it and produces more in the same style. It's a little mysterious because it's programmed to be self-teaching. It's not, strictly speaking, taught to use good grammar, for instance--it mimics us. That leads some people to think it must understand what it's doing. After all, humans learn to speak the same way and we understand what we're doing. But, as far as I can determine, there is no sign it understands any of what it says, can tell fact from fiction, or wants anything.

Of the jobs I do, AI is amazing at job number four, search engine optimization. And why wouldn't it be? The audience is robots; it's a robot. They are definitely speaking each other's language. Already, I've noticed AI-written articles leaping to the top of search results, especially if you're searching for something very basic or AI-adjacent.

So imagine the owner of a company. Currently, the company website publishes an article every weekday, at a cost of $100 per article from a freelancer, for a total of $26,000 a year. But now they can buy a content-creating AI engine like Jasper, which will cost them about $1,000 annually for the mid-tier package. The savings are so dramatic you can hardly say no to a deal like that.
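
If you want to check my math (these are just the round numbers from my example, not anyone's actual budget), it's simple enough to write out:

    # Back-of-the-envelope math for the example above (round numbers, not real quotes).
    articles_per_week = 5      # one article every weekday
    weeks_per_year = 52
    cost_per_article = 100     # the freelancer's rate in this example

    freelancer_cost = articles_per_week * weeks_per_year * cost_per_article
    ai_subscription = 1000     # rough annual price of a mid-tier AI writing package

    print(f"Freelancer: ${freelancer_cost:,} per year")                   # $26,000
    print(f"AI tool:    ${ai_subscription:,} per year")                   # $1,000
    print(f"Savings:    ${freelancer_cost - ai_subscription:,} per year") # $25,000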

Now, the thing about Jasper is that it cannot tell fact from fiction. It can't even keep its facts straight over the course of a single article. It can, with a bit of trial and error, produce something readable, something that looks like real web content. But the gold kernel in the heart of it, the truth, is not there at all.

Here's an example I found while trying to answer a different question.

We are living in a world where artificial intelligence (AI) is taking over jobs that were once reserved for humans. The question of whether an AI can write a novel has been asked many times. And the answer is yes, an AI can write a novel. 

The question of whether or not artificial intelligence can write a novel is a difficult one to answer. It’s not just about the ability to generate text, but also about the ability to generate compelling and engaging plots. 

AI-novel writing software is still in its infancy stage and it is hard for them to create engaging plots, which is what makes writing novels so challenging. However, there are some use cases where AI writing software has been used successfully such as generating content for an online course. 

Some people might think that the quality of the novel would be lower than if it was written by a human, but this isn’t always true. There are many examples of novels written by AI that have received praise from readers. One such example is “The Day A Computer Writes A Novel” by Patrick Malone and David Levy, which was published in 1973.

It starts out okay. Not the world's best writing, a little cliché and redundant, but we're used to that already. (This is because a lot of web content already is of very poor quality, produced for low pay and in a hurry. The problem, you see, starts before we introduce the machine--a pattern we can see with industrialization everywhere.)

But then you start to notice contradictions. First the answer is yes, then it's difficult to answer, then it's still in its infancy, and then suddenly there's the wild claim that AI has written lots of novels that readers loved, even back in the seventies!

I can tell this is wrong, because a) I know that AI can't yet write novels, and b) this wasn't edited even enough to look convincing. But at times, some of this content may reach the level of being believable, and then it becomes truly dangerous.

Because AI can't think. It can't consider the truth or falsity of anything. You could probably train it to seek out legitimate sources, but it still wouldn't be able to tell what's fact, what's fiction, and what's speculation. It could go looking for sources about what to do about headaches, find a passage about Paleolithic treatment for headaches, and recommend drilling a hole in your skull. Because hey, the encyclopedia said that's a thing!

Ah, you might say, but a human could oversee it. That's true. Very often we think of machines doing things, when really, machines are tools allowing humans to do more things in the same amount of time. Like the self checkout. It's not checking out my groceries for me. It's allowing one cashier to supervise the checkout for 10 people at once. This cashier is way more harried, doesn't have time to trade pleasantries with me, and I'm doing most of the work myself and having a terrible time, but--efficiency is improved, which is what capitalism cares about.

So, okay, the content writer has been fired and the boss is using Jasper. But the results are pretty wonky, so the boss calls the writer back up. "I don't need to hire you as a writer at 10 cents a word," says the boss. "But it says here in your ad that you'll edit for 2 cents a word." Okay, sure, says the freelancer. It takes her two hours to write an article, but only ten minutes or so to edit one, so she figures 2 cents a word is fair. 

Unfortunately, the job is way more than editing. She has to fact-check every claim and ends up rewriting most of it. It takes her an hour and a half, and she only makes $20, dropping her hourly rate from $50 to about $13.
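
Here's that math written out, assuming a thousand-word article (the $20 at two cents a word implies roughly that length):

    # The freelancer's rates, using the figures from the story above.
    words = 1000                       # implied by $20 at two cents a word

    # Writing from scratch: 10 cents a word, about two hours per article.
    writing_pay = words * 0.10         # $100 per article
    writing_rate = writing_pay / 2.0   # $50 per hour

    # "Editing" the AI draft: 2 cents a word, but it takes an hour and a half.
    editing_pay = words * 0.02         # $20 per article
    editing_rate = editing_pay / 1.5   # about $13 per hour

    print(f"Writing: ${writing_pay:.0f} per article, ${writing_rate:.0f} per hour")
    print(f"Editing: ${editing_pay:.0f} per article, ${editing_rate:.0f} per hour")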

So she complains to the boss. "This work is a lot harder than editing a human," she says. "It basically only did a quarter of the job, the SEO stuff. I want 8 cents a word at least."

The boss looks at the text Jasper wrote, and then at the text the writer edited. Sure, the latter is clearer, more accurate, and the only one a human would ever want to read. But then he thinks, "It's not my company's job to inform the public for free. I really only want to boost my site in the search results." He calls back the editor and fires her. Every day, a thousand more words of misinformation go up on the internet, and it doesn't hurt his bottom line much at all.

Already search engines aren't as good as they used to be at finding good material. Remember when you used to search for something and get sites run by hobbyists in the field, collecting every single fact available on the topic? Now all the top ten results are the exact same six facts, in the same obnoxious contentese ("Most of us have wondered about this question. To find out the answer, read on!"), covered in so many animated ads it's hard to find the text. AI will only accelerate the decline. And it will become even less possible to hold anybody accountable for plagiarism and misinformation, because nobody actually wrote the text at all. You can hardly blame the boss for something he didn't write.

Now I can tell you this is a bad thing that will take jobs from people and make the internet worse. But that won't stop it from happening. I'm not sure, at this point, that anything will. An informative internet was always a bit of a lucky break. We all benefit from it, but nobody makes money creating it except on the edges here and there.

I hope that many businesses will realize this stuff hurts their reputation. Mine, at any rate, is counting on attracting human eyeballs and educating real people so they are in a good financial position to eventually hire its services. Nobody shares an AI-written article with their friends except as a novelty. You can't go viral with that stuff.

I also hope search engines evolve a bit to reward good sites, whether it's by measuring which sites people click on, allowing us to report bad content, or learning to detect the markers of AI. They, at least, have everything to gain here, because their service is not usable unless they do. However, I worry that their real customers are the businesses they point you to, and that they will be more worried about upsetting those businesses than about serving you machine-produced word salad.

We will see what's to come. I find myself depressed that we finally got artificial intelligence, and instead of a kindly robot who can teach me the meaning of human existence, I got jumbles of words and surreal pictures with distorted hands. This was not the future science fiction promised me.
