The Case For Editors

In case you missed the news, artificial intelligence is on the loose, writing everything from book reports to scholarly research to social posts to song lyrics, all thanks to ChatGPT, the latest AI-driven text-generation tool.

The bulk of the coverage of ChatGPT focuses on how good the generated text really is. It’s just like human-generated advertising copy in a fraction of the time! It gets passing grades from a really tough professor! It reads like something Nick Cave might have written (and subsequently thrown out)!

Well, fine. That makes it the word version of imitation crab.

You know imitation crab: It’s cheap white fish doctored up to taste vaguely like crab, and while it’s tolerable in a pasta salad or tossed in with some mac ‘n’ cheese, you wouldn’t dream of promising people a crab dinner and then serving the ersatz stuff as the main course.

ChatGPT is like that. It’s fine generating modern marketing copy to be tossed into the churning maw of the social-media machine (low bar there), but you wouldn’t want a whole book of it unless it was heavily edited first.

The fact is that ChatGPT and its kin are fairly helpless unless their inputs are guided appropriately by a human and their outputs are monitored by a human.

In that respect, ChatGPT is just another hack writer parroting back predictable phrases in the latest three-lines-to-a-paragraph style, which just makes me melancholy. I’ve read enough lumpenprose in the last two years to last me until the sun collapses.

AI needs an editor

You know what ChatGPT really needs? An editor.

Editors are an essential link in the creation and distribution of information – and even the best writer can benefit from a second or third set of eyes on their copy.

Red Smith wrote some of the cleanest prose on the planet, but he was never better than when Stanley Woodward gave his stuff the once-over.

Similarly, Ernie Pyle may have been the best newspaper journalist of the first half of the 20th century. His war correspondence was renowned, but his prewar travel writing was pristine – and benefited from having Lee Miller in the slot, touching it up before publication.

James Thurber needed Harold Ross, Thomas Wolfe needed Maxwell Perkins, John Hersey needed William Shawn, everyone needed E.B. White – the list goes on.

In the coming age of AI writing, we’ll need editors more than ever, not only to check output for factual accuracy but to humanize the content, to know when to apply the rules and when to ignore them. Otherwise, it’ll just be empty words cranked out in service of a brand that doesn’t know any better and doesn’t care to improve.

But the need for editors goes beyond that.

AI, editors, and the news game

I think a lot about newspapers and magazines. I feel their loss keenly; I cut my teeth on magazines, and still believe that a well-turned-out magazine is one of the highest forms of expression. It combines great design and great writing in something you can run your fingers over and cast your eyes across.

AI could be a savior for cash-strapped newsrooms. It could combine with social reporting and crowdsourced video to create an online medium that could reasonably stand in for a newspaper.

“Isn’t that Twitter or Facebook?” you might ask, and with reason: In that description, it is Twitter and Facebook. But add an editor, and suddenly it becomes a different creature.

Social-media channels have expressly stated they are not media and resent being characterized as such. They hew to the line that they’re merely a blank canvas for people to scribble on, to state their feelings, promote their rummage sales and celebrate their life events – and yeah, spread all kinds of hatred and misinformation.

They want you to blame the player and not the platform, but that argument falls to tatters at the slightest glance. The fact that Facebook and Twitter and YouTube have content-moderation teams, ineffective and understaffed as they may be, is irrefutable evidence that they’re in the news game. They just don’t want to accept the responsibility.

I believe that the key to an effective next-gen news platform lies in accepting the responsibility. Be the editor, and let it be known that you’re the editor. Say it plainly: we’re not going to publish lies; we have strong feelings about what’s right and wrong; we believe in quality over quantity.

Again, the problem is not a lack of material. Between social and AI we can generate more than enough local news, and create a low-cost, sustainable model that can fund longer-form investigative stuff.

The question – and the challenge – is the willingness to edit, to gatekeep, to filter.

I'm 100 percent fine with ChatGPT and its ilk. I can think of productive ways to use them that don’t throw 10,000 copywriters out of work.

But more importantly, I'm 1,000 percent pro-editor. Editors are the answer to some of our toughest creative problems.

Assuming the mantle

Unfortunately, it’s not as simple as blowing a bugle and summoning the editors. Becoming a good editor is a multi-step process.

First, you have to be a good reader. You have to be able to distinguish between good writing and bad (and now, AI-generated writing).

It helps to be a decent writer yourself, but not so wrapped up in your art that you try to make everyone else sound like you.

Having a broad knowledge base comes in handy when you have to dig out the disinformation.

Some empathy’s a good idea, especially with writers just starting to find their voice.

And finally, you need a thick skin to withstand the slings and arrows coming at you from righteously wronged writers, concerned citizens and outraged subjects of exposés.

That’s an unusual skill set, because editing isn’t quite as simple as changing a few words. If that were the case, heck, even AI could do it.

Nope, this job calls for a flesh-and-blood person, talented, fearless, and ready to learn.

So who’s willing to step up?