No rush to welcome AI into our newsroom

I couldn't help but notice the headline of a Poynter article in a recent "E&P" digital newsletter: "AI is already reshaping newsrooms, AP study finds."

The real shock was in the subhead: "Despite ethical concerns, nearly 70 percent of newsroom staffers recruited for an Associated Press survey say they're using generative AI to create content."

As I'm sure you might guess, I fall into the 30 percent who are not using generative AI. Apparently, others are using it to create social media posts, headlines, newsletters, story drafts and more.

I should admit right now that even if I wanted to use AI to do any of those tasks, I would be in trouble, because I don't really know how to. More importantly, I don't want to use AI to do my job (except maybe for headline writing!).

According to the article, if we are going to stay relevant, we must become familiar with AI. I am going to count writing this column toward that effort. I am now familiar with the fact that the AP released guidelines last summer on how it uses generative AI like ChatGPT.

"The internal guidelines, which stress the importance of human editing, warn about the myriad pitfalls of generative AI: its tendency to 'hallucinate' and produce misinformation, the ease at which bad actors can produce disinformation and privacy issues concerning what users put into ChatGPT," an August 2023 article by Poynter's Alex Mahadevan states.

This does not encourage me.

Amanda Barrett, AP vice president for standards and inclusion, cited "accuracy, fairness and speed" as the company's guiding values.

"(W)e believe the mindful use of artificial intelligence can serve these values and over time improve how we work," she stated in the release.

She had more to say when interviewed by Mahadevan.

"I want to emphasize that this is a tool we can use, but does not replace the journalistic smarts, experience, expertise and ability to do our jobs in a way that connects with audiences," she said.

Despite her attempts to reassure, I'm worried.

Her use of the phrase "does not replace" concerns me, in part because technology has in fact replaced real people in jobs since the start of the Industrial Revolution. Another red flag for me is her reference to speed. Who or what will take longer to generate copy, do you think - a real person (who gets distracted and has to go to the bathroom and wants a lunch break and might spend extra time searching for the perfect word) or a computer program?

Apparently I am in the minority with my concerns. Only 7 percent of those who responded to the AP survey were worried about AI displacing jobs.

I think it's all well and good to talk about how valuable real people are, with their experience and expertise and everything - but that assumes the news organization has set quality as its top priority.

I used to work in a newsroom that went from family ownership to corporate ownership. Instead of quality, the bottom line became the driving factor in how we operated. And what's the most expensive part of a newsroom? The reporters and editors. At one point we had no Hinsdale reporter, no District 181 reporter and no District 86 reporter, as we were holding the positions open to "manage our expenses."

I could go on and on with other examples, but I won't. Jim Slonoff and I left that paper and started this one, where being the fastest and operating the cheapest are not our primary concerns.

So I'm in no rush to figure out how to use AI in our newsroom - even if it would help me write sexier headlines.

- Pamela Lannom is editor of The Hinsdalean. Readers can email her at [email protected].
