Wyoming reporter caught using artificial intelligence to create fake quotes and stories

HELENA, Mont. — Quotes from the governor of Wyoming and a local prosecutor were the first things that seemed a little strange to Powell Tribune reporter C.J. Baker. Then it was a few sentences in the stories that struck him as almost robotic.

It became evident that a reporter for a competing news outlet was using generative artificial intelligence (AI) to write his stories when a June 26 article named comedian Larry the Cable Guy as grand marshal of the Cody Stampede Parade.

“The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy’s most beloved characters,” the Cody Enterprise reported. “This structure ensures that the most critical information is presented first, making it easier for readers to quickly understand the key points.”

After some digging, Baker, who has been a reporter for more than 15 years, met Aaron Pelczar, a 40-year-old newcomer to journalism who, Baker says, used AI in his stories before leaving the Enterprise.

The publisher and editor of the Enterprise, which was co-founded in 1899 by Buffalo Bill Cody, have since apologized and promised to take steps to ensure this never happens again. In an editorial published Monday, Enterprise editor Chris Bacon said he “didn’t notice” the AI copy and the false quotes.

“It doesn’t matter that the false quotes were the apparent mistake of a harried, novice reporter who trusted AI. It was my job,” Bacon wrote, apologizing for “letting AI insert words that were never spoken into stories.”

Journalists have had their careers derailed for making up quotes or facts in stories long before AI came into existence. But this latest scandal illustrates the potential pitfalls and dangers that AI poses to many industries, including journalism, as chatbots can produce inaccurate but somewhat credible articles with just a few prompts.

AI has found a role in journalism, including in the automation of certain tasks. Some newsrooms, including The Associated Press, are using AI to free up reporters for more impactful work, but most AP staffers are not allowed to use generative AI to create publishable content.

The AP has used technology to help with financial earnings stories since 2014, and more recently for some sports stories. It also is experimenting with an AI tool to translate some stories from English to Spanish. At the end of each such story is a note explaining the role of technology in its production.

It is important to be open about how and when AI is used. Sports Illustrated was criticized last year for publishing AI-generated online product reviews that were presented as written by reporters who did not actually exist. After the story broke, SI said it would fire the company that produced the articles for its website, but the incident damaged the reputation of the once-powerful publication.

In his Powell Tribune article breaking the news about Pelczar’s use of AI in articles, Baker wrote that he had an awkward but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, “Of course, I never intentionally tried to misquote anyone” and promised to “correct them and apologize and say they were incorrect statements,” Baker wrote, noting that Pelczar insisted that his mistakes should not be blamed on his Cody Enterprise editors.

After the meeting, the Enterprise launched a full review of all the stories Pelczar had written for the paper in the two months he worked there. They found seven stories that contained AI-generated quotes from six people, Bacon said Tuesday. He is still reviewing other stories.

“They are very credible quotes,” said Bacon, who noted that people he spoke to during his review of Pelczar’s articles said the quotes sounded like something they would say, but that they had never actually spoken to Pelczar.

Baker reported that seven people told him they had been quoted in Pelczar’s stories, but that they had never spoken to him.

Pelczar did not respond to a telephone message from the AP asking to discuss what happened. Bacon said Pelczar declined to discuss the matter with another Wyoming newspaper that reached out.

Baker, who reads the Enterprise regularly because it is a competitor, told the AP that a combination of sentences and quotes in Pelczar’s stories raised his suspicions.

Pelczar’s story about a shooting in Yellowstone National Park included the following sentence: “This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene environments.”

Baker said the sentence sounded like the summaries of his stories that a certain chatbot seems to generate, in that there is a sort of “life lesson” added at the end.

Another story — about a poaching conviction — included quotes from a conservationist and a prosecutor that sounded like they came from a press release, Baker said. But there was no press release, and the agencies involved didn’t know where the quotes came from, he said.

Two of the stories in question contained fake quotes from Wyoming Gov. Mark Gordon, which his staff discovered only when Baker called them.

“In one instance, (Pelczar) wrote a story about a new OSHA rule that included a quote from the governor that was completely fabricated,” Michael Pearlman, a spokesman for the governor, said in an email. “In a second instance, he appeared to fabricate part of a quote and then combined it with part of a quote that was included in a press release announcing the new director of our Wyoming Game and Fish Department.”

The most obvious AI-generated text was the story about Larry the Cable Guy, which ended with an explanation of the inverted pyramid, the basic approach to writing a breaking news story.

Creating AI stories isn’t hard. Users can feed a criminal affidavit into an AI program and ask it to write an article about the case, including quotes from local officials, said Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, the leading journalism think tank.

“These generative AI chatbots are programmed to give you an answer, regardless of whether that answer is complete nonsense or not,” Mahadevan said.

Megan Barton, publisher of Cody Enterprise, wrote an editorial describing AI as “the new, advanced form of plagiarism, and in the realm of media and writing, plagiarism is something that every media outlet has had to correct at one point or another. It’s the ugly part of the job. But a company that’s willing to correct these mistakes (or literally write them down) is a reputable company.”

Barton wrote that the newspaper has learned its lesson, has a system in place to recognize AI-generated stories and will “have longer conversations about how AI-generated stories are not acceptable.”

The Enterprise had no AI policy, partly because it seemed obvious that journalists shouldn’t use it to write stories, Bacon said. Poynter has a template allowing news organizations to develop their own AI policies.

Bacon expects to have one in place by the end of the week.

“This will be a topic of discussion before hiring,” he said.