PHILADELPHIA — The campaign team behind Philadelphia’s embattled sheriff acknowledged Monday that a series of positive “news” stories posted to its website were generated by ChatGPT.
Sheriff Rochelle Bilal’s campaign removed more than 30 stories created by a consultant using the generative AI chatbot. The move came after a Philadelphia Inquirer story Monday reported that local news media could not find the stories in their archives.
Experts say this type of disinformation can erode public trust and threaten democracy. Bilal’s campaign said the stories were based on real events.
“Our campaign provided the talking points from third-party advisors which were then provided to the AI service,” the campaign said in a statement. “It is now clear that the artificial intelligence agency generated fake news articles in support of the initiatives that were part of the AI prompt.”
Large language models like OpenAI’s ChatGPT work by repeatedly predicting the most plausible next word in a sentence. That makes them good at completing challenging tasks in seconds, but it also causes them to frequently make mistakes known as hallucinations.
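As a rough sketch of that idea, consider a toy example with made-up word probabilities (this is an illustration only, not how OpenAI’s models are actually built or trained):

```python
# Toy illustration only: a tiny invented "language model" that, like far larger
# systems, generates text by repeatedly picking the most probable next word.
# The word table and probabilities below are made up for demonstration.
bigram_probs = {
    "the":     {"sheriff": 0.6, "campaign": 0.4},
    "sheriff": {"said": 0.7, "announced": 0.3},
    "said":    {"the": 0.5, "that": 0.5},
}

def generate(start_word, max_words=5):
    """Greedily append the most plausible next word, one step at a time."""
    words = [start_word]
    for _ in range(max_words):
        options = bigram_probs.get(words[-1])
        if not options:
            break
        # choose the single most probable continuation; real chatbots sample
        # from learned probabilities over tens of thousands of possible tokens
        words.append(max(options, key=options.get))
    return " ".join(words)

print(generate("the"))  # -> "the sheriff said the sheriff said"
```

Because each step asks only which word is most plausible next, nothing in the process checks whether the resulting sentence is true, which is how fabricated details can slip in.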
Many Americans have started using these tools to write business emails, website copy, and other documents faster. But that can lead to problems if they don’t prioritize accuracy or carefully fact-check the material.
For example, two lawyers had to apologize to a Manhattan federal court judge last year after they used ChatGPT to search for legal precedents and didn’t immediately notice that the system had made something up.
Mike Nellis, founder of the AI campaign tool Quiller, called the campaign consultant’s use of AI “completely irresponsible.”
“It’s unethical,” he said. “It’s straight up lying.”
But he said OpenAI is responsible for enforcing its policy, which currently does not allow politicians to use ChatGPT for campaigns. OpenAI did not immediately respond to a request for comment.
Nellis said local, state and federal regulation of AI tools in politics is also needed as the technology advances. Although bipartisan discussions in Congress have emphasized the need for such legislation, no federal law has yet been passed.
The Bilal story list, which the site called its “Record of Accomplishments,” ended with a disclaimer, which the Inquirer called new, saying that “no representation or warranty of any kind” is made as to the accuracy of the information.
Some, including a fired whistleblower in Bilal’s office, fear such misinformation could confuse voters and contribute to continued mistrust and threats to democracy.
“I’m very concerned about that,” said Brett Mandel, who briefly served as her chief financial officer in 2020 and spoke before the campaign issued the statement.
“I think at the local and national level we have seen not only a disregard for the truth and the institutions that we thought were the gatekeepers to the truth,” he said, “but I think we have lost all of our reliance on them, and trust in that area has eroded.”
Mandel has filed one of several whistleblower lawsuits against the office. He claimed he was fired because he had become concerned about the office finances. Bilal has been criticized during her tenure over office expenses, campaign finance reports, the reported loss of hundreds of weapons and other issues.
The list of news stories, including alleged publication dates, attributed four stories to the Inquirer, none of which are in the newspaper’s archives, spokesman Evan Benn said. The others were attributed to three local stations: WHYY, WCAU and KYW.
___
Swenson reported from New York.
___
The Associated Press receives support from several private foundations to improve its explanatory reporting on elections and democracy. See more about AP’s democracy initiative here. The AP is solely responsible for all content.