California is testing new generative AI tools. Here’s what to know

SACRAMENTO, California — The California state government will soon begin using generative artificial intelligence tools.

Democratic Gov. Gavin Newsom’s administration announced Thursday that the state will partner with five companies to develop and test generative AI tools that can improve public services.

California is one of the first states to roll out guidelines on when and how government agencies can purchase AI tools, as lawmakers across the country grapple with how to regulate the emerging technology.

Here’s a closer look at the details:

Generative AI is a branch of artificial intelligence that can create new content on demand, such as text, audio, and images. It’s the technology behind ChatGPT, the controversial chatbot launched by Microsoft-backed OpenAI. San Francisco-based Anthropic, backed by Google and Amazon, is also a major player in generative AI.

California is considering using this type of technology to reduce customer wait times at government agencies and to ease traffic congestion and improve road safety, among other things.

Initially, four state departments will test generative AI tools: the Department of Tax and Fee Administration, the California Department of Transportation, the Department of Public Health, and the Health and Human Services Department.

The tax and fee agency administers more than 40 programs and took more than 660,000 calls from businesses last year, director Nick Maduros said. The state hopes to use AI to listen to those calls and retrieve key information about state tax codes in real time, allowing employees to answer questions more quickly because they don’t have to look up the information themselves.

In another example, the state wants to use the technology to provide people with information about health and social services in languages other than English.

The public does not yet have access to these tools, but may in the future. The state will begin a six-month trial period during which the tools will be tested internally by state workers. In the tax example, the state plans to have the technology analyze recordings of calls from businesses and evaluate how the AI handles them afterward, rather than running it in real time, Maduros said.

Not all of the tools are designed to interact with the public, however. For example, the tools meant to help ease highway congestion and improve road safety would be used only by government officials to analyze traffic data and brainstorm possible solutions.

State workers will test the tools and evaluate their effectiveness and risks. If the tests go well, the state will consider using the technology more broadly.

The final costs are unclear. For now, the state will pay each of the five companies one dollar to start a six-month internal trial. The state can then assess whether new contracts need to be signed for long-term use of the tools.

“If it turns out it doesn’t serve the public better, we’ll lose a dollar,” Maduros said. “And I think this is a pretty good deal for the people of California.”

The state currently faces a huge budget deficit, which could make it harder for Newsom to argue that such technology is worth deploying.

Administration officials said they had no estimate of what such tools would ultimately cost the state, and they did not immediately release copies of the agreements with the five companies that will test the technology on a trial basis. Those companies are Deloitte Consulting LLP; INRIX Inc.; Accenture LLP; Ignyte Group LLC; and SymSoft Solutions LLC.

The fast-growing technology has also raised concerns about job losses, misinformation, privacy and automation bias.

State officials and academic experts say generative AI has significant potential to help government agencies become more efficient, but there is also an urgent need for safeguards and oversight.

Testing the tools on a limited basis is one way to limit potential risks, said Meredith Lee, chief technical advisor for UC Berkeley’s College of Computing, Data Science, and Society.

But, she added, testing cannot stop after six months. The state must have a consistent process for testing and learning about the potential risks of the tools if it decides to deploy them more widely.