Election Workers Are Drowning in Records Requests. AI Chatbots Could Make It Worse

Many US election deniers have spent the past three years inundating local election officials with paperwork and filing thousands of Freedom of Information Act requests in order to surface supposed instances of fraud. “I’ve had election officials telling me that in an office where there’s one or two workers, they literally were satisfying public records requests from 9 to 5 every day, and then it’s 5 o’clock and they would shift to their normal election duties,” says Tammy Patrick, CEO of the National Association of Election Officials. “And that’s untenable.”

In Washington state, elections officials received so many FOIA requests about the state’s voter registration database after the 2020 presidential election that the legislature had to change the law, rerouting those requests to the Secretary of State’s office to relieve the burden on local elections workers.

“Our county auditors came in and testified as to how much time having to respond to public records requests was taking,” says Democratic state senator Patty Kuderer, who cosponsored the legislation. “It can cost a lot of money to process those requests. And some of these smaller counties do not have the manpower to handle them. You could easily overwhelm some of our smaller counties.”

Now, experts and analysts worry that generative AI could let election deniers mass-produce FOIA requests at an even greater rate, burying the election workers legally obligated to reply in paperwork and gumming up the electoral process. In a critical election year, when election workers face increasing threats and systems are more strained than ever, experts who spoke to WIRED shared concerns that governments are unprepared to defend against election deniers, and that generative AI companies lack the guardrails necessary to prevent their systems from being abused by people looking to slow down election workers.

Chatbots like OpenAI’s ChatGPT and Microsoft’s Copilot can easily generate FOIA requests, even down to referencing state-level laws. That could make it easier than ever for people to flood local elections officials with requests, and harder for those officials to ensure elections run smoothly, says Zeve Sanderson, director of New York University’s Center for Social Media and Politics.

“We know that FOIA requests have been used in bad faith previously in a number of different contexts, not just elections, and that [large language models] are really good at doing stuff like writing FOIAs,” says Sanderson. “At times, the point of the records requests themselves seems to have been that they require work to respond to. If someone is working to respond to a records request, they’re not working to do other things like administering an election.”

WIRED was able to easily generate FOIA requests for a number of battleground states, specifically requesting information on voter fraud, using Meta’s Llama 2, OpenAI’s ChatGPT, and Microsoft’s Copilot. In the FOIA request created by Copilot, the generated text asks about voter fraud during the 2020 elections, even though WIRED provided only a generic prompt and didn’t ask for anything related to 2020. The text also included the specific email and mailing addresses to which the request could be sent.

When asked whether the company had put guardrails in place to keep its tools from being abused by election deniers, Caitlin Roulston, director of communications at Microsoft, said the company was “aware of the potential for abuse and [has] detections in place to help prevent bots from scraping our services to create and spread spam.” Roulston did not elaborate on what those measures were, or on why Copilot generated a FOIA request asking about voter fraud in the 2020 elections. Google’s Gemini would not return a FOIA request. OpenAI and Meta did not respond to requests for comment.

With AI-generated content, it can be very difficult to tell what was written by a chatbot and what wasn’t. But under the new law in Washington state, government officials are allowed to deny a “bot request,” defined as a request for “public records that an agency reasonably believes was automatically generated by a computer program or script” and that it believes would disrupt its functions.

“I think it’s safe to say that most state and local governments are underfunded and lack the tools to be able to identify when a request is coming from a real person or if it’s AI generated or otherwise,” says Rebecca Green, codirector of the election law program at William and Mary Law School. “Without those tools, and depending on what the state laws require, local officials are really left hanging in the wind to figure out whether they comply with the law or whether a request has not been generated by a human.”

Last year, several companies, including Microsoft, OpenAI, and Google, voluntarily pledged to develop watermarking systems to designate content that has been created by AI. Watermarking for text-based outputs could, for instance, require a chatbot to use certain words more frequently than would occur by chance, allowing a computer to recognize the text as AI-generated. But many of these systems are still nascent, and watermarking a chatbot’s outputs is only useful if government and local officials have the technology and training necessary on their end to properly scan for and identify the watermark.
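To make the statistical idea concrete, here is a minimal sketch of how a detector for a simple “green-list” text watermark might work. This is an illustrative toy, not any vendor’s actual scheme: it assumes a generator that is biased toward words whose seeded hash lands in a favored half of the vocabulary, and it flags text where that bias shows up far more often than chance allows.

```python
import hashlib
import math

def is_green(prev_word: str, word: str) -> bool:
    # Pseudorandomly assign `word` to the "green list," seeded by the
    # previous word, so roughly half of all word pairs count as green.
    digest = hashlib.sha256(f"{prev_word}|{word}".encode()).digest()
    return digest[0] % 2 == 0

def watermark_z_score(text: str) -> float:
    # z-score of the observed green-word rate against the 50 percent
    # expected by chance; large positive values suggest watermarked text.
    words = text.lower().split()
    if len(words) < 2:
        return 0.0
    hits = sum(is_green(p, w) for p, w in zip(words, words[1:]))
    n = len(words) - 1
    return (hits - 0.5 * n) / math.sqrt(0.25 * n)

# Ordinary human prose should score near 0; text from a generator biased
# toward green words would score well above ~4 over a few hundred words.
print(round(watermark_z_score("please provide all records relating to voter rolls"), 2))
```

Notably, detection in a scheme like this requires only the seeding rule, not the model itself, which is why such proposals put the burden of deployment on whoever receives the text, such as a records office.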

“We don’t have the luxury of time to figure this out if we want safe and secure elections,” Green says.

David Levine, senior elections integrity fellow at the Alliance for Securing Democracy and a former county elections director, says even though bad actors could abuse chatbots, they can still be useful tools to help people with legitimate inquiries navigate the public records process. “People ought to be able to get access to information,” he says. “And you could imagine scenarios where people in good faith are trying to figure out how to get information that allows them to have greater awareness of how elections operate.”

But, Levine says, understanding which requests are in good faith and which aren’t is difficult for local officials, particularly because they often face legally mandated deadlines for responding to records requests.

“I think the position election officials now find themselves in is—I’ll just say it—if Mike Lindell and his affiliated associates want to try and do a functional DDoS-style attack for FOIA, this is something that election officials have to be at least aware of and trying to plan for,” says Levine.

FOIA requests are just one way election deniers go after election workers. In response to former president Donald Trump’s false claims that the 2020 presidential election was rigged, election workers across the country faced a deluge of violent threats and intimidation; local elections officials are still targeted. A law passed earlier this year, also in Washington state, makes it a felony to harass election workers. Kuderer, the Democratic state senator, says that the threats to elections officials spurred state lawmakers to allocate more money for local auditors to “beef up security, if they feel they need to do that.”

The threats and scrutiny have, in some cases, driven election workers to quit. A report released this week by the Bipartisan Policy Center found that turnover among election workers has increased since 2020.

“There’s real concern, at a time when many election officials are leaving the field and being replaced by inexperienced officials, that those folks are just trying to get up to speed as they get closer to election day,” says Levine. Dealing with extensive FOIA requests makes that job even harder. “It’s one thing to respond to a bevy of FOIA requests well in advance of an election. It’s a whole different story to be responding to a flood of FOIA requests on election day or perhaps during early voting.”


Vittoria Elliott
