Commonwealth of Kentucky
Office of the Secretary of State

 

October 8, 2024

 

Michon Lindstrom 
Director of Communications
michon.lindstrom@ky.gov
502-234-4091

FOR IMMEDIATE RELEASE

 

ADAMS DISCUSSES IMPACT OF AI ON ELECTIONS

Adams Testifies Before the Artificial Intelligence Task Force

 

Frankfort, Ky. (October 8, 2024) – Secretary of State Michael Adams provided the following testimony today before the Artificial Intelligence Task Force about the potential impact of AI on elections.

 

“Good morning, members of the Committee.

 

            Thank you for inviting me to testify on the subject of the impact of artificial intelligence on elections.

 

            This reminds me of the story that, in 1972, Secretary of State Henry Kissinger asked Chinese Premier Zhou Enlai what the impact of the French Revolution had been. Zhou is said to have responded, “It’s too early to say.” It’s too early to say what the impact of artificial intelligence on elections will be, and it may be too early for some time. But this much is clear.

 

            AI has the potential for significant impact on elections. We Americans are not the only people voting this year. The European Union, India, Indonesia, Mexico, South Africa, Ukraine, the United Kingdom, and dozens of other countries either have voted or will vote in 2024. As much as I believe in American exceptionalism, we Americans have faced challenges similar to those facing these countries, including extreme polarization and the spread of conspiracy theories. There is no reason to think we will get a pass on AI, either.

 

            AI is a tool, and like any tool, it can be used for good or bad. AI is not inherently benign or malignant. Undoubtedly, advances in desktop publishing, video capturing and uploading, and social media have changed politics over the past two decades. All those changes have helped Americans of all political stripes engage in political activity. All those changes also have helped stoke divisions and proliferate false information about our candidates and our elections. I agree with my fellow secretary of state, Scott Schwab of Kansas, who wrote in a recent article he co-authored about AI in elections:

 

“[I]ts malicious use is poised to test the security of the United States by giving nefarious actors intent on undermining American democracy – including China, Iran, and Russia – the ability to supercharge their tactics. Specifically, generative AI will amplify cybersecurity risks and make it easier, faster, and cheaper to flood the country with fake content. Although the technology won’t introduce fundamentally new risks in the 2024 election – bad actors have used cyberthreats and disinformation for years to try to undermine the American electoral process – it will intensify existing risks.”

 

            Put more succinctly, an official with the Office of the Director of National Intelligence recently said, “The [intelligence community] considers AI a malign influence accelerant, not yet a revolutionary influence tool. In other words, information operations are the threat, and AI is an enabler.”

 

I believe both these officials are right. I also share the widespread view that, to quote one expert in the field (who happens to be a Kentuckian), “The greatest danger of generative AI tools on online platforms is not their capacity to generate absolute belief in fake information[.] It is that they have the capacity to generate overall distrust. If everything can be faked, nothing can be true.”

 

            This expert adds, quote, “Media coverage to date has focused on the use of AI to target political parties or officials, but it is likelier that the most significant target this year will be trust in the electoral process itself.” On that point, let me say, I am aware of and grateful for the work that Co-Chair Senator Bledsoe is already doing to protect candidates for public office, and voters, from AI-generated deepfakes. In this past legislative session, Senator Bledsoe worked in good faith with both parties, as I’ve done, to improve our election laws, and her anti-deepfake bill passed the State Senate unanimously.

           

Should you take up AI legislation when you return in 2025, I would encourage you to consider prohibiting impersonation of election officials. It is illegal to impersonate a peace officer, and for good reason. It should be equally illegal to impersonate a secretary of state or county clerk and put out false information, in any format, about our elections.

You probably saw that, in the New Hampshire presidential primary this year, a political consultant used AI to generate a robocall impersonating President Biden’s voice, urging voters not to vote in the primary but to save their votes for November instead. The Federal Communications Commission issued a $6 million fine, but that fine was for violations of telecommunications laws, such as Caller ID rules. The State of New Hampshire brought criminal charges under its own law that makes impersonation of a candidate a criminal offense – a misdemeanor, actually. As you look to protect candidates and voters from such practices, I urge you to consider including election officials. An impersonation of me, or my deputy secretary, or senior staff of the State Board of Elections, or a county clerk, actually could do more harm than impersonation of a candidate.

 

I believe in a limited role for government, with a proper focus on public protection and education. In addition to protecting Kentuckians from deepfakes, I hope that next session you will take action on my request for mandatory civic education in all Kentucky school districts, so we can train our youngest Kentuckians in the needed skills of citizenship, self-government, and information literacy.

 

Finally, in preparation for this hearing, my staff generated three AI versions of what I might say today. I’m reading the version I wrote, not one a chatbot wrote. Or am I? I’m happy to show you the three other versions and let you be the judge. Thank you very much.”

 

 

# # #