SAN FRANCISCO, CA – Recent advances in artificial intelligence (AI) have experts deeply worried about the technology's implications for upcoming elections – particularly the 2024 presidential election – as cheap, powerful AI tools can now reproduce pitch-perfect cloned voices and photorealistic video recreations of human beings, which can then be used to spread vast amounts of misinformation across social media.
This fake, digitally created content, experts say, can be produced in seconds at very little cost and then disseminated across social media to deceive voters, targeting specific audiences in an effort to sway election results.
Generative AI, according to A.J. Nash, vice president at cybersecurity firm ZeroFox, has quickly crept up on an unsuspecting public and can be used to rapidly generate campaign emails, texts, and videos – often impersonating candidates – producing election misinformation on a scale never seen before.
“We’re not prepared for this,” Nash said. “To me, the big leap forward is the audio and video capabilities that have emerged. When you can do that on a large scale, and distribute it on social platforms, well, it’s going to have a major impact.”
Specific examples of the harm these new AI tools can cause include automated robocall messages that flawlessly impersonate a political candidate's voice, giving voters false information such as incorrect dates to cast ballots; fabricated clips of a candidate confessing to a crime or making racist statements; and depictions of interviews that never took place.
AI can also be used to create fake news reports or impersonate public figures and celebrities that can also spread falsehoods, warned Oren Etzioni, CEO of the Allen Institute for AI.
“What if Elon Musk personally calls you and tells you to vote for a certain candidate?” he said. “A lot of people would listen. But it’s not him.”
Legislation has been introduced in Congress that, if passed, would require candidates to identify political ads they have created with AI tools and to include watermarks on synthetic images denoting them as such. However, this would not prevent third-party actors and hostile foreign countries from using the technology to sway U.S. elections, already casting a shadow of doubt over the results of the 2024 presidential race and beyond.