AI's influence on election 2024

A Senate Rules and Administration Committee hearing titled "AI and The Future of Our Elections" on Capitol Hill, September 27, 2023, in Washington, D.C. The hearing focused on what effect artificial intelligence could have on the 2024 election and future elections in America. (Photo by Drew Angerer/Getty Images)

First there was fake news on social media. Now, there’s AI, and its power to shape American politics.

“It is offering new ways of spreading disinformation, like the audio and video content, especially, but it’s mostly just turbocharging existing efforts and making it a lot cheaper and easier,” Nicole Gill, co-founder and executive director at the watchdog group Accountable Tech, says.

AI has the power to make audio and video of people saying anything their creators want.

As the 2024 presidential election approaches, several states are trying to pass laws to stop the spread of deceitful AI generated political content. But very few have been able to do so.

Today, On Point: AI and its influence on election 2024.

Guest

Darrell West, senior fellow at the Center for Technology Innovation within the governance studies program at the Brookings Institution. Author of “How AI will transform the 2024 elections.”

Nicole Gill, co-founder and executive director at the watchdog group Accountable Tech.

Also Featured

Steve Simon, Secretary of State of Minnesota.

Transcript

Part I

On October 17th, 2023, Michigan State Representative Penelope Tsernoglou gave a presentation on Michigan House Bill 5141. The presentation was sparsely attended. The meeting room was largely empty. And that’s too bad. Because Representative Tsernoglou drew back the curtain on one of the biggest new forces working its way into American politics.

She started with this.

MR. BEAST DEEPFAKE: If you’re watching this video, you’re one of the 10,000 lucky people who will get an iPhone 15 Pro for just $2. I’m Mr. Beast, and I’m doing the world’s largest iPhone 15 giveaway. Click the link below to claim yours now.

CHAKRABARTI: Now if you don’t know who Mr. Beast is, ask your kids. Because he is a YouTube superstar with more than 230 million followers.

So why did Mr. Beast have a cameo in Representative Tsernoglou’s bill presentation? She explained.

REP. TSERNOGLOU: So it turns out Mr. Beast didn’t actually do that video. That’s not him. If you click that link, something bad probably happened. You did not get a free iPhone. So that’s just one of the better examples out there.

CHAKRABARTI: It’s an example of video and audio content created by AI. To be clear, it is fake. Mr. Beast himself disavowed it, even though the voice sounds like his and the words match the lip movements in the video. Mr. Beast took to social media and posted the following, quote, “Lots of people are getting this deepfake scam ad of me.

The future is going to continue to get weirder. Are social media platforms ready to handle the rise of AI deepfakes? This is a serious problem.”

So what’s the connection between a deepfake video of Mr. Beast and politics? Representative Tsernoglou then played this.

JOE BIDEN DEEPFAKE: Hi, Representative Tsernoglou, it’s your buddy Joe. I really like your bill that requires disclaimers on political ads that use artificial intelligence. No more malarkey. As my dad used to say, ‘Joey, you can’t believe everything you hear.’ Not a joke. Anyway, thank you and your committee for your leadership in the drive for more democratic elections, and give your daughter a hug for me. By the way, this statement was created using artificial intelligence.

CHAKRABARTI: Okay, be honest. Look at your radio, or your smartphone, or however you’re listening right now, and be honest. Tell me, until the very end there, did you think that was the real President Joe Biden? The tempo’s right. So is the tone. Even the papery edge to the voice.

Also, there are the Biden-esque idiosyncrasies out there. The story about his dad, and calling him Joey, and heck, even malarkey got a shout out in that AI generated content. But as Representative Tsernoglou emphasized, that was not, I repeat, not Joe Biden. It is completely AI generated, never uttered by the actual President of the United States.

TSERNOGLOU: And this audio took approximately three minutes to make.

CHAKRABARTI: Three minutes to successfully replicate a statement using the president’s voice, but a statement the president never made. Now, Representative Tsernoglou was trying to make clear that fake news has already wreaked havoc on American politics and elections.

TSERNOGLOU: Imagine what easily accessible artificial intelligence can and will do, even in this election year.

AI generated content is currently indistinguishable from real life images and sounds. The threat of AI generated content to influence and deeply impact elections and voters is imminent. Michigan can take the lead in regulating misleading election related content and protecting democracy.

CHAKRABARTI: And that is what Michigan House Bill 5141 is all about. A state level attempt to regulate and identify AI generated content that could impact American elections. Now, the bill passed in November. Michigan is not the only state to do this. But is it enough? Just how much can AI positively and negatively influence our elections?

This is On Point. I’m Meghna Chakrabarti. I really am. I’m not AI just yet. And that’s what we’re talking about today. And we’re going to start with Darrell West. He’s a senior fellow in the Center for Technology Innovation at the Brookings Institution, and he joins us from Washington, D.C. Darrell, welcome to On Point.

DARRELL WEST: Thank you very much. It’s nice to be with you.

CHAKRABARTI: First of all, are we already seeing use of AI, whether positive or negative, in American politics last year and this year?

WEST: We are seeing a lot of use of AI in campaign communications. There have been fake videos and fake audio tapes. The United States is not unique.

We just had presidential elections in Slovakia and Argentina. There was misuse of the technology there. There were fake audio tapes alleging that one of the presidential candidates was corrupt in taking bribes. In Argentina, a conservative opponent tried to turn his adversary into a Marxist, with Marxist clothes, rhetoric and so on.

And so the problem is we’ve reached a point where the technology can create video and audio that sounds completely authentic. Even though it is fake.

CHAKRABARTI: How much has the technology progressed since a couple of years ago? And I ask that because Representative Tsernoglou in her presentation also played a video, a deepfake video, of the actor Morgan Freeman.

The voice was pretty close to accurate. He’s got such a singular, recognizable voice. There was a little mismatch in the lip movement. But that, she noted, was from a couple of years ago. How indistinguishable from reality is it now, Darrell?

WEST: It’s very indistinguishable from reality. And of course, that is part of the problem.

The technology has advanced considerably just in the last six months. It used to be, if you wanted to use sophisticated AI tools, you needed some degree of a technical background. Today, because the new generative AI tools are prompt driven and template driven, anybody can use them. So we have democratized the technology at the very time that American society is highly polarized.

There’s a lot of extreme rhetoric and actions are taking place all across the political landscape. People are upset. And this is like the worst time to put this type of technology in the hands of everyone because people are expecting this to be a close race, people have incentives to do bad things with this technology.

CHAKRABARTI: Let’s play another couple of examples here, and I’m going to reiterate over and over again: This is AI generated audio. Okay, it is AI generated. It was not said by the actual person whose simulated voice you’re about to hear. It’s purportedly Hillary Clinton supporting Florida Governor Ron DeSantis in this AI generated audio misinformation.

So here it is.

HILLARY CLINTON DEEPFAKE: People might be surprised to hear me say this, but I actually like Ron DeSantis. A lot. Yeah, I know. I’d say he’s just the kind of guy this country needs, and I really mean that. If Ron DeSantis got installed as president, I’d be fine with that. The one thing I know about Ron is that when push comes to shove, Ron does what he’s told.

And I can’t think of anything more important than that.

CHAKRABARTI: Now again, that is not Hillary Clinton, it’s an AI generated audio sample spreading misinformation about her purported support of Ron DeSantis. Here’s another example. This is from a pro DeSantis super PAC. They used AI to create an attack ad of Donald Trump’s voice disrespecting Iowa.

Now that was the first state, of course, to go to the caucuses in the 2024 presidential election cycle, which just happened. And again, to be clear, AI generated, and Trump did not say what you’re about to hear.

AD NARRATOR: Governor Kim Reynolds is a conservative champion. She signed the heartbeat bill and stands up for Iowans every day.

So why is Donald Trump attacking her?

DONALD TRUMP DEEPFAKE: I opened up the governor position for Kim Reynolds and when she fell behind, I endorsed her. Did big rallies and she won. Now she wants to remain neutral. I don’t invite her to events.

AD NARRATOR: Trump should fight Democrats, not Republicans. What happened to Donald Trump?

Never Back Down is responsible for the content of this advertising.

CHAKRABARTI: And again, that’s a pro Ron DeSantis super PAC. Darrell, I want to ask you about, we’ll return to how domestic groups might use AI in the election, but recalling 2016, there was so much concern and evidence of foreign use of fake news on social media. Are you also concerned about the same thing regarding AI?

WEST: There almost certainly is going to be misuse of AI by foreign entities.

So you mentioned Russia in 2016. In previous elections we’ve had content saying the Pope had endorsed Donald Trump, which of course never happened. When you think about the number of foreign countries that see this as a high-stakes election, several of them actually have a preferred candidate, oftentimes Donald Trump. Russia certainly would like Trump to win.

Russia’s had difficulty beating Ukraine on the military battlefield, but if they can elect more American politicians who are willing to cut off U.S. military assistance to Ukraine, Putin wins the war. He has a clear stake in this election. China is obviously interested. Iran, North Korea, the Saudis, even Israel.

The stakes of this election for all these foreign countries have gone up dramatically. Many of these countries have very sophisticated technology capabilities, and a number of them have well-developed propaganda operations as well. So we need to make sure that there is transparency about the use of these communications.

We want the 2024 American elections to be decided by Americans and not foreign entities.

CHAKRABARTI: But as we learned from 2016, you can’t necessarily regulate your way around this. We’re going to talk in a bit about the efforts at the state level to tag AI generated misinformation or disinformation, but that has limited effectiveness, doesn’t it, Darrell?

Doesn’t the same problem apply to AI?

WEST: We can’t regulate what Russia does. We can’t regulate what China does. They are outside the American borders. And even if we tried, they of course would not pay any attention to our regulations. So voters need to be aware in this campaign about the risks facing them.

If they start to hear content that seems a little off, if the voice sounds a little tinny, if the video image seems a little shady, it probably is. And so people just need to be on guard. It’s going to be a very difficult next 10 months leading up to our general election. There’s not a whole lot we can do about the bad actors out there who are seeking to misuse these tools in order to influence the election.

This article was originally published on WBUR.org.

Copyright 2024 NPR. To see more, visit https://www.npr.org.
