TALLAHASSEE - The use of artificial intelligence to generate images, text and voices has the potential to muddy the waters in political campaigns and deepen distrust among voters, according to communications experts.
Generative AI, or generative artificial intelligence, allows users to input prompts that produce content representing virtually anything the user wants. With the 2024 election looming, political media experts are bracing for AI-generated images to start appearing much more frequently in campaign ads.
“Talk about oppositional messaging, it can be created with a snap of your fingers. The prompts return information so fast that we will be inundated with it as the election cycle really starts to heat up,” said Janet Coats, chief executive officer of the University of Florida Consortium on Trust in Media and Technology, in a recent interview with The News Service of Florida.
The ability to manipulate images and voice audio is moving to an entirely different level of sophistication, Coats said.
“We’ve been down this road for a long time,” Coats said. One of the earliest examples of manipulated imagery that today might be labeled gimmicky is from 1964: the famous “Daisy” attack ad the Lyndon B. Johnson campaign ran against Barry Goldwater, showing a little girl with a daisy and a mushroom cloud superimposed on the screen.
“In the past, when those manipulations happened, you knew there was a human being who was manipulating the information,” she added.
For the two major political figures in Florida, who are on a collision course in the 2024 Republican presidential primary, the issue erupted in early June. A Twitter account affiliated with Gov. Ron DeSantis’ presidential campaign tweeted a video that included multiple AI-generated images of former President Donald Trump hugging Anthony Fauci, the former chief medical adviser who led the administration’s pandemic response.
With Trump and DeSantis arguing over their respective approaches to the COVID-19 pandemic, DeSantis’ camp sought to depict an intimate relationship between Trump and Fauci.
The post containing the video was flagged with a Community Note, a feature the social media platform says is meant to let users collaboratively add context to potentially misleading tweets.
“The three shots showing Trump hugging Fauci are images generated by artificial intelligence. The rest of the footage and the images in the ad are genuine,” the note states.
The New Zealand National Party’s use of AI-generated imagery in multiple political ads made international headlines in May. Also that month, the Republican National Committee released an ad attacking President Joe Biden’s reelection campaign that depicted a bleak vision of the future under a second Biden term, using images created by artificial intelligence.
The video is posted on the GOP’s official YouTube channel with a description explicitly letting viewers know that it incorporates AI imagery.
“An AI-generated glimpse into the country’s possible future if Joe Biden is re-elected in 2024,” the description states.
But not all AI images will be identified so easily.
Coats emphasized the ease with which the AI-generated images can be created by anyone with an Internet connection and the potential difficulty of pinpointing their source.
“It’s a low barrier to entry. You don’t have to contract for some big, expensive instrument. The tools are readily available. You don’t have to have particularly specialized knowledge to use them. The more sophisticated the prompt, the higher the quality of the output. But it’s not rocket science,” Coats said.
Steve Vancore, a longtime political consultant and pollster, said that generative AI could become commonplace in an era where the volume of political ads and other communications placed in front of voters is steadily increasing.
“In the bigger picture, what should be concerning is that the public already has an intrinsic distrust of political communications. And as a result of that, we’ve seen an arms-race escalation in the volume of communications in these races,” Vancore told the News Service.
As the volume of political ads increases, more use of AI-generated images, voices, and text is likely to follow.
“There is so much at stake that the people running these campaigns will use it to raise more money and to spend more of it. And so it will be an unfortunate arms race that creates a higher degree of public distrust,” Vancore said.
Vancore, who has been involved in more than 250 campaigns over his decades-long career, said his advice to candidates about using AI technology in ads depends on how it would be used.
“My standard for political attack ads, negative ads, is: It’s truthful, it’s verifiable and it’s relevant,” Vancore said.
Vancore offered the example of a candidate using ChatGPT, an AI tool that generates text, to draft emails to constituents.
“To say, ‘Hey, I want a series of emails talking about my after-school counseling program for kids,’ that is a perfectly acceptable use of artificial intelligence,” Vancore said. “What is not an acceptable use of AI is, ‘Hey, I want you to generate some images of my opponent dating underage girls.’”
Whether AI-generated ads can hurt a candidate’s credibility also depends on how they’re used, Vancore said, adding that other uses of the technology could be more subtle.
“One knock on Joe Biden is that he’s old. Not an unfair rap, maybe. It’s a legitimate concern that the most powerful person on earth, or one of them, may be getting old, right? What if a campaign subtly aged him a little bit? Showed him treading a little more cautiously, responding a little more slowly,” Vancore said.
Trump, Vancore said, has the same problem.
“You can see that he (Trump) is getting older; it probably has a lot to do with what’s going on in his life. But if someone were to age him a bit, would that get past the press? Would the press even catch it, and would it backfire?” he said.
Coats also pointed to the possibility of AI being used to clean up candidate images.
“There is the potential to muddy the waters, not just to create attack ads or misinformation about your opponent, but to try and clean yourself up. It’s an octopus. There are so many ways that I don’t think we’ve even thought about how it could be deployed,” Coats said.
According to Jay Hmielowski, an associate professor of public relations at UF, candidates targeted by ads that use AI-generated imagery don’t have to invent new methods to combat the attacks.
For example, candidates can use programs designed to detect AI-generated imagery, said Hmielowski, a political communications specialist.
“You can use that and say, ‘Look, we ran it through this detector, and it clearly shows that this is not our candidate saying that. Also, here’s the actual video of what happened at this event,’” he said. “So, you’d be doing the same things you’ve always done. Push back against it with, ‘Here’s what actually happened, here are the facts.’ And then you hope that reaches the population of people who are willing to listen to things beyond their own sort of political bubbles.”
By Ryan Dailey