Generative AI platforms have risen sharply in popularity in recent years, especially since ChatGPT's widespread adoption beginning in late 2022. These platforms take user prompts and return anything from simple text to fabricated videos, using deep learning algorithms and language models. With the United States presidential election approaching, generative AI's widespread accessibility raises notable issues that voters should be aware of.
The impact of generative AI has already been seen in political elections across the globe, where AI-generated videos, images, and audio falsely depict political figures and contribute to the spread of misinformation. One instance occurred recently in the Turkish elections. According to the article "Fact check: Turkey's Erdogan shows false Kilicdaroglu video," published in March 2023 by DW News, the current president, Recep Tayyip Erdoğan, used AI to produce a fake video connecting his presidential opponent, Kemal Kiliçdaroğlu, to the leader of the PKK, a designated terrorist organization. Research by DW's fact-checking team and Turkish service showed that this video was "manipulated by combining two separate videos with totally different backgrounds and content" (Ünker, 2023). This is just one example of how the wave of generative AI makes it easier for individuals to create fraudulent evidence for false claims about political figures and skew voters' choices.
Even in the 2024 United States presidential election, there has already been one instance in which an AI-generated robocall mimicked President Joe Biden's speech and voice to tell New Hampshire voters to refrain from voting in the New Hampshire primary. Findings announced by NBC News in their article "Democratic operative admits to commissioning fake Biden robocall that used AI," published this past February, confirmed that the call was forged: Steve Kramer, a veteran political consultant working for a rival candidate, confessed to fabricating the audio with AI technology. Similarly, on the other side of the election, BBC News reported that Trump supporters used AI to create deepfake images of Trump with a group of Black voters. The article, "Trump supporters target black voters with faked AI images," states that the purpose of creating this false impression was to encourage African Americans to vote for him in the elections. Used this way, generative AI distorts political elections by spreading false claims and information that shapes voters' perceptions of political figures.
To ensure fair voting procedures and reduce the negative impact of generative AI, major technology companies including Google and X announced an agreement aimed at mitigating the risk of AI-driven disruption in the 2024 elections. NPR's article, "Tech giants pledge action against deceptive AI in elections," published on February 16, 2024, explains that the agreement commits the companies to work together to detect AI-generated forgeries, assess AI models and software for risks, develop watermarking technology, and support efforts to educate the public about AI misinformation. Rather than imposing a strict ban on AI-generated content, these companies see closing loopholes in social media platforms' manipulated-media policies as the best way to address the issue.
Generative AI is an innovative technology used almost everywhere, allowing businesses and individuals to create content of their choice from a simple description. According to JP Morgan, this emerging technology has the potential to greatly increase workforce productivity and open doors to new business models. Despite this potential, people need to be cautious of generative AI misuse. Voters should be careful when encountering information on social media or in the news. It is vital to consult multiple reliable sources about political incidents and to be alert to potential AI-generated content before making a decision in the 2024 presidential election.