
Are threats to the 2024 U.S. elections real or perceived, and are we ready for them?

By Cesar Antonio Nucum Jr.

WASHINGTON – Election concerns in the United States never really died down, even at the height of the COVID-19 pandemic. The “Big Lie” that Democrats stole the 2020 election continues to be believed by a large majority of the Republican Party and possibly some independents, and it culminated in the “insurrection” of January 6, 2021, the attack on the U.S. Capitol by supporters of the losing candidate, former President Donald Trump.

Trump supporters’ insistence that the 2020 election was stolen has increased the chance that violence could take shape during and around the elections, and that worry continues to linger under scenarios such as the following:

* Candidates of one party declare victory over the protests of election administrators;

* Candidates declare fraud even before elections are held, or after, with no evidence;

* Election officials are threatened or harmed, or some refuse to count election results that go against their ideology;

* Disinformation that occurred in 2020 persists with the use of AI and deepfakes; and

* Violent protests are encouraged before, during, or after the Presidential elections.

To see whether anything is being done to mitigate, if not entirely avoid, these threats, Ethnic Media Services held a briefing, “What Threats Loom Over the 2024 U.S. Elections, and Are We Prepared to Face Them?”, to discuss what are considered the most dangerous threats to the 2024 elections and what is being done about them.

Leading the panel was Gowri Ramachandran, Deputy Director of the Elections & Government Program at the Brennan Center, joined by Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights at Free Press; William T. Adler, Associate Director of the Elections Project at the Bipartisan Policy Center; and Sam Gregory, Executive Director of Witness.org.

At the briefing, Ramachandran said the key to fair elections is undisrupted voter access coupled with the resiliency of election officials “in the event of touchscreen voting machines breaking down, or electronic poll books becoming unusable, or a breach of the voter registration database.” She also does not believe that the movement to count ballots by hand supports election integrity.

“We recommend practices like backing up the voter database well before the election, having plenty of emergency and provisional paper ballot supplies, and doing capacity testing for electronic systems,” she continued. “Small disruptions like these can be fodder for a lot of misinformation about how voters can vote, or even about the whole election being unfair.”

Poll worker shortages, which occurred in 2020 due to the pandemic, particularly because many poll volunteers tend to be elderly, do not help and may even pose a security threat, although Ramachandran noted that “you also get shortages when election workers feel unsafe due to threats and harassment.”

“To help poll workers feel safe, we recommend that election officials implement security upgrades like bulletproof glass and keycard access … and make it clear that threats like malicious information and disinformation will not be tolerated,” she added.

Benavidez, on the other hand, believes that social media companies play a major role in perpetuating misinformation, enabling disinformation and compromising reporting.

“Particularly since the January 6 insurrection, the biggest companies — Meta, TikTok, Google, YouTube, Twitter — finally seem to accept that their failure to moderate content played a role in undermining public safety and democracy,” Benavidez lamented.

Adding to this, she rued that these companies have made tens of thousands of layoffs over the past year, deprioritizing accountability for accurate content in a way that points to where their values lie.

“There’s a downstream effect where mainstream media outlets, like CNN and the LA Times, often digest unverified misinformation and disinformation originating on social media … and that will have grave implications over the next 12 months,” she said.

To promote accurate content around the elections, Benavidez said these tech companies should reinvest in staffing teams to “moderate information and safeguard election integrity,” more efficiently moderate political ads across languages, develop increased transparency practices like data analytics reports shared with researchers, journalists and policymakers, and bolster political ad policies to prohibit content promoting misinformation about polling locations, practices or candidates.

Adler echoed the fear of a shortage of election workers on the polling side as well, one that will serve to further heighten the risk of misinformation threats to a fair 2024 election.

“Administering elections has always been a relatively thankless, low-paid government job, which has gotten increasingly complex over the past 20 years,” Adler admitted. “As we’ve incorporated more technology into polling, they’ve become IT managers in the public spotlight, facing public threats … New responsibilities in how they communicate about their work are a key part of their job in a way that wasn’t the case before 2020 or 2016.”

Adler pointed to a November 2023 Reed College survey of approximately 1,000 local election officials for the reasons, aside from safety concerns, behind the high turnover of election workers. The study revealed that 31% of those surveyed said they knew other local election officials who left their jobs because of personal safety issues and threats, 11% had considered leaving because of safety concerns, and over a third will be eligible for retirement before 2026.

Adler warned that this constitutes a dangerous cycle of misinformation, explaining that if “election officials face threats, they may be more inclined to leave their jobs, which results in less institutional knowledge on how to run an election, which might result in more mistakes, which may in turn undermine voter confidence, which brings more threats … Delays or errors in processing ballots create a hunger for information which misinformation peddlers are all too eager to fill.”

The threat posed by the increasing amount of misinformation through artificial intelligence-generated “deepfakes” alarms Gregory, as a wide range of technology has emerged whereby anyone can generate images from a text prompt or mimic voices from audio samples “to target and push people out of the public sphere. In conversations I’ve had with people working in electoral processes, this is something they’ve seen and worry about, and it has gotten particularly easier over the last year.”

In electoral contexts, patterns of deceptive image and audio use are already on the rise. Gregory cited recent examples: an audio deepfake of Slovak liberal politician Michal Šimečka and journalist Monika Tódová apparently discussing how to rig the upcoming elections; another audio deepfake targeting UK Labour leader Keir Starmer; and a third of Chicago mayoral candidate Paul Vallas.

“The opposite is also true, as in cases when someone claims something is a deepfake when it’s actually real,” explained Gregory. “Witness.org receives many cases of deepfakes, and a lot of them are people basically relying on others’ absence of knowledge to deny a piece of audio on the basis that it has been faked.”

“Our information environment, by design, discourages engagement and conversation” across opposing political sides, he added. To combat electoral threats like these, he said, we should “start from a baseline of what’s possible, of how misinformation is spread, so we know where to look for and stop it.”