Crisis on the Horizon: Unmasking the Imminent Threats to the 2024 U.S. Elections – Are We Ready?



The 2024 U.S. elections are under scrutiny as the democratic process has increasingly become a contentious arena. The persistent false narrative that the 2020 election was stolen by Democrats has created a volatile atmosphere in which some see violence during and around the upcoming elections as justified. Candidates prematurely declaring victory, alleging fraud without evidence, and threatening or harming election officials, together with the rise of disinformation fueled by AI and deepfakes, are ominous threats looming over the electoral landscape.

An Ethnic Media Services (EMS) briefing addresses these threats and features prominent speakers in the field. Among them is Gowri Ramachandran, Deputy Director of the Elections & Government Program at the Brennan Center. Ramachandran emphasizes the importance of local election officials doubling down on efforts to secure election infrastructure and ensure voter access. Drawing on past recommendations, she stresses the need to detect and recover from disruptions such as equipment failures, poll worker shortages, and issues with voter registration databases.

Election worker shortages, a concern amplified by the challenges faced during the pandemic, are also addressed. Ramachandran suggests proactive measures, such as laws protecting election workers from doxing and physical security assessments, to mitigate threats and harassment. Drawing attention to a case in Georgia, she notes legal actions taken against those who threatened election workers, signaling a commitment to holding individuals accountable.

Disinformation emerges as another significant threat, with the potential to confuse voters and disrupt the electoral process. Ramachandran advocates for election officials to promote accurate information in multiple languages and through diverse media channels, pre-emptively countering disinformation. Acknowledging the role of AI in exacerbating disinformation, she underscores that established security practices, such as using .gov domains and implementing multi-factor authentication, remain effective countermeasures.

As the briefing unfolds, it becomes clear that addressing these multifaceted threats requires a comprehensive approach. The insights shared by the speakers lay the groundwork for a proactive and resilient electoral system, emphasizing the critical role of election officials in safeguarding the democratic process in the face of evolving challenges.
Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights at Free Press (FP), takes the virtual stage to provide insights into the challenging landscape of social media platforms. Expressing gratitude for the opportunity to contribute to the discussion, she acknowledges the complexity of the issues at hand and offers a sobering perspective on the current social media climate.

Benavidez focuses on the threats anticipated over the next 15 months, emphasizing the need for proactive measures. She commends the forward-looking nature of the briefing, highlighting a common flaw in social media companies’ approach to election integrity efforts: they are often initiated too close to the election, rendering them insufficient.
Discussing the aftermath of the January 6th insurrection, Benavidez underscores that major tech companies, including Meta, TikTok, Twitter, YouTube, and Google’s search engine, have acknowledged their failure to moderate content, a failure that played a role in undermining public safety and democracy. She points to instances of social media toxicity during conflicts, such as the Israel-Gaza situation, and highlights Twitter’s unique challenges following Elon Musk’s acquisition.
Benavidez details the drastic changes on Twitter, including the removal of accountability measures, ethical teams, and design features, turning the blue checkmark status into a subscription-based model. She raises concerns about bad actors exploiting these changes to disseminate graphic, violent, and misleading content. The broader trend of layoffs in trust and safety workers across major platforms adds to the challenges.

Transitioning to recommendations, Benavidez outlines six key measures for platform integrity protections. These include reinstating and bolstering teams charged with safeguarding election integrity, launching 2024-specific interventions ahead of the U.S. primaries, holding VIP accounts to the same standards as other users, efficiently reviewing and enforcing rules on political ads across languages, and developing improved transparency and disclosure practices.

In conclusion, Benavidez addresses journalists and the media’s role in navigating this landscape. She advises meeting voters where they are, engaging in online spaces, covering election-related stories proactively, focusing on the voting process, and debunking misinformation. She stresses the importance of media literacy and encourages journalists to report denominators, providing context on the scale of problems so that isolated incidents do not disproportionately shape public perception. Through her insights, Benavidez sheds light on the multifaceted challenges and potential solutions at the intersection of social media and electoral processes.

William T. Adler, Associate Director of the Elections Project at the Bipartisan Policy Center, steps into the discussion, expressing gratitude for the opportunity to contribute to the briefing. He acknowledges the insightful commentary from previous speakers, Gowri Ramachandran and Nora Benavidez, noting that his remarks will align with many of their observations.
Adler brings attention to the imminent commencement of the election season, with the Iowa caucuses merely two months away and the first presidential primary in New Hampshire following shortly after. He outlines the Bipartisan Policy Center’s focus on formulating durable policy solutions by prioritizing the concerns of election officials, drawing from bipartisan task forces across the country.

Highlighting the significance of on-the-ground insights from election officials, Adler reflects on a recent blog post outlining key threats to the nuts and bolts of election administration. He emphasizes the importance of journalists being aware of these issues as they cover the upcoming elections.

The first threat Adler delves into is election official turnover, echoing concerns raised by previous speakers. He provides additional context, underscoring the increasing complexity of election administration over the past two decades, with officials now shouldering IT management responsibilities due to technological integration. The potential loss of institutional knowledge due to turnover becomes a critical concern, impacting the resilience of election procedures.
Adler references a recent survey from Reed College, revealing alarming statistics related to election official turnover. Safety concerns, threats, and misinformation emerge as key challenges faced by these officials, creating a potentially dangerous cycle that could undermine voter confidence.

Moving on, Adler emphasizes the role of misinformation in the broader context of election threats. He notes that misinformation is intertwined with other challenges discussed, and its spread often follows perceived mistakes in the election process. Addressing journalists, he urges them to be aware that mistakes are a natural part of every election and emphasizes the need to consider the scale and context of these errors.

Adler introduces reasons for hope, citing the professionalization of the election field in recent years and the potential for incoming election officials to bring valuable experience. However, he stresses the importance of media coverage during the period after voting ends, cautioning against the spread of misinformation in the information void that follows.
Adler urges journalists to familiarize themselves with the specifics of the election processes in the jurisdictions they cover, emphasizing the importance of understanding details such as paper trails, machine testing, ballot counting methods, and the possibility of post-election audits. This knowledge, he argues, will empower journalists to report accurately when unforeseen challenges arise during the election.

Sam Gregory, Executive Director of Witness.org, opened his remarks by acknowledging that he is not an expert on elections, setting the stage for a discussion on deepfakes, AI, and their implications, particularly for journalists. Gregory outlined his intention to cover what is currently known about deepfakes, AI predictions for the upcoming year, and responses available to journalists, all within the context of legislative developments.

In the realm of technical shifts, Gregory emphasized how the landscape has changed over the past year. He noted that while fears about deepfakes have often outpaced reality, recent advances have made two things markedly easier: image generation and audio manipulation. Realistic images can now be created from text prompts using widely available tools, and improved audio manipulation tools make it possible to mimic voices from text inputs.

Gregory touched upon the creation of synthetic avatars resembling humans, emphasizing that while it’s relatively easy to generate such avatars, producing real-life animated videos in various settings remains challenging. He underscored the importance for journalists to have a nuanced understanding of what is technically feasible in the context of deepfakes.
Moving to the misuse of these tools, Gregory highlighted the consistent use of deepfakes since 2018 to target women with non-consensual sexual deepfakes, posing threats to individuals both in public and private life. He stressed the ease with which such threats can be carried out, emphasizing the need for awareness among journalists. Gregory also pointed to global patterns in image generation, citing instances in the Israel-Gaza conflict and other elections where meme generation has become prevalent.

Discussing audio manipulation, Gregory noted instances of deceptive audio being used in electoral contexts, targeting prominent figures. He warned of the potential for “is it, isn’t it” claims, where realistic content is falsely labeled as fake or vice versa, contributing to misinformation.

In terms of solutions, Gregory touched upon legislative possibilities, noting potential developments in non-consensual sexual deepfake legislation, labeling political ads and AI-generated content, and rulemaking by the Federal Election Commission. However, he expressed skepticism about comprehensive legislative solutions in the near term.
Addressing journalists, Gregory acknowledged the limited availability of reliable technical solutions. He discussed two broad categories of technical approaches: media indicators that signal AI generation and detection tools. He cautioned that current detection tools are not widely available and may produce unreliable results, urging journalists to be cautious in using and promoting such tools.

Gregory advised journalists to begin their investigative process with source verification, media literacy, and community grounding. He emphasized the importance of media literacy practices and urged journalists to scrutinize media claiming to be deepfakes, noting that some may be miscontextualized existing media. Gregory concluded by recommending a stepwise approach to analysis, including reverse video searches, glitch analysis, metadata analysis, and potentially AI detection tools if they become more reliable in the future. He also suggested seeking assistance from media forensics experts in public universities.
#ElectionSecurity #2024Elections #DeepfakeDangers #DemocracyAtRisk #MediaLiteracy #ThreatAlert #PoliticalIntegrity #USVotes #StayInformed #PreparednessChallenge
