
One of the normative goods on which democracy depends is responsible representation through fair and transparent elections.

This good is at risk when public perception of the integrity of elections is significantly distorted by false or misleading information.

The two most recent presidential elections in the U.S. were accompanied by a plethora of false or misleading information, which grew from false information about voting procedures in 2016 to the “big lie” that the 2020 election was stolen from Donald Trump, which he and his allies have baselessly and ceaselessly repeated.

Misleading or false information has always been part and parcel of political debate, and the public arguably accepts a certain amount of dishonesty from politicians.

However, Trump’s big lie differs from conventional, often accidentally disseminated, misinformation by being a deliberate attempt to disinform the public.

Scholars tend to think of disinformation as a type of misinformation, and technically that is true: intentional falsehoods are but one subset of falsehoods, and intentionality does not affect how people’s cognitive apparatus processes the information.

But given the real-world risks that disinformation poses for democracy, we think it is important to be clear at the outset whether we are dealing with a mistake or a lie.

The tobacco industry’s 50-year-long campaign of disinformation about the health risks of smoking is a classic case of deliberate deception and has been recognized as such by the U.S. federal courts (United States District Court for the District of Columbia, United States v. Philip Morris Inc.).

This article focuses primarily on the nature of disinformation and how it can be identified, and places it into the contemporary societal context. Wherever we make a broader point about the prevalence of false information, its identifiability, or its effects, we use the term misinformation to indicate that intentionality is secondary or unknown.

An analysis of mis- and disinformation cannot be complete without also considering the role of the audience, particularly when people share information with others, a situation in which the distinction between mis- and disinformation becomes more fluid. In most instances, when people share information, they do so based on the justifiable default expectation that it is true.

However, occasionally people also share information that they know to be false, a phenomenon known as “participatory propaganda”.

One factor that may underlie participatory propaganda is the social utility that people can derive from beliefs, even false ones, which may lead them to rationalize belief in falsehoods.

The converse may also occur, where members of the public accurately report an experience, which is then taken up by others, usually political operatives or elites, and redeployed for a malign purpose.

For example, technical problems with some voting machines in Arizona in 2022 were seized on by Trump and his allies and portrayed as an attempt to disenfranchise conservative voters.

Both cases underscore the importance of audience involvement and the reverberating feedback loops between political actors and the public, which can amplify and extend the reach of intentional disinformation and often involve non-epistemic but nonetheless rational choices.

The circular and mutually reinforcing relationship between political actors and the public was a particularly pernicious aspect of the rhetoric associated with Trump’s big lie.

During the joint session of Congress to certify the election on 6 January 2021, politicians speaking in support of Donald Trump and his unsubstantiated claims about election irregularities appealed not to evidence or facts but to public opinion.

For example, Senator Ted Cruz cited a poll result that 39% of the public believed the election had been “rigged”. Similarly, Representative Jim Jordan (R-Ohio), who is now Chairman of the House Judiciary Committee, argued against certification of the election by arguing that “80 million of our fellow citizens, Republicans and Democrats, have doubts about this election; and 60 million people, 60 million Americans think it was stolen”.

The appeal to public opinion to buttress false claims is cynical given that public opinion was itself the product of systematic disinformation in the first place.

While nearly 75% of Republicans considered the election result legitimate on election day, this share dropped to around 40% within a few days, coinciding with the period during which Trump ramped up his false claims about the election being stolen.

By December 2020, 28% of American conservatives did not support a peaceful transfer of power, perhaps the most important bedrock of democracy. Among liberals, by contrast, this attitude was far more marginal (3%).

Public opinion has shifted remarkably little since the election. In August 2023, nearly 70% of Republican voters continued to question the legitimacy of President Biden’s electoral win in 2020. More than half of those who questioned Biden’s win believed that there was solid evidence proving that the election was not legitimate.

However, the purported evidence marshaled in support of this view has been repeatedly shown to be false. 

It is particularly striking that high levels of false election beliefs are found even under conditions known to reduce “expressive responding”—that is, responses that express support for a position but do not reflect true belief.

The entrenchment of the big lie erodes the core of American democracy and puts pressure on Republican politicians to cater to antidemocratic forces.

It has demonstrably decreased trust in the electoral system, and a violent constitutional crisis has been identified as a “tail risk” for the United States in 2024.

Similar crises, in which right-wing authoritarian movements dismantle democratic institutions and safeguards, have unfolded in many countries around the world, including liberal democracies.

In this context, it is worth noting that the situation in other countries, notably in the Global South, may differ from that in the U.S.

On the one hand, low state capacity and infrastructure constraints may curtail the ability of powerful actors to spread disinformation and propaganda.

On the other hand, such spread can be facilitated by the fact that closed, encrypted social-media channels are particularly popular in the Global South, sometimes providing an alternative source of news where broadcast channels and other conventional media have limited reach. In those cases, dissemination strategies will also be less direct, relying more on distributed “cyber-armies” than on direct one-to-millions broadcasts such as Trump’s social-media posts.

The harm that can be caused by such distributed systems was vividly illustrated by the false rumors about child kidnappers shared in Indian WhatsApp groups in 2018, which incited at least 16 mob lynchings, causing the deaths of 29 innocent people.

The ensuing interplay between the attempts of the Indian government to hold WhatsApp accountable and Meta, the platform’s owner, highlights the limited power that governments in the Global South hold over multinational technology corporations.

As a result, many platforms do not even have moderation tools for problematic content in popular non-Western languages.

The power asymmetry between corporations and the Global South has been noted repeatedly, and recent calls for action include the idea of collective action by countries in the Global South to insist on regulation of platforms.

We have only scratched the surface of a major global issue that urgently needs to be addressed.

Despite these differences between the Global North and South, beliefs in political misinformation can be pervasive regardless of regime type or development level.

