IGF 2022 Open Forum #108 Combatting Disinformation without Resorting to Online Censorship

Time
Wednesday, 30th November, 2022 (10:50 UTC) - Wednesday, 30th November, 2022 (11:50 UTC)
Room
Caucus Room 11

Round Table - U-shape - 60 Min

Description

Online disinformation has entered the global agenda as a major threat to the public safety, security and democratic stability of nations, a threat exacerbated by the COVID-19 pandemic. On 14 April 2020, UN Secretary-General Mr. Antonio Guterres called for a joint response to the “pandemic of misinformation” and urged the world to “reject the lies and nonsense out there”. The first UN General Assembly resolution titled Global Media and Information Literacy Week was adopted unanimously on 25 March 2021, demonstrating UN Member States’ support, and UNESCO remains a leading UN agency on Media and Information Literacy.

Addressing the threat of online disinformation requires upholding human rights, including freedom of expression. Of particular concern is the tendency of some governments to use implicit or explicit online censorship as a way to address real or presumed threats posed by disinformation. Moreover, despite the growing challenge, technology companies have failed to build sufficient free speech safeguards into their policies and procedures for combating falsehoods on their platforms. Safeguarding the information space and upholding free speech is particularly important in the case of small, or so-called “low density”, languages. According to the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Irene Khan, the responses of States and companies to disinformation “have been problematic, inadequate and detrimental to human rights”.

The idea at the core of this session is simple: disinformation can and should be addressed without resorting to censorship, bans or internet shutdowns. A holistic approach to combatting disinformation, one that relies on multifaceted policies and takes a long-term view of the challenge, is both more effective and rights-compatible. Such an approach introduces a nimble conceptual framework to define the challenge; starts with a critical examination of the true causes and context of disinformation and the risks it presents to societies; strives to support a free, thriving and diverse online environment resilient to the negative impact of disinformation; and focuses on free, independent and pluralistic media, media and digital literacy, better public communication and related measures. When the use of regulatory tools is indeed warranted, the object of regulation is not speech or media content itself, but rather the responsibility of platform providers for averting the societal risks caused by disinformation. This approach requires agility, knowledge and resources. Yet positive experiences with designing and implementing effective policies already exist and deserve to be shared widely.

The speakers are invited to address the following questions:
  • What is the state-of-the-art conceptual thinking on disinformation that is relevant for practical policymaking? For example, what are the implications of distinguishing between misinformation, disinformation and information manipulation?
  • What are the benchmark developments and positive examples in regulatory frameworks that tackle disinformation while protecting free speech online?
  • What can be done to make sure that all countries can mobilize the knowledge and resources needed to address disinformation with effective, rights-compatible policies?
We expect the attendees to leave this session with the following takeaways: a better overview of the conceptual and policy solutions available to combat disinformation; inspiration to advocate a holistic approach to disinformation that safeguards free speech online.

Organizers

LATVIA (Ministry of Foreign Affairs)

Speakers

Ms Anna Oosterlinck, Head of the UN team, Article 19; Mr Allan Cheboi, Senior Investigations Manager, Code for Africa; Mr Rihards Bambals, Head of Strategic Communications Coordination Department, the State Chancellery of Latvia; Mr Lutz Guellner, Head of Strategic Communication, Task Forces and Information Analysis Division, European External Action Service; Ms Melissa Fleming, Under-Secretary-General of the United Nations for Global Communications.

Onsite Moderator

Viktors Makarovs, Special Envoy on Digital Affairs, Ministry of Foreign Affairs of the Republic of Latvia

SDGs

16. Peace, Justice and Strong Institutions

Targets: Harmful online content such as misinformation and disinformation can impede the attainment of multiple SDGs, including the promotion of peaceful and inclusive societies and the building of effective, accountable and inclusive institutions at all levels. At the same time, policy solutions to the challenge of disinformation need to be compatible with human rights and fundamental freedoms, including the ability of users to express themselves freely online. The discussion will help promote policies better attuned to the targets under Goal 16, especially to “ensure public access to information and protect fundamental freedoms”.

Session Report

Combatting Disinformation without Resorting to Online Censorship – Open Forum organised by LATVIA

Date, time, venue:

30 NOV 2022, 10:50 UTC, Caucus Room 11

Moderator:

Viktors Makarovs, Special Envoy on Digital Affairs, Ministry of Foreign Affairs of the Republic of Latvia

Speakers:

Ms Anna Oosterlinck, Head of the UN team, Article 19; Mr Allan Cheboi, Senior Investigations Manager, Code for Africa; Mr Rihards Bambals, Head of Strategic Communications Coordination Department, the State Chancellery of Latvia; Mr Lutz Guellner, Head of Strategic Communication, Task Forces and Information Analysis Division, European External Action Service; Ms Melissa Fleming, Under-Secretary-General of the United Nations for Global Communications.

Main information about the panel:

Disinformation is a major threat to public safety, security and democratic stability. Governments around the world fight disinformation in different ways. A particularly concerning trend is the use of online censorship by some governments as a way to address real or presumed threats posed by disinformation. By applying censorship, governments take away or limit citizens’ freedom of speech and expression. Online censorship also often goes hand in hand with information manipulation.

The main panel discussion focused on identifying ways for governments, organizations and platforms to address disinformation without resorting to online censorship, bans or internet shutdowns. There was broad agreement that disinformation is most effectively addressed by means of holistic and multifaceted policies, and that this approach is also rights-compatible. Successful implementation of this approach must be based on a conceptual framework that identifies and defines the challenges. It starts with a critical examination of the true causes and context of disinformation and of the risks it presents to society at large. Free and independent media and better public communication are the best tools to fight disinformation. A free, open, safe and secure online environment is the most resilient to disinformation.

Key points by each speaker:

Allan Cheboi: Misinformation is false information shared without the intention to do harm, whereas disinformation is false information used to harm others or to influence their decisions or thoughts. Disinformation is also used to gain power. In most cases, misinformation misrepresents facts, while disinformation is centred around a narrative. To address disinformation, we need to adapt laws to include information monitoring, particularly for local disinformation that attempts, for example, to influence the outcome of elections. We need to make substantive investments to cope with the challenge. Another alarming development is disinformation targeted at United Nations (UN) peacekeepers; this needs to be tackled swiftly and decisively.

Rihards Bambals: Disinformation is false or misleading content disseminated on purpose to mislead and to gain political benefit. Disinformation is a global, man-made disaster that is hazardous and influences vulnerable people. Latvia addresses the challenge with centralized information environment monitoring capabilities covering both traditional and social media. Latvia’s strategy is based on three pillars: effective government communication, quality independent journalism and media, and societal resilience. It is of utmost importance to invest in media and information literacy. Governments need to strengthen citizens’ capacity to think critically and to recognize and report cases of disinformation. Some governments invest billions in spreading disinformation; one example is the Russian Federation’s massive disinformation campaign accompanying its military aggression in Ukraine.

Lutz Guellner: First, we need to define the problem and distinguish between misinformation and disinformation. Misinformation is false information spread without intent; disinformation is based on a clear intention. Disinformation can also be used as a way to gain economic benefits. There are five characteristic elements to look at when identifying disinformation: whether it is harmful, illegal, manipulative, intentional and coordinated. The European Union uses the ABC model, which stands for A – actor, B – behaviour, C – content. This is a technique for identifying disinformation and the actors trying to manipulate information. The approach allows governments to avoid censorship and to look at a given piece of information in an objective manner.

Anna Oosterlinck: Disinformation must be seen in a wider context that includes reduced pluralism and diversity of the information we can access online; challenges connected to the digital transformation of media; and underlying social causes, including economic and social inequalities, that lead to mistrust and polarization. All these factors combined create an environment where disinformation can flourish. Some laws address disinformation through restrictions on false statements of fact that can cause substantial harm, or through provisions on election fraud, misleading advertising or the sale of certain products.

Disinformation needs to be fought with a range of positive, holistic measures by a range of actors. This requires a free and independent media environment; strong protection for journalists and media workers online and offline; and comprehensive right-to-information laws, including compliance with the principle of maximum disclosure and the proactive release of information of public interest. Governments should not spread disinformation themselves; they need to ensure connectivity and access to a free Internet, invest in digital, media and information literacy, adopt positive policy measures to combat online hate speech, and work with companies to make sure they respect human rights.

Melissa Fleming: Back in 2018, the UN found that disinformation and hate speech online played a significant role in stoking horrific atrocities against the Rohingya population, pushing ordinary citizens to commit unspeakable acts. Similar stories have emerged in many other conflict settings: recently in Ethiopia, for example, Facebook posts have spread hate and inspired attacks. In Ukraine, information is also being used as a weapon, while in Ukraine’s neighbouring countries the spreading of lies about refugees brings more suffering to the most vulnerable.

Free speech is not a free pass. In the era of mis- and disinformation, free speech is much more than the right to say whatever you want online. Platforms must face the fact that they are constantly being abused by malign actors, and they must live up to their responsibility to protect human rights and save lives. The United Nations is constantly engaging with the platforms, advocating that they carry out human rights due diligence and review their business models against the UN Guiding Principles on Business and Human Rights. The platforms should offer a robust framework to reduce the dissemination of harmful falsehoods, as well as establish remedy mechanisms.

The UN “Verified Initiative” succeeded in getting accurate, lifesaving information to communities around the world during the COVID-19 pandemic. The UN is also working to strengthen the capacity of social media users to identify and avoid falsehoods by promoting media and information literacy and by creating its own teaching tools. In collaboration with Wikipedia, the UN has launched two free online digital literacy courses on mis- and disinformation. The courses are available in multiple languages and are being taken by learners all over the world, hopefully improving their ability to identify mis- and disinformation and to avoid becoming part of the spreading cycle.

The UN also encourages governments to promote various measures to foster free flow of information, enhance media diversity and support independent public interest media as a means of countering disinformation.

Summary/Conclusions:

  • Disinformation is false information put in circulation to intentionally do harm or to gain political or economic benefits. 
  • Disinformation can and should be addressed without resorting to censorship.
  • We need to keep developing our conceptual framework on disinformation and related phenomena.
  • Policies to address disinformation should focus on fostering a free, open, safe and secure online environment and on strengthening free and independent media.
  • Online platforms need to step up their efforts to address disinformation.
  • It is important to strengthen citizens’ ability to identify and counter disinformation by investing in digital and media literacy programmes.