IGF 2023 – Day 3 – WS #209 Viewing Disinformation from a Global Governance Perspective – RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> ANRIETTE ESTERHUYSEN: Good afternoon, everyone.  Welcome to the Viewing Disinformation From a Global Governance Perspective workshop.  My name is Anriette Esterhuysen.  I will be moderating.  I'm senior advisor on Internet Governance at the Association for Progressive Communications.  I will introduce you to our very diverse and very interesting panel as we do the workshop.  At this point, because we're a small group of people, I want to ask you all to stand up.  We will start with an interactive exercise.  Professor, that includes you.  So just stand up and line up.  I will make a statement.  And then I'm going to ask you to position yourself ‑‑ I mean, along this corridor and that corridor, more or less.  On that side of the room if you strongly agree with the statement.  Towards this side if you strongly disagree.  Then somewhere in the middle, according to how strongly you agree or disagree.

The statement ‑‑ and this absolutely includes you, otherwise you will not get a chance to speak ‑‑ is: disinformation is undermining democracy. 

If you agree, go towards that end.  Disagree ... somewhere around here. 

And somewhere in the middle.  And the idea is, as other people speak, I'm going to ask people why they stand where they stand.  Then think about it, reflect and decide whether you want to move along this imaginary spectrum. 

>> (Off mic)

>> (Off mic)

>> ANRIETTE ESTERHUYSEN: Drake, then you have to stand in the middle. 

>> (Off mic)

>> ANRIETTE ESTERHUYSEN: Disinformation undermines democracy.  Okay.  I will start over there with somebody right on the other side.  Anybody willing to ‑‑ why?  Why do you agree so strongly with this?  And just introduce yourself and say why. 

>> ATTENDEE: Hi, I'm Greta.  Um ... hmm.  I think it is hard to explain, but the feeling is that there is information out there that can damage how democratic institutions work and function.  Yeah. 

>> ANRIETTE ESTERHUYSEN: So you feel that misinformation can actually undermine the institutions of democracy.  Anyone else here who feels strongly? 

>> ATTENDEE: Hello, my name is Rémy Milan.  The reason I would say undermining is that misinformation or disinformation undermines citizens' belief in the state.

>> ANRIETTE ESTERHUYSEN: You should move if he moved you.  Let me move to that side of the room.  I haven't seen anyone move yet.  Jeanette, you are a speaker, introduce yourself.  Why are you standing on this side?  Why do you disagree? 

>> Jeanette Hofmann: Let me anticipate what I want to say later.  One of the main things I want to say is that while we have a lot of research on the generation and circulation of disinformation, we know little, if not nothing, about how it actually affects people's minds and people's voting behavior.  A lot of what we see discussed here is sort of based on assumptions.  But not on empirical evidence.

>> ANRIETTE ESTERHUYSEN: Introduce yourself. 

>> Jeanette Hofmann: I'm a speaker, Jeanette Hofmann, a political scientist from Germany. 

>> ANRIETTE ESTERHUYSEN: You want to react?  Good, you are allowed to. 

>> (Off mic)

>> ANRIETTE ESTERHUYSEN: You disagree with Jeanette.

>> ATTENDEE: I would say, since I live in the United States, January 6 was a good example of events inspired by misinformation and disinformation, designed specifically to undermine the democratic transfer of power between two Governments.  That I watched on TV; I actually believe that I saw something.  So I think there is empirical evidence that people can be driven to act contrary to democratic institutions by misinformation and disinformation.

>> ANRIETTE ESTERHUYSEN: Any of our online participants, including speakers who strongly agrees that disinformation undermines democracy? 

Clara, David, Aaron?  Nighat Dad, introduce yourself.

>> Nighat Dad: I'm Nighat Dad and I run the Digital Rights Foundation.  I'm speaking from Pakistan right now.  I'm in between agree and the center.  So I was very much agreeing, but then Jeanette spoke and, okay, I went a little more to her side, because we have been saying this over and over again: yes, disinformation and misinformation impact our democratic processes, but what kind of evidence do we actually have to support that?  Really solid evidence of how it changes the behavior of people during electoral processes.  We see a lot of impact and effect of disinformation on several institutions.  And also on voters.  But I think we still need solid evidence to support that yes, it is actually, you know, deteriorating democracy in our countries.  I feel like there are several other aspects that contribute to this sort of destruction of our democratic processes. 

>> ANRIETTE ESTERHUYSEN: Thanks, Nighat Dad.  Before the Professor ‑‑ you strongly disagree as well?  Are you willing to share why? 

>> ATTENDEE: I wouldn't say I strongly disagree.  I'm not over there. 

>> ANRIETTE ESTERHUYSEN: That's true.

>> ATTENDEE: I'm sort of slightly left of the middle or right of the middle, I guess.  I'm from a think tank; we do some work in this area.  My background is also legal.  And I frequently find it quite difficult to settle on a definition of disinformation, and then there is a whole separate question of how we actually apply that definition.  I think a lot of people are thinking of different things when they say they're thinking about disinformation.  And the longitudinal impact of it is something that is hard to assess.  When we think about the causative impact of disinformation, it is very hard to extract substantive grievance from it.  So is it a manifestation of people's dissatisfaction with the way society is, which we can observe and measure in ways we couldn't in the past, or is it the disinformation that is sort of causing that dissatisfaction?  I think that is a hard thing to unpack.

>> ANRIETTE ESTERHUYSEN: The unknown.  For those that came late, I made a statement: disinformation is undermining democracy.  And the idea is that people who disagree are kind of over there.  People who agree are over there.  Drake, you are one of our speakers, introduce yourself.  What is your position?  Why are you in the middle?  That is not common for you.  I have known you for years and you rarely sit on a fence. 

>> William Drake: We had a session on this at the Taiwan IGF.  Beware of false narratives.  The way it impacts democracy is dependent on context, et cetera, et cetera.  I would add to Jeanette that social science's ability to do effective polling here may not be the only possible measure.  We have tens of millions of Americans who have demonstrated over and over that they believe disinformation.  So ...

>> ANRIETTE ESTERHUYSEN: You see, this made me move more to the center.  Professor, you wanted to say something? 

>> ATTENDEE: Professor Lee from Taiwan.  I have three things.  First, citizens create the spread of disinformation.  Not only the news media or social media.  Don't forget that.  Politicians do it too. 

Second of all, I don't fully agree with what Jeanette said.  The reason is she didn't put time into that coordinate.  But if you put time in ‑‑ when the disinformation spreads around, don't forget, the voting takes just a second.  So in that second, you might be moved by the disinformation and change your voting behavior. 

>> ANRIETTE ESTERHUYSEN: Brexit. 

>> ATTENDEE: One minute after, you regret it.  Too late, it is done. 

>> ANRIETTE ESTERHUYSEN: Exactly.

>> ATTENDEE: I stand in the middle.

>> ANRIETTE ESTERHUYSEN: Like the Brexit referendum.  So many regret it.  Before we sit down like a normal IGF panel, with some on that side and the rest of us on this side ‑‑ before we assume the normal divisions of power ‑‑ is there anyone else on this side of the room who agrees strongly that disinformation is undermining democracy? 

>> ATTENDEE: My name is Jan Hope, from Media Alternatives, from the Philippines.  If you are aware of who our previous President was and who our current President is ‑‑ Marcos, the son of the former dictator ‑‑ I think it would be easier to appreciate why Civil Society at least feels that disinformation plays a huge part in why our institutions right now, and democracy in general, are very much in peril, if not already gone.

(Chuckling)

So to speak.  I can understand, though, the search for empirical data.  I'm a sociologist and a lawyer.  But we live beyond the studies; we live through the realities every day.  We went through our elections last year.  And we saw how much disinformation actually impacted people ‑‑ not just those supposedly in the marginalized sectors, but even those you would expect to be learned individuals.  They too were very much caught in that web.  Yeah. 

>> ANRIETTE ESTERHUYSEN: It is true.  But you also did have decades of Marcos dictatorship prior to online disinformation.  Okay, the last word.

>> ATTENDEE: Okay.  Thank you for giving me the floor.  I'm Tim from Russia.  I work for a think tank that counters disinformation and fakes in Russia.  I have project experience with that.  When somebody uses disinformation, it is invisible, because the ability to make a mistake is a natural part of humankind and the human brain, actually.  So disinformation and fake news are an extremely effective weapon, and nowadays it is widely used as a weapon.  You can see it especially in the Russia and Ukraine war, with a lot of usage of this weapon against us.

Last but not least, when you fight disinformation, it is never possible to bring the misinformation and fakes and myths or anything else down to zero.  What you actually do is fight the consequences and damage of the disinformation and its spread, but you cannot fight the disinformation itself.  Thank you. 

>> ANRIETTE ESTERHUYSEN: Thanks very much.  Thanks very much everyone. 

(Applause)

Let's take a seat.  Come and sit in the front. 

>> DAVID KAYE: While people are taking a seat, can I answer? 

>> ANRIETTE ESTERHUYSEN: Yes, David, go ahead. 

>> DAVID KAYE: It sounds fun; it is 6:00 a.m. where I am.  I could use the stretch.  I will be a little bit of a diplomat.

>> ANRIETTE ESTERHUYSEN: Just introduce yourself to the room. 

>> DAVID KAYE: I'm David Kaye.  I teach law at the University of California, Irvine.  I was the U.N. Special Rapporteur on Freedom of Expression from 2014‑2020.  I'm the independent Chair of the Board of the Global Network Initiative.  The diplomatic thing I would say is I agree with what everybody said, but I think there are at least three different issues here that, from an IGF Global Governance perspective, make it very difficult to talk about disinformation.

One is it really matters ‑‑ this came through in the comments already ‑‑ I think it really matters who is doing the disinformation.  Is it the state?  Is it the President?  Is it ‑‑ who is doing it?  And also where is it happening?  As far as our study and understanding of the impact, I think those things matter. 

Second thing, what impact do we actually care about?  Do we care about whether disinformation has an impact as Jeanette was suggesting on voting behavior?  Do we care about the impact that disinformation has on people's action in the street as in January 6? 

You know, what is the thing that we're actually studying?  And third, based on all of the different contextual features, I think it is very difficult to say that there is one particular solution that works at a Global Governance level and that would address disinformation in every instance or every context ‑‑ whether it is in legacy broadcast media, print, or on social or search, there is such variety.  So I think that it is just a vast topic that requires nuance and may not really be amenable to a one‑size‑fits‑all response. 

>> ANRIETTE ESTERHUYSEN: Thanks very much, David.  Clara, do you want to reflect on this opening statement?  And then also please introduce yourself.  Tell us where you are. 

>> Clara Iglesias Keller: Thank you, Anriette.  I'm Clara, I'm a legal scholar from Brazil, but it is 6:00 a.m. here too; keeping strong.  Thank you for having me.  I will jump right in.  I will say that I would be alongside David, kind of in the middle, a little bit more on the diplomatic side.  I will tell you why.  I think disinformation can undermine democracy depending on context, and I prepared myself to talk about that.  Right?  So depending on sources, depending on the political and social context, and on what disinformation strategies are being pursued, I think it can undermine democracy.  But not in the ways we would instinctively think of.

I agree with Jeanette: we don't have solid empirical evidence that it changes voters' decisions.  We don't have solid evidence that it changes elections.  But I think we have enough reason to worry about how exactly it weighs in the recent political transformations that we have seen in different contexts.  And I think this shows that we need more research on that, on the blind spots that we're not exactly unfolding yet. 

I will say one thing about the empirical evidence to close the first statement.  The empirical evidence we have these days is very much focused on the U.S. and Europe.  So I would be very happy to see more empirical evidence of how disinformation unfolds in other political contexts ‑‑ what we call the Global South: Latin America, Africa, Asia.  That would be it for my first input.

>> ANRIETTE ESTERHUYSEN: Thanks very much, Clara.  We do not yet have our final speaker, Aaron Maniam.  I don't think he is with us.  I want to introduce our remote Moderator; she's in Nairobi.  Can you reveal yourself on screen, put your camera on and say something? 

Can you unmute Arispa?  She was there.  She might have dropped off.  She is trying to get hold of Aaron. 

Let me turn to Jeanette, and David, you can add on to this.  And Nighat Dad.  Our opening question to you was not that different from what I asked the room.  Do you think you can define it ‑‑ what is disinformation?  Is it serious?  If so, why? 

>> Jeanette Hofmann: (Off mic)

Disinformation focuses on strategic intent, meaning the strategic circulation of false information with a specific purpose, which usually is to manipulate people in their behavior and in their world views.  So that is what disinformation is.  But now I would like to come back to what I said earlier, that we really lack empirical evidence.  I would like to elaborate a bit on that. 

We have right now a strong focus in academia on platforms.  While that makes sense, because we get at least some data about the circulation of disinformation, there is also a little problem with that.  It cuts off longstanding research traditions that already focused on questions of propaganda, manipulation and their effects on people's minds.  At that time, it was less about platforms and more about legacy mass media.  But also political propaganda.  I have to say that there was never agreement on the question of whether it had a long‑term effect on people.

There is a strong opinion in academia saying that the effect is to amplify what people already believe.  If you have a tendency to believe in conspiracy theories, you might be open and receptive to disinformation.  If you are immune to that, it might not have much of an effect on you.  There is a hypothesis called preaching to the choir.  Disinformation reaches people who are vulnerable.  That's point one.

Second one I would like to make is that a lot of research focuses on the individual.  But what really matters when it comes to disinformation is the media environment.  Countries with a strong tradition of public media, of high quality media, they're able to counter disinformation much better than countries where the media landscape is a mess.

So if we want to learn more about whether or not disinformation works, we need to look beyond platforms and take into account the wider media landscapes, context matters.  And there is no point in only looking at platforms and their algorithms. 

>> ANRIETTE ESTERHUYSEN: It is also interesting ‑‑ I mean, it kind of struck me, the remark about the U.S. and the comment from the Philippines.  Maybe there are media cultures and populist political moments that might also be similar.  There could be other contextual issues.  Bill, do you want to expand on your earlier remarks? 

>> William Drake: Further to Jeanette's point, a lot of disinformation is not originally coming from social media; it is from broadcast media and amplified by social media.  We talk about the strength of traditional media as a buttress against the spread of misinformation.  The United States has a strong and vibrant media system and has for a long time.  Multiple voices, well funded, et cetera, et cetera.  Yet we have three networks that get tens of millions of viewers that are completely all in on disinformation. 

I mean, they are major proponents of disinformation.  And they attack anybody who questions disinformation as somehow trying to suppress freedom of speech and so on.

So we have a certain problem in the U.S.

But I guess the point I would make is, you know, between traditional media as the savior and platforms as the source of all problems ‑‑ again, as I said at the outset, it all depends on the context of what we're talking about.

On the definitional issue, you know, Jeanette said it and we all kind of agree on this.  It is interesting.  I was looking at some of the different definitions put forward by different leading organizations.  And it is amazing how variable they are in their details.  Maybe this is because I spent a lot of time in the Internet Governance space working on the definition of Internet Governance 20 years ago in the U.N. Working Group.  And so on.  But I tend to look at these things and think, why are they doing it this way? 

For example, the European Union describes disinformation as the creation, presentation and dissemination of verifiably false or misleading information for the purpose of economic gain or intentionally deceiving the public.  Why economic gain or intentionally deceiving the public?  There are different ways to formulate the strategic objective.  The U.N. Special Rapporteur says false information disseminated intentionally to cause serious social harm.  Is it only disinformation if it is intended to cause serious social harm?  What constitutes serious social harm?  You can play with the definitions and they can be messy.  Obviously we want a definition and an understanding that captures the notion of intentionality and verifiable falsity.  We want to capture that there is a strategic dimension.  And people recirculate disinformation not knowing it is disinformation.  When they do that, it is misinformation, I suppose you would say, right?  Because they're not necessarily seeking to cause serious social harm.

Their weird uncle sent them this picture saying, you know, Hillary Clinton is the devil and eats babies.  They think maybe that is true and send it to their friends. 

You know, this is the kind of crazy stuff we have.  But back to Jeanette's point, and I will stop on the U.S. thing.  It is true that social scientists ‑‑ and this just amplifies what I said before ‑‑ always have trouble demonstrating impact.  We have 60‑something years of media studies before the Internet where people tried to look at the impacts of media ‑‑ media effects ‑‑ and how strong or weak they were.  Did they cause violence, sexual predation, whatever, et cetera.

It is hard to get at that through polling data and so on.  But when tens of millions of people vote saying they do so informed by strong disinformation, that would seem to be relevant information to me.  Anyway. 

>> ANRIETTE ESTERHUYSEN: Thanks, Bill.  Nighat Dad, do you want to add anything to your earlier opening remark on the concept and the issue? 

>> Nighat Dad: Yeah.  So based on my work wearing different hats ‑‑ you know, as someone working in Pakistan looking at the context, but then as someone sitting on Meta's Oversight Board looking at content related to misinformation ‑‑ it is complex to define misinformation and disinformation.  And even more complicated to identify it.

I think definitions are very, very contextual.  Bill mentioned some of the definitions, and I'm like, okay, but, you know, actors ‑‑ Civil Society actors, States, companies ‑‑ sort of define things keeping their own interests in mind as well.  Or the work they're doing, or the context they belong to.

For instance, disinformation for us, how we kind of see it, is, you know, false content which is shared with the intent of causing harm.

But then, we cannot assume all untrue information is harmful.  We should be very cautious of, you know, how it is defined ‑‑ especially not by Civil Society, but by the powerful actors, because when they define it, that means they're going somewhere to regulate it, right?  That is where I see the problem.

So I don't know what else to add here, but I feel that it is very contextual.  I don't know how many of you have seen the UNSR's report on gendered disinformation, which was released a couple of days ago.  It has the context of all the Regions.  And it is actually mind‑boggling to see how disinformation causes harm to women and marginalized groups in South Asia and how it does so in Latin America.  I feel like it is such a good document that people should read.  It is very recent.  At the same time, I feel it is contextual.  We should be very cautious when we're defining it and not leave a space where actors can basically interpret it the way they want to, according to their context and how they see their political situation. 

>> ANRIETTE ESTERHUYSEN: Thanks, Nighat Dad.  A little plug for my organization: use the IGF.  This shows you how the IGF can be useful.  We held a consultation with the Special Rapporteur on gendered disinformation, and she used the input from that consultation in her report.  David, any additional points from you? 

>> David Kaye: I think the only thing I would add is to put sort of a legal gloss on all this, which is to emphasize why definition is so important.  If we're talking about legal regulation ‑‑ and when we talk about Governance, that is what we're talking about ‑‑ we have to be clear on what the issue is.

That is not just some abstract issue.  It is a fundamental component of legality.  We need precise definition ‑‑ you know, precision in what it is that we're actually restricting.  And one of my big fears is that even though I share the view that there is a problem called disinformation and there is a wide variety of impacts of disinformation, we don't really have a clear definition.  And we see that, I think, even in the emerging regulations from places like Europe and the UK and elsewhere.  Because what we see there is this move for transparency and risk assessment that assumes that, in those cases, platforms will sort of define the issue as they're doing that work.  Maybe that is okay.  Maybe that will be great evidence for social scientists and for legal scholars, but I'm afraid we're not at a point where we have shared definitions that are clear enough for legal regulation.

>> ANRIETTE ESTERHUYSEN: Uh‑huh.  Thanks, David.  Jeanette, you want to react? 

>> Jeanette Hofmann: I wanted to address the question of motives.  Because many people, of course for good reasons, refer to distinct episodes like the attack on the Capitol after the last election in the U.S.

Empirical research shows that many people who act on disinformation and share disinformation do not necessarily believe that this is the truth.  One reason why people share disinformation is to signal belonging.  To express their loyalty.  To a certain way of thinking and acting.

Many Republicans that share information about the last election being stolen might not necessarily believe it.  What they're expressing is their loyalty, their belief in Trump and this sort of crowd of Republicans.  We even have evidence that when people are asked, do you believe what you are sharing, they might not tell you the truth.

This brings me to an aspect that I find actually alarming.  It is less the amount of disinformation than the growing number of people in various countries who do not care any longer about the distinction between truth and falsity.

For them, political belonging ‑‑ let's call it tribal attitudes ‑‑ is becoming so strong that it is more important than whether or not there is a truth to strive for.  And that is what I think undermines public discourse and therefore democracy. 

>> ANRIETTE ESTERHUYSEN: Thanks, Jeanette.  Clara I can see you also want to react.  Go ahead. 

>> Clara Iglesias Keller: Thank you, Anriette.  My kid is waking up.

>> ANRIETTE ESTERHUYSEN: Tell them to make you coffee. 

>> Clara Iglesias Keller: Unfortunately not possible yet.  I agree there is conceptual inconsistency.  In the media and communication disciplines, disinformation is one of the many phenomena that come along with misinformation, propaganda, fake news.  I like distinguishing these things by intention.  It makes all the difference for the law.  We often run into social science or communication scholars who say you can't tell much about intent, when in fact intent holds up a big chunk of legal relevance, right?  What I want to say is, beyond being a communication practice or a sort of online harm ‑‑ which is another way disinformation is often referred to ‑‑ I think we need more definitions, or more efforts to conceptualize disinformation within political theory and political practice.  Right?  I think it functions a lot as a form of political intervention that takes shape in different contexts, as Nighat Dad was saying as well.  I come from Brazil.  This type of intervention serves as a means to show dissatisfaction, as somebody pointed out at the beginning of our panel.  Or to directly attack democratic institutions, in particular the electoral system.  And high courts.

So I think we do have this varied but not complete enough conceptual framework.

I will say one very quick last thing.  I don't think there is space in statutory regulation for the concept of disinformation. 

>> ANRIETTE ESTERHUYSEN: I will ask you about regulation next.  Bill, you want to react?  Is it about the opening segment or do you want to talk about regulation? 

>> William Drake: I want to surprise Jeanette by agreeing with her strongly.  We have argued about this for a while.  At the outset I was in the middle; it depends on context and so on.  There are lots of people who say they believe in disinformation because of tribal loyalty.  No question about that in the American case.  Indeed, one thing that has really happened ‑‑ and this goes also to not believing there is a boundary between truth and fiction ‑‑ is that a lot of people have built their identity around giving the finger to the other side.  Right?  You hear it when they interview people at Trump rallies who say they believe this stuff.  Somebody asks them, but you saw this.  Yeah, well, whatever, screw that.  The libs don't want us to think this.  It is all about owning the libs, owning the other side.  Giving the finger to the people you don't identify with.  It is like a pretense in a way.

It does mean it is harder to disentangle.  That doesn't mean it is not important.  It means the question of whether it is directly causal is complex.

>> ANRIETTE ESTERHUYSEN: I want to ask everyone in the room.  Is there anyone here that lives in a context where disinformation is not particularly prominent or influential?  Anyone?  Just come and tell us about it.  Come to the mic and tell us where you are from and why you think that's the case.  (Off mic)

>> ATTENDEE: Hello, everyone.  Well, I'm from Switzerland.  I wouldn't say that it is not a problem whatsoever, but I also wouldn't say that it has the ability to sway entire elections.  We have a multiparty system.  People have voted for the same parties for years.  Not much has happened in that respect.  We had the same challenges with COVID as everyone else.  But I think I would be exaggerating massively to say this is the topic for us to focus on. 

>> ANRIETTE ESTERHUYSEN: It is really good to get a Global North perspective.  I'm from South Africa, and I can say that disinformation is not a major problem in South Africa.  Believing the Government is a major problem.  There is often engagement in the media about whether information is false or accurate.  We have very well self‑regulated media that deals with disinformation.  And the public tends to participate in that.

So fact checking is something that happens on a daily basis very quickly.  A politician will make a statement on television one night.  The next day, the media will fact check it and the public does tend to believe the media.  What we find quite unique or interesting is whether it is Right Wing, center, or Left Wing media, there is a common commitment across that spectrum to accuracy, which is because the media self‑regulation works. 

So it is also not a major concern.  I want to move on to the next question, which is, in a way, the heart of why we convened the workshop.  By the way, the three of us argued a lot in the course of the workshop planning and design.  It is for us also not a clear issue.

The heart of this is the IGF.  And it is about Governance.  Do you think we can regulate disinformation effectively internationally?  We know there are a lot of national initiatives being put in place that are quite controversial.  What do you think?  Do we have the baseline?  Do we have a strong and clear baseline that existing international instruments can provide for the Governance of disinformation?  And if we move in this direction of international Governance of disinformation, what are the implications for access to information?  And freedom of expression?  David, can I ask you to start us on that?  I know this is something you have applied your thought, your expertise and your experience to. 

>> David Kaye: Yeah, thanks, Anriette.  So my view is that Global regulation of disinformation ‑‑ if we think of that as a concrete set of rules that will guide decision‑makers in every context, with global oversight ‑‑ is a chimera.  We're not going to achieve that, and it is not worth, in my view, even trying to achieve it.

What I do think is that Governments and platforms and media outlets need a common set of principles to guide how they think about this and react to it.

To my mind ‑‑ and this is no surprise to people who know my work ‑‑ the principles are rooted in human rights law.  There is very, very good reason for that, because we're talking about information, the sharing of information, the possibility of making it harder for individuals to find accurate information.  We need to have standards that are based in Article 19, on principles of legality, proportionality and legitimacy of the objective.

I will end here.  I think that there is a way forward in resourcing human rights mechanisms in particular.  By that I don't mean the Human Rights Council, but rather the Human Rights Committee, Regional Human Rights Courts and others.  Resourcing them.  Ensuring they have the tools to answer questions when individuals feel that their Government is interfering with the right of access to information, by either disinforming themselves or permitting disinformation in one way or another.  Those kinds of tools can be a mechanism for Global Governance, but not in the IGF sense of what Internet Governance looks like. 

>> ANRIETTE ESTERHUYSEN: Thanks, David.  Clara? 

>> Clara Iglesias Keller: Yeah, I really enjoyed thinking about this question.  It was really thought‑provoking for me.  I am not sure about the extent to which Global Governance solutions can help us.  I will try to summarize this briefly in two points. 

First, because I feel this urge to look at disinformation's role in political disputes, it becomes clear to me that countering or mitigating it is not only about a communication practice in itself.  If it is being used by certain political and economic interests that have been shaping our societies for so long, then mitigating disinformation is also about confronting ourselves with these broader issues.  To stick with the Brazilian case, I think about the ultimate convertibility of economic power and political power, and that includes a traditionally concentrated and unregulated media landscape, especially when compared with other western democracies.  So I think that is certainly something that should be part of the conversation. 

But in the interest of getting more granular, it is fair to say we need action that targets disinformation specifically as well.  And here, I'm afraid, I'm also skeptical, because Global solutions mostly presume consensus‑based Governance structures.

As David put very well, at least Global Governance in the IGF sense of things does not include enforcement authority, right?  So even though human rights standards offer very interesting guidelines, I'm afraid that confronting current digital business models and data usages ‑‑ to mention a few things that should be on the regulatory Agenda ‑‑ will need more than that. 

>> ANRIETTE ESTERHUYSEN: Thanks.  Nighat Dad, do you feel we have the instruments needed?  How do you feel about the idea of Global Governance of disinformation? 

>> Nighat Dad: Yeah, I mean ... everything that has been said by David and Clara, I agree with that.  I think we already have Global Governance instruments with us in terms of several principles or conversations that have taken place, Resolutions, all of that. 

But I think we also need to see what actors have learned from those instruments and tools that have been developed globally.  I don't think powerful actors have learned.  If you look at national policies, regulations, and laws that are being drafted and introduced, not only in the Global majority but also in the Global North, those policies reflect an appetite for suppressing freedom of expression, right?  I don't think they are able to identify how certain content can actually cause real world harm.

And as David basically said, state actors ‑‑ not only companies, but state actors ‑‑ have this obligation under human rights standards to provide accurate information and prevent misinformation.  I mean, we have the Rabat Plan of Action.  We have so many instruments out there.  I think we really need to see how the different components that are already out there can complement each other and not work in silos.

We'll be talking about the Governance mechanisms ‑‑ you have already mentioned that there are certain ones, you know, the Oversight Board or other things that are already out there.  How are those performing?  Are we looking into their performance?  What are they adding to the existing ecosystem? 

So I think we already have so much.  I think we just need to know how to use that. 

>> ANRIETTE ESTERHUYSEN: Thanks for that Nighat Dad.  Jeanette? 

>> Jeanette Hofmann: The whole question reminds me of the early days of Internet Governance, when it was always clear that protocols and standards for the infrastructure had to be agreed upon to have a global network, but that the upper layers, particularly content, should not be governed at a Global level.

Having said that, there is one aspect I find interesting in this context.  This is the Digital Services Act that the European Union just agreed upon, and that will take effect early next year.  It has one aspect that, at least from a German perspective, is really interesting.  It concerns the scope of human rights.

At least in Germany, human rights regulate the relationship between citizens ‑‑ say, people ‑‑ and the Government.  But the DSA mentions at several points that human rights should also guide the behavior of platforms.  Meaning the scope of human rights begins to integrate Private Sector action as well, because it affects to such an extent our conditions and possibilities to exercise human rights.  So this is an interesting development, and we can think about extending that to other jurisdictions or world Regions.

Actually, I would like to know what our other panelists think about that. 

>> ANRIETTE ESTERHUYSEN: Thanks.  Bill, do you want to react to that?  Aaron has now joined us and we'll hear from him next.

>> William Drake: I don't want to react to that, I want to say something else.

>> ANRIETTE ESTERHUYSEN: Go ahead, you can. 

>> William Drake: The way the question is posed: will it be effective?  Probably not.  We have to build up the operative infrastructure to continually challenge disinformation.  But to believe it will be effectively regulated at a Global level is a little bit farfetched.  Still, we have to try.

That said, a couple of things are worth highlighting, because this is the IGF and we're talking about Global Governance of disinformation.  One is the U.N. discussions around cybercrime and cybersecurity, where you see a lot of language around disinformation.  It shows the difficulty in doing this at a multilateral level.  In the cybercrime Treaty negotiations, China proposed language saying all Governments should adopt laws calling the spread of disinformation a criminal offense.  And they described it as anything that makes available false information that can result in serious social disorder.

But again, what can result in serious social disorder is in the eyes of the beholder.

Then there are the UNESCO guidelines for digital platforms, which they hope to finalize.  Those have language that is germane to disinformation as well.  With the model of adopting guidelines, or suggesting guidelines to countries, you know, there is the possibility that some countries will implement those guidelines in ways that are restrictive of freedom of expression and will claim international legitimacy in doing so.  The question of guidelines versus Treaties is an issue.

The last thing I will mention: the U.N. Secretary‑General is proposing a Code of Conduct for information integrity as part of the Global Digital Compact discussions.  He wants to have it discussed at the Summit of the Future.  If you have seen the document, he proposes a Global set of guidelines, drawing on the UNESCO experience, that is based on nine principles.  Many of which are pretty much focused on platforms ‑‑ how platforms behave and how stakeholders behave. 

This is ‑‑ it is easy to say, well platforms have to adopt rules about transparency, discouraging information to allow scholars to access the data, so on.  It is harder to say States should not disseminate this kind of information in the first place.  Or all stakeholders must abide by good taste and common sense.  Those are a little bit hard to achieve, particularly through guidelines.  That is what the Secretary‑General is doing.  And he's actually calling also for the establishment of a quote dedicated capacity in the U.N. Secretariat to monitor and advance all of this.  Which is an interesting thing.  There is a lot of discussion about whether new organizational structures would be built through the Global Digital Compact particularly in New York and this is one where I think they might get political traction in saying there ought to be a centralized mechanism for at least tracking progress in addressing the issues, tracking progress in adopting complementary kinds of guidelines.  So on.  We'll see. 

A lot going on at the international level.  It is worth talking about that. 

>> ANRIETTE ESTERHUYSEN: Thanks.  Thanks.  Aaron, Aaron Maniam, you are with us now.  Please introduce yourself and tell us what you think about this Global Governance response to disinformation. 

>> Aaron Maniam: Thanks, Anriette.  Apologies to everyone for joining late.  I had technical challenges, which I suppose is ironic given this is a panel at the IGF and I am calling in from Oxford, the Internet capital of the world when it comes to connectivity.  I'm glad I'm here now.  I love this question on Global Governance.  It gets ‑‑ 

>> ANRIETTE ESTERHUYSEN: Aaron, tell us a little bit more.  I know you're an academic now.  We're particularly interested on your views based on your perspective as being within Government. 

>> Aaron Maniam: Until recently I was a policymaker in Singapore, working on how digital technology sits in society, what sorts of international partnerships Governments need to embark on, and the kinds of regulation to pursue in the economic sphere and on content.  So I think this is a real commingling of the challenges any Government faces.

On the Global front I want to make four points, all germane to the discussion.  First, we have to figure out what we mean by Governance.  At the most basic level, guidelines are set out and little else in terms of enforcement or monitoring.  At the maximum, it is like what ICAO has managed to achieve, because we know there are clear risks and safety issues involved.  And at the moment, you know, we see emerging examples like those Bill mentioned.  The open‑ended Working Group on cyber at the U.N. is trying to achieve a greater set of consensus on some of the issues.  And it is really not clear where we will land, in terms of de minimis or maximum models of Global Governance.

The second point: I think it is important to differentiate between basic standards and the more detailed sets of issues to cover in a set of Global Governance regulations.  Jeanette referred to this.  Examples like the DSA are important.  Singapore has a set of online regulations.  The UK put one out as well. 

Beyond the basics, the guidelines that Bill mentioned, or further regulatory principles put in place, must be interoperable in different countries.  They don't need to be identical.  Interoperability allows the systems to talk in a more coherent way.

Two last points that are important and that we haven't quite named in this discussion.  The first is the major challenge that in some cases the Government is the source of the misinformation or disinformation, rather than the entity to regulate it.  That makes it more difficult to work with others, even where they act in misinformation‑minimizing sorts of ways.

And we need to be able to differentiate the two and not let the first group of Governments actually end up holding us back on any coordination we need to achieve.  This will be hard, of course.  The tech itself is dynamic.  In a sense we're trying to play whack‑a‑mole ‑‑ I don't know if they call it the same thing everywhere ‑‑ a new problem comes up every week and you find a new way to hit it down.  The regulation has to keep evolving.

That means we need the skills within Government to keep the adaptation going, and we need the ability to continually update our legislation if we want to do the work well.  It is not impossible.  It can be done.  But it means parliamentary capacity will be stretched, not just executive capacity, because we have to go back, update, refine, make laws more agile and adaptive.  Not easy, but we have to deal with it to realize the Global Governance we want.

>> ANRIETTE ESTERHUYSEN: Thanks, Aaron.  Maybe that interoperability point is one to hold on to.  Before I open to the audience, do any speakers want to react or comment on one another's inputs? 

I want to open it up.  Sorry, who is that?  Nighat Dad.  Go for it. 

>> Nighat Dad: Anriette, before we move ahead, there is one part that is missing in this Global Governance conversation.  I raise this in several panels and it kind of gets lost.  It is basically this: how do we hold accountable the Governments who actually use these guidelines for their own interest and benefit, and make laws and regulations through which they can control content in their own sovereign jurisdictions?  They are not held accountable for the wrongdoings that they're doing.  I feel like, yes, regulatory mechanisms are good.  But how do we hold all the state actors accountable?  I have had conversations with many Global North actors who are like, but those Governments will do this anyway.  Then where should we all go?  Do we leave them behind?  Do we leave those users behind?  How do we take them along with us?  This is the question I always raise.  It frustrates me a lot.

>> ANRIETTE ESTERHUYSEN: Thanks, Nighat Dad.  I share that.  Bill, you want to react quickly to this? 

>> William Drake: I want to ask the other panelists for their perspective.  I want to know if anybody, including those online, has an opinion on the Secretary‑General's proposals, since it is an important thing.  The Secretary‑General of the United Nations is proposing a Code of Conduct on information integrity for digital platforms.  That would seem to be an instance of an attempt at Global Internet Governance ‑‑ governing information over the network in a way that fits within the definition of Internet Governance.  I would like to know how people view this initiative, what they think of it in terms of its potential impact, how well it is crafted, et cetera. 

>> ANRIETTE ESTERHUYSEN: Do you want to comment on that?  Any one of you?  Just jump in, if you do. 

>> David Kaye: I will say something very briefly about it.  I think it is an important topic.  I will say two things.  One, I think the process has been interesting, but I'm not sure that Civil Society has played as active a role in helping shape the document as should be the case.

And so there's a process issue at the outset.  And as we look forward to the actual negotiation of an adopted or to be adopted text, if Civil Society is not in the room, if Civil Society is not in the places where there is actual drafting and adoption taking place, I think the legitimacy of the outcome will be suspect.  So that is a process response.  A substantive response is that I'm concerned that it focuses not exclusively but in perhaps an overreliant way on platforms. 

There is absolutely a major role for platforms in the problem of disinformation.  But any process that ‑‑ as we heard from the situations in the Philippines or the United States ‑‑ excludes a call for Governments themselves to behave better, to support and resource public service media and public broadcasting ‑‑ any avoidance of that will, I think, make the process at its core not useless, but will really detract from its value. 

>> ANRIETTE ESTERHUYSEN: Thanks very much, David.  I will move to the audience now.  I am going to open with a question that I'm going to ask the panel to respond to as well.  Which is: if we are going to develop Governance initiatives to respond to disinformation, how do we do it?  David just pointed out the risks of a fairly top‑down process, such as the one coming from the Secretary‑General's office. 

So how do you consult?  How do you make decisions about Governance responses to disinformation in a way that will be effective? 

So I'm opening this question to people in the room, and also: do you have any questions for the panelists, or contributions?  Please stand up, take the mic.  And introduce yourself first.  You can go ahead.

>> ATTENDEE: Thank you.  My name is Faresa, I'm from Digital Medusa. 

First of all, congratulations on a nuanced and evidence‑based discussion on disinformation.  This has been lacking from IGF this year.  And I think you remedied that with this panel.

Disinformation Governance can become indeed an Internet Governance issue if we rush towards solutions that could affect Internet infrastructure. 

By talking without evidence about disinformation and how harmful it is, and rushing to govern and regulate it, we are going to see effects on connectivity.  It has actually happened: when Russia invaded Ukraine, Europe decided to do IP blocking of Russian websites that were spreading disinformation. 

So it is indeed an Internet Governance issue.  At the IGF we need to have an absolutely more nuanced approach to the discussion and not rush to any conclusion.  As Nighat Dad said, Governments have been coming up with solutions anyway.  So I think that's one point: it can be an Internet Governance issue.

And we need to monitor that.  And the other is that disinformation is also something that in Declarations like G7, they mention it, they commit to open Global Internet.  But they also talk about tackling disinformation.  Which like what are solutions to fight with disinformation should not affect connectivity and Internet infrastructure.  Thank you. 

>> ANRIETTE ESTERHUYSEN: Thank you.  Wolfgang.

>> ATTENDEE: I'm Wolfgang, I'm a retired Professor.  I can only support what Faresa said.  In this debate we always risk undermining the right to freedom of expression as laid out in the Universal Declaration of Human Rights.  As a member of the old generation, I'm asking myself what is really new here.  I went through the Cold War.  The Cold War was an information war.  And it goes back 500 years, to when Gutenberg invented the printing press.  People were excited.  And somebody used it to write pamphlets.  And the pope said this is a misuse of technology.  Then the struggle started, you know: who is right?  Who is wrong?  Were the critics of the Catholic Church right, the pope wrong?  And we got the index of censorship.  It is very close: if we continue this debate, you end up with censorship.

And look at the tragedy of Germany in the 1930s.  The Minister for Propaganda in the Third Reich had radio as a weapon.  It was like a machine gun.  And people loved him.

Millions of Germans, you know, believed in what this crowd of criminals said to the public.  And the tragedy is, we know all this: simple answers to complex questions are given by either idiots or liars.

But the problem is that a bunch of people love idiots and they love liars.  So the question is what you can do about it. 


(Applause)

You have to invest more in education, creating awareness to enable people to understand the context.  I think context is one way forward.  If you have bad information, the better thing is to have better information.  Not to cancel it or censor it or something like that.  The Digital Services Act was mentioned; that is one step in the right direction.  We have the Facebook Oversight Board, you know ... it is an effort, but I'm also very critical.

I think we have a problem, but I don't have a solution.  I think in 20 years we will still be discussing what the solution for this problem could be.  So we have to go in very small steps to identify where we can minimize the negative side effects of this disinformation. 

You will not be able to remove it.  One idea I have is this: in the ICANN context we have the dispute mechanism for domain names.  The UDRP system broadly allows going through cases on a case‑by‑case basis, taking into account the regional context and individual constellation.  80%, 90% of content‑related conflicts on the Internet are relevant to the Region and cultural context of the parties they involve.  So why not think about a distributed system that goes through certain issues on a case‑by‑case basis?  This could create a body of understanding of what is bad and what is good.

That is where voluntary guidelines and Best Practices or something like that could come in.

So I fully agree with David, who said, you know, top‑down regulation will never work.  We have Article 19.  We have Article 19(3), and we know how national security, public order, public health, and morals as possible restrictions are misused by Governments.  This is not a Governance issue for the IGF.  This is a national issue for national policymaking.  We should debate this in the IGF but should not expect that we will find a solution. 

>> ANRIETTE ESTERHUYSEN: Thanks, Wolfgang.  I want to make one point.  I think we have to recognize that the weaponizing of disinformation is very different now from what it was when you had to do it via radio or via broadcasting platforms.  You needed to have some kind of political power or economic power. 

On online platforms, that ability to weaponize disinformation is much more distributed.  I think that is a challenge we cannot ignore.  Yes, there are similarities, but there are also differences. 

I see you want the floor.  I also want to know if there is anyone from the Christchurch Call in the room to share how they did consultation.  That is important to what we are trying to do in this workshop: getting towards Governance initiatives.  Go ahead.

>> ATTENDEE: I have two comments.  The first is actually a question about the U.N. Secretary‑General: can what he is saying really get consensus, or any respect, from the different states?  Because, well, how can he enact that?  And will the states eventually agree upon it?  That is difficult. 

And the second one is a question, because we are talking about disinformation.  Some of us are living in the luxury of a democratic system, which allows us to debate even the disinformation. 

Let me ask a simple question.  What if most disinformation ‑‑ 80% of it ‑‑ is distributed by the Government?  In that kind of environment, can you tell me how people can react to that?  Because the disinformation is not from the people.  It is from the Government.  Or from politicians.  What are we going to do? 

>> ANRIETTE ESTERHUYSEN: Thank you.  Nighat Dad made that point.  Anyone else in the audience before we go back to the panel?  The Christchurch Call ‑‑ Paul, do you want to say something about it?  I know it is a slightly different topic, but in terms of the consultation process, can you say something about how you approached finding solutions through consultation?  Apologies for putting you on the spot like this.

>> ATTENDEE: It is okay.  I'm often put on the spot.  I'm just used to taller microphones.  Thank you, Anriette.  I'm trying to think through the different layers.  The Christchurch Call has been through a journey.  For those that don't know about it, it was started in 2019.

>> ANRIETTE ESTERHUYSEN: Introduce yourself. 

>> ATTENDEE: I will go back.  My name is Paul, I'm the Prime Minister's representative on cyber and digital from New Zealand.  I lead the team in the New Zealand‑French Secretariat leading the Christchurch Call.

For those who aren't familiar with the Call, it was started after the two terrorist attacks in March 2019, when the murder of 51 worshipers was amplified around the world and found its way into many of your social media feeds and inboxes.

Rather than let that stand, we took a deliberate approach to working with companies, Governments and Civil Society, bringing them into the process of trying to build solutions to prevent that from happening again. 

As we did that, we negotiated a set of 25 commitments.  That was done in rather a hurry, over eight weeks, but actually those commitments proved very durable as we went through the implementation.  We met with Civil Society groups ‑‑ at that point not in a full multistakeholder partnership, because we wanted to capture the moment.  We fed that in, put a placeholder in the Christchurch Call context to get there thereafter, and got the commitments landed at a meeting in Paris in May 2019.

Thereafter, we worked on the process of implementing specific commitments around things like live streaming, around the ability to detect and deal with terrorist and violent extremist content, and increasingly, recently, focusing in on issues like algorithmic pathways and their link to violence.  There is probably a connection to misinformation and disinformation and their distribution.  There is certainly a good body of evidence showing that disinformation and misinformation can lead to radicalization to violence.  The approach around consultation was, over the period since 2019, to build up and establish a Civil Society advisory network that works with the Christchurch Call partners, and over time to broaden that out to develop what we call the Christchurch Call community, in which all of us involved, whether through the advisory network or as a partner, can contribute to the discussion of all of the different pieces of policy and problem solving the Christchurch Call is working on.

Over the course of the year we work on those through the work streams that the Christchurch Call leaders ask us to work on.  We hold a Summit where heads of Government, Civil Society, and organizations consider the outputs from that and give direction for the work to take forward in the subsequent year.

It is a different subject matter set from disinformation and misinformation.  We have been careful to distinguish the subject areas.  What I would say is that keeping the scope really tight has been one of the things that has enabled the Christchurch Call process.  It is not about child sexual abuse or other things.  It is focused on one specific subject area.  And I think having a Secretariat that is at times able to build trust across participants and to work quietly on issues that can be contentious is an important lesson we have taken from this.

The most important analogue, I think, that can be drawn and brought over into the area of disinformation and misinformation is the strength of our fully multistakeholder model.  We have looked at many multistakeholder configurations: tech and Civil Society, tech and Governments.  Having the full mix all in the room together is incredibly difficult.  You get aspects of the three‑body problem working at times, and that can be complicated to deal with.

It does mean there is a process of building trust, of becoming, as we put it, comfortable having uncomfortable discussions.  I think that is one of the most important things to learn.  Over time we have had to systematize it more, and as the community grows it is tough to maintain that trust across it.  So that is the multistakeholder construct.  That is probably more information than you needed.  Thank you.

>> ANRIETTE ESTERHUYSEN: Thank you for that.  It is useful.  I think it is about the depth of the process, the fact that it takes time: it takes time to establish a process that is going to help you respond effectively to a problem.  And then I think the focus.  You know, we talk about disinformation, but maybe disinformation about sexual and reproductive health is different from disinformation about elections.

You know, so now to go back to the panel.  Bill, I will start with you: this question about how to approach the consultation and decision‑making process to respond to this from a Governance perspective.

>> William Drake: You keep asking me questions I don't want to answer. 

>> ANRIETTE ESTERHUYSEN: Answer the question that you want to answer.

>> William Drake: I want to reply to David and to points made earlier.  The previous speaker was talking about Governments being the source.  This goes to the initiative: in December 2021 the General Assembly had a Resolution on countering disinformation which recalled, reaffirmed, recognized, highlighted, expressed concern for, et cetera, et cetera, and asked the Secretary‑General to do a report.  He did the report the next year, in August 2022, and one of the big conclusions was that States bear the primary responsibility to counter disinformation by fulfilling the rights to freedom of expression, privacy, and public participation, and it called for a multifaceted approach anchored in the protection of and respect for human rights.  What is he doing in response to that?  Guidelines for platforms.  Because the reality is it is pretty hard for the United Nations to construct any kind of real process around Governments being the source of much of the disinformation that matters, because Governments reserve the right under the U.N. Charter to do whatever the hell they want, unconstrained.  Instead we focus on the platforms.  I agree the platforms have issues.  I agree it is useful to try to encourage greater transparency, et cetera, et cetera, on the platforms.  But telling Facebook that it has to do something because Russia used Facebook to disseminate disinformation does not address the fact that Russia is disseminating disinformation.

And it is not just, you know, Facebook, Twitter, and ... what am I trying to say ... YouTube.

There are so many different sources of disinformation, the Dark Web and so on.  It is an incredibly robust environment.  It is a commercial marketplace.  You can go online and hire people to generate disinformation using bots and so forth.  You can adopt rules for a few of the platforms, but it won't solve a thing.

The other point is that David said we need greater Civil Society participation in the Secretary‑General's process.  This is why I brought it up.  Here we are at the IGF, and we are having almost no discussion around the various things that are being proposed through the Secretary‑General under the Global Digital Compact.

I have been to many sessions.  We're not doing it.  We are not taking advantage of the opportunity to say that we the stakeholders, not just Civil Society but all stakeholders, demand the right to be there and weigh in on these processes.  Nobody knows how it is being done or where it is being done.  It is outreach to a small set of players.  Something has to be done.

>> ANRIETTE ESTERHUYSEN: I haven't heard anyone use the concept of a disinformation panic.  There is a bit of a sense that multistakeholder and self‑regulatory processes are not dealing with it, and therefore the "parents" need to step in and we need Governmental regulation.  In my view, it is not that there isn't a need for a regulatory response, but it feels as if we are leaping to that response before we have actually explored more bottom‑up ways.  Jeanette, what is your view?

>> Jeanette Hofmann: When it comes to what we should do: the Digital Services Act, Article 40, provides for something that is vital for me as a researcher, namely researchers' access to data from platforms, in the area of what is called systemic risks caused by platforms.

I'm not so happy about the restriction that we have to mobilize systemic risk as a concept to get access.  But we will get access to data that helps us understand, in a better, more evidence‑based way, what disinformation does and how many people it actually reaches.  We don't know much of that, right?  We have only vague data from the U.S. and Europe suggesting that somewhere between 0.4% and 5% of the users of platforms actually get to see disinformation.  There are better times lying ahead of us with that access.

And I hope that the IGF could be a mechanism where we can provide Best Practices on how to actually implement this Article 40.  And I hope smaller jurisdictions will pick it up and say: you are giving people access in Europe, why don't you give us access in other countries?  The IGF could be a way to discuss this.

>> ANRIETTE ESTERHUYSEN: Yes, it is important to note that this access to data is not available to researchers in the Philippines or in Brazil, or in other places where there is a need to look at the impact and at how disinformation functions.

Aaron, what is your view on this, on how we develop the responses?  And particularly I would like you to respond on the role of Government, because we are in a moment where Governments are taking this seriously, sometimes with good intentions, not always with bad ones.

>> Aaron Maniam: Thanks for that.  I broadly agree with the thrust of the discussion, that the work needs to be polycentric: not just multiple people involved, but leadership and decision‑making capacity in multiple centers of power and influence.

That means that Governments move from just having an authority‑based role to having a convening function.  It means Governments bring together the right players, and Governments have to acquire the skills, in different contexts, to do that themselves.

In Singapore we're doing deliberative work on the bottom‑up democracy we are calling for.  The most recent incarnation looks to build a future identity for the country.  During COVID there was a process of thinking about recovering and emerging stronger together as well.  It is something we have been experimenting with for a good part of the past 20 years: how we build this bottom‑up set of priorities and issues.

As kind of a side note to that, but a very important side note: where Government itself is thinking about its role in relation to the Private Sector and Civil Society, it is really important to harness the full range of roles.  Not just the regulatory and law‑making functions that we have, but also the possibility of Governments in community building, in space building, in urban planning.  A lot of the solutions might lie in there.  One thing from my last job, for instance: the three agencies that worked together with the ministry in operationalizing strategy were the Infocomm Media Development Authority, the Cyber Security Agency and the National Library Board.  We have a Board that looks after all of the national libraries.  It is interesting.  When you think about it, we have talked about regulation and the dark aspects of security and how to respond to that, but we haven't talked in detail about the literacy our citizens need.  If you ask me, schools and libraries are as important players in this process as the more regulatory counterparts they might have in Government.  This isn't always obvious to us.  This is where the patience comes in.  It takes time to build up literacy, and that is where we need to start.  I look at my one‑year‑old nephew, who knows how to swipe on an iPad before he can talk.  His literacies will be different from ours over time.  We need to bring schools and families into these issues; that is where the core literacies and moral sensibilities will be learned.

>> ANRIETTE ESTERHUYSEN: We have three minutes left.  And I will come to the panel for their takeaways.  Clara, keep it brief, please.

>> Clara Iglesias Keller: I will second Jeanette on the call for more empirical evidence, especially outside of the Global North.  I will stick to Brazil to tell you that, as we all know, Latin American countries have a different sort of relationship with messaging apps, for instance.  It is aggravated by zero‑rating policies, where for a huge part of users the Internet is WhatsApp, Facebook and Instagram, and we need more data to understand how all the political and social contingencies we can see play into the way disinformation behaves there.

I think we need more institutional innovation, in the sense that among all of the tasks in fighting disinformation there is a role for Government, but there is this one task of disputing the truth, of disputing content, and that should definitely sit with Civil Society.  It should sit with different agents, and I think we need the institutional apparatus to allow for that to happen.

>> ANRIETTE ESTERHUYSEN: Clara has given us both what she wants to see and what she doesn't want to see.  Nighat Dad, what about you?

>> Nighat Dad: Anriette, to go back to what you said: sometimes we want to jump to other solutions.  I feel that with the ones we already have, we need to look into the nuance that those solutions have established, like it or not.  All solutions have pros and cons.  But to your point about consultation, one thing I have learned sitting on the Board is that we talk about context a lot, yet we really don't know how to extract that context when we are deciding about something.  At the Board, what we have done is speak to Civil Society from the particular Region where we have selected a case from.  I think that is the kind of process we need.  We need more Global oversight; we don't need only one body.

Because if regulation is doing something from the state perspective, we also need Boards to hold the companies accountable and to do their work.  Our transparency reports are there, our decisions are there.  I think those give really good data points to hold the platforms more accountable.

>> ANRIETTE ESTERHUYSEN: Thanks, Nighat Dad.  David? 

>> David Kaye: I really appreciate that Paul Ash made an intervention.  I think his leadership and New Zealand's leadership show the role of multistakeholder approaches even in the wake of trauma.  Let's remember, the Christchurch Call could have gone the other direction.  It was a moment of real trauma, and people have a kind of natural response, as we see now in Israel and Palestine, a natural response to adopt ideas that are not rooted in human rights.  The Christchurch Call, under Paul's stewardship, had human rights at its core.  I think we need to find ways to ensure that that remains at the core of Global Governance.  The IGF has not always been a place where human rights and access to information are front and center.

But it can be.  I think there are a lot of people in this room or on this Zoom who believe in that.  And there are models that we can draw from as we move it forward.

>> ANRIETTE ESTERHUYSEN: Right, thanks, David.  Bill and Jeanette, you will have the last word.  I want you to say the one thing you would like to see taking this forward and the one thing you don't want to see.  Be brief.

>> Jeanette Hofmann: One thing each; I will start with the negative.  The one thing I don't want to see is Governments using this Global concern about disinformation as an excuse for regulating their people's speaking up in public, at a time when for the first time they can actually speak up.  It is so important to support people in giving their opinions and exchanging their views, and not to think of it primarily in terms of regulating.  That is the negative.

The positive: I have always been an Internet researcher, with the focus on the Internet.  Over the last months I have begun to grasp the importance of high‑quality journalism.  I want everybody to have the right to claim that the earth is flat, as long as there is good journalism that depicts the globe, right?  That shows and talks about it.  Then everybody can disseminate crap, because it doesn't matter; it doesn't harm society as such.  We need good journalism, and business models that are robust for the future, because the young generation doesn't subscribe to newspapers anymore but uses social networks.  We need to prepare for a time when the young generation doesn't pay anymore but still depends on high‑quality journalism.

>> ANRIETTE ESTERHUYSEN: Thanks.  Nighat Dad thanks you for saying that.  Bill, you have the last word before me.

>> William Drake: I would like the European Union to fine Elon Musk substantially.  They have the capacity under the Digital Services Act and their guidelines.  He has told them, screw you, I don't care what you think, and dropped out.  Even though I think Government is the main focus, we still have to do stuff on the platforms, and making it hurt financially, especially vis‑à‑vis the advertisers, is a good way to start.

>> ANRIETTE ESTERHUYSEN: Thanks very much.  I'm sorry we went over time.  I'm sorry we didn't have time to come back to the participants again.  Thank you to the panelists, participants, and tech team.  And enjoy the rest of your IGF.  Let's continue using the IGF to get to the problems and unpack the misinformation about misinformation.  Thanks, everyone. 

(Applause)