The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> MAN HEI SIU: Okay. So I guess we'll start now.
Sorry. Sorry, sorry. There's a bit of a technical problem.
Shadrach, can I ask, why am I hearing my voice twice?
>> You are hearing your voice?
>> MAN HEI SIU: Yes, I'm hearing my voice repeat.
>> It might be that maybe ‑‑ do you have two devices connected?
>> ALICE ECHTERMANN: It can be you don't have everybody muted. Maybe somebody else has you on speaker.
>> MAN HEI SIU: Maybe. Maybe.
Could we mute everyone then? Shadrach, can you help? Thank you.
Okay. I'm still hearing my voice repeat. I'm very sorry.
>> MARTHA MAI HATCH: We should explain to our panelists: basically, we are just the organizers. We are not the host; the UN is the host now. So I feel like these are technicalities that the UN should do something about.
>> Please, is anyone else hearing an echo besides Connie?
Please, can someone try to speak so I can see if anyone else is having the same issue?
>> ALICE ECHTERMANN: Yes, I can test. I don't hear an echo.
>> JENS KAESSNER: Test. I don't hear an echo either.
>> Thank you. It might be from your end, Connie.
>> CHARLES MOK: I think I'm okay.
>> Okay. So can you try again and see if ‑‑
>> MAN HEI SIU: For me, there's still an echo.
>> CARINA BIRARDA: Yes, for me too.
>> Okay. So then we'll have to ‑‑
>> I think you have an issue on your end.
>> MAN HEI SIU: I think Ms. Carina Birarda said she had an echo as well. I guess I can try to endure the echo, because I don't want to delay things too long, even though I'm hearing it again right now.
>> CARINA BIRARDA: Now it's okay.
>> MAN HEI SIU: Okay. Okay.
Okay. Very sorry for the delay. Thank you, Leonie, for sharing the PowerPoint. Welcome, everyone, to Workshop 97, Fact Checking: A Realm of the Multi-stakeholder Model.
It was in a way unexpected for the three of us, with our various backgrounds, to come together and organize this workshop. It all started when we began to see the rising amount of false information being spread online, and it got even more severe when the pandemic began, creating an infodemic. We wanted to explore disinformation and misinformation with the amazing panelists we have with us today, and hopefully this will be a fruitful and fun session for everyone.
My name is Connie Siu. I'm a 19-year-old biomechanical engineering student from the Chinese University of Hong Kong. We have Martha Mai Hatch, with the City University of Hong Kong, and we have Leonie Kellerhof, with the London School of Economics. This is our first time organizing a workshop here at the UN IGF. I would like to give a huge thank you to the panelists who accepted the emails from three young people and agreed to participate in this workshop.
For some of you, it will be very, very early, especially Mr. Obed and Ms. Birarda. I will introduce the panelists we have with us today, but I will not attempt to give a detailed introduction or mention all of their backgrounds and achievements. You will get to hear more about them and their work throughout the session.
Leonie, could you move to the next slide, please.
Thank you. From the government sector, we have Mr. Charles Mok from Hong Kong, who is a Legislative Council member representing the Information Technology functional constituency. He has been serving the ICT industry for over 20 years and is a founding chairman of the Internet Society Hong Kong.
We also have Mr. Jens Kaessner from Switzerland who has been working on the digital transformation of Switzerland for almost 20 years, including coauthoring laws and policies covering net neutrality, Internet security, Spam, data protection and much more.
And between 2011 and 2017, he has also coauthored two reports of the Swiss government on social media.
And from the civil society sector, we have Ms. Alice Echtermann from Germany, a 28-year-old journalist working for Correctiv.Faktencheck who investigates the spread of rumors and false information. We also have Mr. Obed Sindy, the President of the Internet Society Haiti Chapter.
And last but definitely not least, from the technical sector we have Ms. Carina Birarda, who works in research and development of communication technologies and also in information security.
And a few words of housekeeping before we move on to the capacity sharing session, this workshop will last 90 minutes with around 40 minutes for the sharing session and around 30 minutes for the panel discussion. There will be a Q&A section. Please leave a question in the Q&A window instead of the chat box. Please do indicate who the question is for or whether the question is for all the panelists in general.
There are real-time English captions available by clicking on the closed captions button.
Martha will be helping me moderate and Leonie will be monitoring the chat, the Q&A box and live streams throughout the session.
And also, for the panelists, I will most likely be addressing questions to a certain panelist or two, but feel free to jump in at any time if you have anything to add or any further comments on the questions.
So without further ado, I will begin this workshop with the capacity sharing session, and we look forward to the questions that will be coming from the audience, from you all.
So each panelist will have around 5 to 10 minutes for sharing, and I'm sure that Martha, Leonie, myself, and everyone else are looking forward to everything you will be sharing with us right now.
So let's start it off according to the order of the panelists here on this PowerPoint, with Mr. Charles Mok first. So Mr. Charles Mok.
>> CHARLES MOK: Okay. I unmuted myself. Can you hear me?
>> MAN HEI SIU: Yes, loud and clear.
>> CHARLES MOK: Okay. All right. So I will try to keep myself within the time limit. First of all, I want to say that although I was introduced as, you know, somebody from the government sector, really I don't work for the government.
I don't work in the government. I am a legislator, and I even happen to be a legislator in the opposition. So actually, I have no power. I am not a part of the government, even though, you know, technically some might consider us that.
My background is mostly spending time in the industry. I started an Internet service provider a long time ago, probably 25 years ago, in Hong Kong, and since then I have also been working with civil society, you know, starting ISOC in Hong Kong and all of that.
Back to the topic about misinformation and fact checking. Well, you know, we have just had the great experience and lessons of the last couple of weeks with the US elections, and I think the misinformation being thrown around is still something that we can witness with our own eyes every day. And it will probably carry on for another couple of months.
People have been saying that a lot of disinformation and misinformation is having a great and negative impact on democracies, but on the other hand, even when we're not talking about the election, obviously, with COVID-19 and so on, there's been a lot of disinformation being thrown around. This is not just an issue with the western world. Obviously, we each have our own problems. In Asia, we can talk on and on about the problems we have in Hong Kong, but obviously internationally we also see problems in the Philippines and Myanmar and so on. Lots of these issues with disinformation.
Now, I started thinking about this topic by thinking about how to help set up fact checkers and how we establish the credibility of the fact checkers. And then very soon, I found that the problem is not only with the resources and the difficulties of setting up an objective fact checking operation. The problem is sometimes, and a lot of times, especially when we look at what has been happening in the US recently: do people really want to see facts being checked? Do they really want to know the objective facts? Or are they just looking for alternative facts that they like to see?
So if that's true, then it doesn't really matter whether or not we have a great fact checking organization, because these days, you know, people look at the media and group them, saying this one is red, this one is blue, this one is on this side or that side, right? Maybe more than two sides, in some cases.
But increasingly, people even look at the fact checkers and say that, you know, they are not on my side, so they are not objective.
Actually, people won't even use the word "objective" anymore. Or maybe I'm generalizing; I'm not talking about all people. But more and more people, I have seen, and I worry about this, seem to no longer have a desire to look for objective facts. They only look for facts that support their own position.
I think that might ultimately be a very, very difficult situation for those who want to do well and do an objective and thorough job of fact checking.
Obviously, you know, in the last couple of years in Hong Kong or in Asia, I have seen increasingly many, many fact checking operations being set up, and these organizations, of course, are sometimes founded by activists and sometimes by journalists or academics. They each probably have their own shortcomings or advantages and so on. But I think some of the challenges they share would be limited resources and the difficulty of setting up an operation that is able to take care of the misinformation they have to deal with in a timely manner, because if you don't take care of it in a quick and timely manner, it's probably useless.
And in some cases, they also lack experience. Also, cultural differences and language differences are a huge problem that these organizations sometimes face, along with increasing political pressure.
I think in some governments whether you are media or a fact checker, you are actually facing a lot of potential or even possible persecution from the government or other forces in many of these Asian countries and increasingly probably in Hong Kong as well.
But, of course, I think one example in Asia where we have seen a sort of better example of doing fact checking might be Taiwan, because in the last election in Taiwan, in the very early part of this year, in January, the government was in fact very proactive in terms of being open and providing rapid responses to misinformation across different government agencies and departments. They have a sort of rule that within a certain time, I don't remember whether it's two hours or five hours, they have to give a response if they are flagged with misinformation that falls within their jurisdiction. They work very closely with some of the large social media platforms.
Like it or not, sometimes you have to work with them, because they are the purveyors of most of this disinformation, like Facebook and Twitter and so on, alongside multiple community fact checking organizations.
So I don't have time to go more deeply into that, but they also use online chatbots to receive reports and so on, with some AI elements added into it. So I think in some ways, they are a good example of what can be done by some governments, but unfortunately, what we see is that increasingly, there are other governments that are trying to use legislation to label something that they don't like as fake news.
Now, this has nothing to do with fact checking. This is a government coming in and saying this is not the news that I want to see, so it is fake news. You know, like what Donald Trump is saying right now: CNN is fake news. I'm not saying that they are or they are not, but for a government to say that without objective proof is a big problem.
And anyway, governments in Asia, such as Singapore, have passed laws to combat fake news, and increasingly in Hong Kong, you know, our government and the pro-government legislators keep saying there's too much fake news, that anything that is against the government is fake news.
So this is the problem that we see also in many parts of Asia. I just want to close with some of these observations, and I don't want to take up too much time because we have a lot of great speakers coming up. So how do we play our part, as somebody who is on the Internet, as a user or in whatever role we are playing, to ensure trust, transparency, and accountability?
You know, are we ourselves looking for objectivity? I even have to ask myself, are we ourselves looking for real facts, or are we just looking for things that we like to hear? I think that's the biggest question that we have to ask of each and every Internet user. Thank you.
>> MAN HEI SIU: Thank you, Mr. Mok for your sharing. I'm sure those are some of the great points that we will be discussing later in our roundtable discussion, which is in about 30 minutes or so.
And so the next speaker that we have today is Ms. Alice Echtermann. The stage is yours. You can share your screen.
>> ALICE ECHTERMANN: Thank you for organizing this session. Yes, I will quickly share my screen. Just a second.
Okay. So can you see my screen?
Okay. Perfect.
So as already mentioned, I'm a journalist. So today, I'm speaking mainly from a journalistic perspective, and as a fact checker, of course, which is sometimes confusing because, you know, fact checking can be different things. We define ourselves as fact checkers that check things that others have said. Some journalistic outlets have fact checkers who check their own articles. We do external fact checking, if you want to put it like that.
Our organization is called Correctiv, and the subline in German here means "investigations for society." Correctiv is a nonprofit, and the fact checking team is a unit inside it.
So we fight misinformation and disinformation online. I will give you a brief overview, just about our work, our principles, yeah, how we do this.
So Correctiv as a whole has roughly 40 people at two locations; that's the whole organization, not just the fact checking team I'm speaking about now. As I said, it's nonprofit investigative journalism. We are funded by donors and have two locations: Berlin is the main editorial office, and the headquarters is in Essen. We also work with international journalists, mainly in Europe, and we have colleagues in Spain, Italy, and France.
And at Correctiv.Faktencheck, I'm currently one of the team leads. I joined Correctiv.Faktencheck last year in May, and before that, I had a classic journalistic career, working for a newspaper as an online editor.
So what are we doing at Correctiv.Faktencheck? We are working according to the code of principles of the International Fact-Checking Network, the IFCN, which is an institution of the Poynter Institute. I will come back later to what that means and what this code of principles is.
We investigate viral rumors that spread online, mainly, of course, on social media, like Facebook or Twitter, but also WhatsApp is playing an increasing role, and Telegram.
We are focusing on viral rumors. We think that if we focus on a topic that is not very broadly shared on social networks, then we will only make it bigger when we address it. So we just look at the big topics. These can be photos, videos, news articles, quotes of politicians; basically, everything that contains a factual claim. You can only fact check a factual claim. You cannot fact check opinion. You can also not fact check conspiracy theories, which means that if somebody says there are evil forces in the background trying to establish a new world order, you cannot fact check that. That's not possible.
And also, we cannot fact check future forecasts, for example. Obviously, we don't have a crystal ball to look into. So we publish our fact checking articles on our own website, which is Correctiv.org. Inside our articles we give ratings from correct to bogus, which means it's totally invented. This is the rating we give to the original content. We see, for example, a photo that was digitally edited with Photoshop, which is then, of course, totally false. But sometimes you also have claims or news articles that are a mixture. It's not always the case that everything is clearly false or correct; they mix actual facts with something that is not proven, or with conclusions that you cannot draw from, for example, statistics: statistics which are real, but which do not support the conclusion that people draw. And then we try to give these very nuanced ratings, just to be fair and open about which part is right and which is wrong.
We also make our ways of investigation and our sources transparent. This is very important for us, because we think that only if our readers can see what we did and which sources we used can they trust us and trust our rating when we say something is false.
The code of principles that I mentioned is a bit hard to explain if you have never heard about it before. So we have the International Fact-Checking Network with its fact checking organizations; currently, there are 83 signatories, and the code of principles means that there are some basic rules that all of these signatories commit to. There is a commitment to nonpartisanship and fairness, which means that you fact check all political sides. You don't only focus on what you would call the right-wing or the left-wing political side; you fact check everyone by the same rules.
There are standards on transparency of sources, meaning you say which sources you used. There is transparency of funding and organization: on our website, you will find all the information about how Correctiv is funded, the main donors, and the monthly and quarterly reports.
We have standards on transparency of methodology, which covers how we investigate, who we talk to, how we give our ratings, and how we explain the way we come to conclusions.
And also open and honest policy of corrections. So if we make mistakes ‑‑ of course everybody can make mistakes. Journalists do that like all humans. We have to correct ourselves and we have to make it open and transparent and say, okay, we made a mistake here.
The verification by the IFCN is renewed every year. The fact checkers are examined every year to see whether they still comply with these rules, and only then do you get this certificate.
This is important for us, just to say we have certain standards. We did not invent these standards, but globally, all fact checkers in this network apply these rules, so you can compare.
Also, this is linked to something called the third-party fact checking program of Facebook. We have been a partner of Facebook since 2017; Facebook started these kinds of partnerships in 2016. They partner with independent fact checkers like us worldwide who are verified signatories of the IFCN code of principles. These are, for example, AFP, Full Fact in the UK, Africa Check, Rappler in the Philippines. There are so many. And what Facebook's program encompasses is that we tag false information on Facebook and Instagram with warning labels. So if we check a Facebook post that contains a factual claim and we say this claim is factually false, then we can put a warning label on it. People then see, okay, this content is false or partly false.
Which also means Facebook reduces the reach of this content. The label gives a link to our website, where people can read what we investigated about this claim, read our sources, and make up their own minds. If they still want to share the content after that, they can do that, but we try to give context. We give them the information, so they cannot share something without knowing that they are sharing something false.
The false content is not deleted, which is very important. People often misunderstand that. It is not deleted by us, and not by Facebook because of the fact check.
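Purely as an illustration of the labeling flow described above, and not Facebook's actual systems or API, a minimal sketch of the idea might look like the following. The type names, rating values, and the down-ranking factor are hypothetical assumptions made for the example.

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    # Hypothetical rating scale, loosely mirroring the nuanced ratings described above.
    class Rating(Enum):
        CORRECT = "correct"
        UNPROVEN = "unproven"
        PARTLY_FALSE = "partly false"
        FALSE = "false"
        FABRICATED = "fabricated"

    @dataclass
    class FactCheck:
        claim: str         # the factual claim that was checked
        rating: Rating     # the fact checker's verdict
        article_url: str   # link to the published fact-checking article and its sources

    @dataclass
    class Post:
        post_id: str
        text: str
        reach_multiplier: float = 1.0          # 1.0 means normal distribution
        warning: Optional[FactCheck] = None    # attached fact-check label, if any

    def apply_fact_check(post: Post, check: FactCheck) -> Post:
        """Label a post instead of deleting it: attach the warning with a link to the
        fact check, and reduce its reach if the claim was rated false or fabricated."""
        post.warning = check
        if check.rating in {Rating.FALSE, Rating.PARTLY_FALSE, Rating.FABRICATED}:
            post.reach_multiplier = 0.2        # illustrative down-ranking factor only
        return post

    # Usage: the post stays online; readers see the label and can follow the link.
    post = Post("p1", "Photo claims to show event X")
    check = FactCheck("Photo claims to show event X", Rating.FALSE,
                      "https://correctiv.org/faktencheck/...")
    labeled = apply_fact_check(post, check)
    print(labeled.warning.rating.value, labeled.reach_multiplier)   # false 0.2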
The last thing I would like to point out is what we think is important to fight disinformation online. We are facing many, many challenges, which, as already mentioned, include capacity: we are only eight journalists, and we are only covering Germany, but there is a huge amount of disinformation in our country now, and we have to be fast but still correct, which is very difficult. We also face legal threats from people who don't like our fact checks; in our case, not from the government, but from the spreaders of disinformation, who are sometimes actually very big companies behind some media outlets. They, of course, don't like what we do.
We think it's very important to have education about the dangers of disinformation for citizens, which is basically media literacy; it is education in schools, basically.
You also need knowledge about journalism and trust in journalism, which means journalists have to comply with standards. They have to do their job correctly. And we need more transparency from the platforms, which means Facebook, Google, Twitter, TikTok. We need more insight, more scientific research, into how things like filter bubbles and echo chambers work and how the algorithms work. Actually, nobody knows how the algorithm of Facebook works, but we know that it favors disinformation. Facebook is trying to do something about it, but is still not saying how the algorithm works. So we need transparency.
That was everything from me, in short. I'm really looking forward to the discussion, and, yeah, back to you.
>> MAN HEI SIU: Thank you, Ms. Echtermann. It's nice to know how an independent fact checking network works and especially how your organization works during the fact checking process. So I guess we'll discuss it more during the discussion.
So moving on, we have Mr. Jens Kaessner. So Mr. Jens Kaessner, the floor is now yours.
>> JENS KAESSNER: Thank you. Now, before social media, we had the media that did the fact checking, but that's not the case anymore. So today we have social media where anybody can reach millions of people and we now absolutely need fact checking, even if some people might not want this, societies need this.
But fact checking comes after the content was sent. So if you like the fact checker is a bit like a goalkeeper in a soccer game. He can't win the game alone. He's the last man standing, and he needs 10 other players before him. And I would like to quickly speak about these ten players. The social media platforms maximize for engagement of the users and for time the users spend. So they are designed to keep as many users online for as long as possible.
How do they do that? With polarization, controversy, hate, partisanship, negativity, extremism, tribalism and, surprise, because facts that are not true are always a surprise.
So before the fact checking, we simply have too much content that evokes these feelings and that can destroy a country, a society. The platforms need to change this so that boring but useful content gets enough exposure, as much as inflammatory content; or, in a word, truth must spread as well as lies.
Now, the platforms know this, but they need a nudge to change. Because the economic incentive keeps them from solving this problem themselves. And in the end, they must amplify truth, as well as they do lies. They must adapt their algorithms for that. Because we have billions of messages every day, content created in the billions and only automated solutions can deal with billions of messages, and it's only when this has been done that the fact checker can do good work. We need the ten field players on the soccer field so that the goalkeeper, the fact checker, can really protect the goal.
So with that frame given, yes, we absolutely do need fact checking. It will have to be established worldwide, in all countries. It will have to be established on all platforms: video sharing platforms, messenger platforms, you name it. What is happening at the moment is by far not enough. And fact checking, in order to be successful, needs financing; the fact checkers need much more money than they currently get.
Now, things are starting to happen. Networks of fact checkers are starting to form. Ms. Echtermann mentioned the IFCN, and there's the European Digital Media Observatory, the EDMO that was established this June for Europe. We need these networks and multi‑stakeholder models can work. The multi‑stakeholder model is a good model for fact checking.
The only thing we need to be careful about is that as Mr. Mok already mentioned, some stakeholders actively support disinformation, and they want the lies to be successful. So what is important there is that the principles of science and of journalism are at the basis of the fact checking process. So you need to check what the sources say, go back to the source. You need to check content with reality.
So if you like, you need to call that ‑‑ you need to do research, but mind you, that's not research in the perverted way of people going, ah, I researched on YouTube. That's not what I'm talking about. I'm talking about real scientific research.
And last of all, we need to protect the fact checkers. They are doing an enormous service to society and they are under pressure and in danger and if we want our societies to succeed, we need the fact checkers to survive. Thank you.
>> MAN HEI SIU: Thank you, Mr. Kaessner, for your sharing. I'm sure that with your analogy of soccer players, many in the audience will have a better concept of how fact checking is supposed to work, especially since I know there are a number of teenagers among the participants with us today. Thank you very much for your sharing, and I guess we will move on to Ms. Carina Birarda.
>> CARINA BIRARDA: Yes, here. I can show my screen?
>> MAN HEI SIU: Yes, you can.
>> CARINA BIRARDA: Yes, thank you.
Wait me, please.
Oops.
And okay. I share my screen.
Here.
I have made a brief presentation with the concepts. It's okay?
You can see my screen?
>> MAN HEI SIU: Yes, clearly.
>> CARINA BIRARDA: Perfect.
Okay. Thank you very much to the organizers IGF and United Nations for this workshop.
Something about me: I'm Carina Birarda from Argentina. My current job is in cybersecurity.
To begin, I will be covering a few definitions of key concepts, fake news, misinformation, disinformation, and information disorder.
The term "fake news" is recent. There are many uses of the term "fake news," and even "fake media," to describe reporting with which the crowd does not agree. People began searching for the term extensively in the second half of 2016, but we will see the term is commonly wrongly applied.
For example, as another way of explaining the scale of information corruption. The term has quickly become so problematic that I think we have to start avoiding using it in the wrong way.
So should we talk about false information or false content? Do you know the difference between misinformation and disinformation? Misinformation is information that is false, but the person who is disseminating it believes that it is true.
Disinformation is information that is false, again, but the person who is disseminating it knows that it is false. It is a deliberate and intentional lie.
Types of information disorders: much of the content called fake news combines two notions, misinformation and disinformation.
There is also a third category of false information, which they call malinformation: information that is based on reality but is used to inflict harm on a person, an organization, or a country. An example is revealing a person's sexual orientation without authorization. Such information is like true information, but it violates a person's privacy and goes against standards and ethics.
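As a compact way to see how these three categories relate, here is a small, purely illustrative sketch in Python. It is not part of Ms. Birarda's presentation, and the function and parameter names are assumptions made for the example.

    from enum import Enum

    class InfoDisorder(Enum):
        MISINFORMATION = "misinformation"   # false content, shared in good faith
        DISINFORMATION = "disinformation"   # false content, shared knowing it is false
        MALINFORMATION = "malinformation"   # genuine content, shared to inflict harm
        NONE = "no disorder"

    def classify(is_false: bool, intent_to_harm_or_deceive: bool) -> InfoDisorder:
        """Toy classifier following the definitions above: the truth value of the
        content and the intent of the person spreading it decide the category."""
        if is_false:
            return (InfoDisorder.DISINFORMATION if intent_to_harm_or_deceive
                    else InfoDisorder.MISINFORMATION)
        return (InfoDisorder.MALINFORMATION if intent_to_harm_or_deceive
                else InfoDisorder.NONE)

    # Usage: a sincerely believed rumor vs. a deliberate lie vs. a harmful leak.
    print(classify(is_false=True, intent_to_harm_or_deceive=False).value)   # misinformation
    print(classify(is_false=True, intent_to_harm_or_deceive=True).value)    # disinformation
    print(classify(is_false=False, intent_to_harm_or_deceive=True).value)   # malinformation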
Everybody knows false information has existed since humans began telling stories.
Fact checkers, traditional or analog: in the information process prior to the Internet, the information control process was carried out before the publication of content.
Since the arrival of the Internet, everyone can publish. Everyone can amplify. Anyone can edit. Editors are gone and there's no trashcan.
Internet age, new process: with disinformation and misinformation today, what is different? False information spreads more quickly, at the speed of a click. It is difficult to control the speed at which the news spreads: intentional fake information based on opinion, intentional disinformation, a lack of fact checking, and AI-generated news site rankings.
Okay. What is true? What is false?
There is unintentionally misleading content spreading that affects citizens' perception of reality and affects their trust, their sense of their environment, mutual consent, and participation.
In the legal context, limitations are to be enforced by law. A possible balance between the two is met by promoting responsibility, by providing better information to end users, and by empowering users.
So, empowering users: there are many materials published on the Internet to empower users. I would like to briefly comment on one.
The method I wanted to share with you is called AMOR; in English, love. This was created by a friend and colleague from Argentina, Maximiliano Macedo.
A is for author: is the author's name listed, or is it missing?
M is for message: is the message referenced, and what are its objectives?
O: are they only showing one side of the argument?
R is for report and accountability: is it real?
Isn't that an easy word to remember? We need simple methodologies that can be related to everyday words like love, amor, to validate the information around us. Love is a way of approaching a person from a comfortable perspective, and that is the idea behind these words and concepts.
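To make the checklist concrete, here is a minimal, purely illustrative sketch of how the four AMOR questions could be recorded in code. The wording of the questions follows the transcript above, and the field names are hypothetical, not part of the published method.

    from dataclasses import dataclass

    @dataclass
    class AmorCheck:
        """One question per letter of AMOR, answered about a piece of content."""
        author_listed: bool        # A: is the author's name listed (not missing)?
        message_referenced: bool   # M: does the message cite references or sources?
        one_sided: bool            # O: is only one side of the argument shown?
        report_is_real: bool       # R: has the report been verified as real?

        def looks_trustworthy(self) -> bool:
            # Illustrative rule of thumb: every check should pass before sharing.
            return (self.author_listed and self.message_referenced
                    and not self.one_sided and self.report_is_real)

    # Usage: a post with no author, no sources, and one-sided framing fails the checklist.
    post_check = AmorCheck(author_listed=False, message_referenced=False,
                           one_sided=True, report_is_real=False)
    print(post_check.looks_trustworthy())   # False: verify before sharing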
That is all; here is my contact information, and thank you very much for your attention. Please feel free to contact me if you need to. On the last page, there is information on services and useful sites from our point of view.
>> MAN HEI SIU: Thank you so much, Ms. Birarda. Thank you for giving a clear explanation of misinformation, and I'm sure everyone now knows a new Spanish word for love, amor. It's a new way for them to remember fact checking.
>> CARINA BIRARDA: You're welcome.
>> MAN HEI SIU: And last but not least, we have Mr. Obed Sindy. The stage is yours.
>> OBED SINDY: Thank you. Thank you. Can I share my presentation?
>> MAN HEI SIU: You can share your screen, yes.
>> OBED SINDY: Sorry, I will not the ‑‑ sorry, it's early now.
I will share my screen.
Okay.
Let me start without my screen; I will try. So, I'm Obed Sindy, President of the Internet Society Haiti Chapter. I'm also a board member of LACRALO, the Latin American and Caribbean At-Large organization, which is part of ICANN; we defend Internet end users. And I'm cofounder of a grassroots organization, where we promote the engagement of young people in the Internet Governance space at the grassroots level. I have also been the coordinator of the Haiti IGF since 2018, and at the Haiti IGF in 2019, we discussed how to combat fake news in Haiti. We had awesome international and national panelists, and we came to the conclusion that journalists must defend the integrity of information, and that we need responsible media that handle the treatment of information responsibly. We also concluded that the responsible behavior of end users is key, and that education in this area is also key to helping combat fake news.
So information should be treated as if it came from a stranger, and it should be supported by sources.
So in Haiti, we have two initiatives related to fact checking: one from the Haitian bloggers network, and T-Check Haiti, an initiative of the Haitian Association of Online Media.
So let me say something about the first project. The first project is Toma. Toma is a project about the verification of information published on the Haitian web sphere. Users submit items for analysis, and after a few seconds, Toma suggests what to take from them and whether the results point to fake news or to confirmed information. The user can also share the results proposed by Toma via email with friends, or follow the information once it is confirmed.
It is able to learn quickly and improve automatically, so it's a good project. The Haitian bloggers network needs more people on this project to support it and to see how it can continue from a medium-term perspective. It will be very, very interesting.
And the other project is T-Check. Its aim is to analyze the objectivity of the information circulated on the web and in the traditional media, in order to verify whether the facts are being exploited or presented in a way that serves partisan interests or stirs up conflicts. So that is the second project. These two projects try to find fake news in Haiti, but, as I said before, education is the key. They are not yet well known in Haiti, and they need more funding to continue this work in Haiti to fight fake news.
So we know that in Haiti there is a sphere where we have traditional media, new media, and sometimes partisan media tied to a politician or to someone in power. So we need independent fact checking. We need more funds to support new journalism and independent journalists to move forward in a good way, so that we will have a new generation of Internet users who want to receive reliable information.
So I cannot share my screen from my phone now, but if there are questions from the audience, I will participate in that interactive way. Thank you. Thank you so much.
>> MAN HEI SIU: Thank you, Mr. Obed, and thank you so much for sharing your projects. If possible, since you couldn't share your PowerPoint, you can send it to us and we can share it with the other panelists and the attendees here by posting it on our proposal website, so it would not be a waste of effort.
So thank you all for your informative sharing on disinformation, misinformation, and fact checking. I will start off with Ms. Echtermann and Mr. Obed. You have talked about raising awareness with regard to misinformation, disinformation, and fact checking. So in your opinion, which stakeholder should take up the most responsibility for achieving this? And could you propose any method or framework to educate users?
>> ALICE ECHTERMANN: I can start. This is a very difficult question, because I think there's not one person or organization who can actually do this task alone. The civil society in every country has to work together on this. I can only talk about Germany.
In Germany, I think our teachers and schools, and basically, beyond that, education policymakers have, in the past one or two years, begun to understand how important it is to educate young people about false information on social media platforms. So we receive more and more requests from schools who would like us to do workshops for their children.
The problem is that we as fact checkers don't have the capacity to go to all schools in Germany, so we usually have to say no to these requests, which is very sad. In Germany, we are lacking experts on this topic, so people turn to journalists for something that is really the role of the state and the educational system.
In Germany, every child has to go to school. There are certain policies; all schools teach basically according to the same teaching rules, I don't know how to explain it in English, it's like an educational plan for every age. And they just have to include media literacy in this educational plan.
And they have to have experts as teachers employed in schools, not external fact checkers or journalists coming to a school to give a workshop, maybe one workshop in one school in all of Germany in one year. That is not enough.
So we should teach teachers how to pass the knowledge about fighting disinformation on to their students.
>> MAN HEI SIU: Thank you, Ms. Echtermann. Ms. Birarda or Mr. Obed, do you have anything to add?
>> CARINA BIRARDA: From my point of view, I believe it is necessary for society to be trained in a systematic way, with methodology and resources, so that everyone can carry out their own personal, individual fact checking.
Checking the sources, using the information from other fact checkers, maintaining critical thinking, media education for these efforts, user empowerment, and multi-stakeholder cooperation appear to be the best way to improve the quality of information that is spread online.
>> MAN HEI SIU: Thank you. And Mr. Obed, do you have anything to add on to that?
>> OBED SINDY: So I think that all the stakeholders must be counted in. For me, civil society has a key role to play in this, to push for more fact checking, because the private sector has its business interests; they have under their control all the new media or the traditional media. So we have to see how we can have all the stakeholders on board: how to have NGOs fund fact checkers, and how to involve the technical community and the academic community. And the most important thing for me is the users. The user must understand that we live in a new world transformed by digital technology. We live in a digital society with information flows. So the user must be trained. Thank you.
>> MAN HEI SIU: Thank you, Mr. Obed, and thank you to the other two panelists for your answers. Ms. Echtermann has mentioned the lack of funding for these organizations, so to Mr. Charles Mok and Mr. Jens Kaessner: can you propose any way in which governments can help in this situation?
>> CHARLES MOK: Well, it depends on which government. I think if it's the German government, maybe they can give these fact checkers some resources, and that would be a nice thing. That would be the last thing I want to see from the Hong Kong or Chinese government; they are part of the problem. They are the kind of government that may actually end up, as I said, legislating to label news that criticizes the government as misinformation.
So I wouldn't necessarily look to the government for help there, unfortunately. I would rather see efforts within civil society to try to educate and to try to work with some of these commercial companies, even like Facebook and so on, to try to rectify the problems, rather than relying on the government.
I mean, I would be the last to trust the governments, unless it's a truly democratic government.
>> JENS KAESSNER: Thank you. The social media companies have become some of the world's biggest companies in the last few years, very quickly so, and, in fact, practically without paying taxes.
So I guess from the billions of dollars that they make every year, they should put some of that into financing fact checkers, external, independent fact checkers.
>> MAN HEI SIU: Okay. Thank you, Mr. Mok and Mr. Kaessner, for your answers. I think that's quite a feasible method. Hopefully it can be implemented in the future.
And also to Mr. Kaessner, you mentioned that a global fact checking network is necessary in order to combat the global spread of misinformation and disinformation. So in your opinion, how important is the international level for fact checking in comparison to the national level of fact checking? And do you think that international cooperation and fact checking is achievable in, let's say ten years or 20 years?
>> JENS KAESSNER: It is achievable, absolutely. And hopefully it will be there in ten years. International cooperation is mostly about learning from each other. But fact checking organizations need to be locally rooted. It's only Correctiv that can do the fact checking in Germany; a French or a Korean fact checking organization would have big trouble doing fact checks in Germany. So you need local fact checking, but with international cooperation, and I think that goal is attainable if the funding comes in. And to add, we have been seeing international cooperation in the last few years, as I said; Ms. Echtermann mentioned the organization in the US, and this is currently in the works.
Thank you.
>> MAN HEI SIU: Thank you, Mr. Kaessner. Does any other panelist have anything to add on to the question?
If no, then I will move on. Okay.
So for this question, I want to ask Ms. Birarda. You mentioned that you are working in the cybersecurity sector, and when people talk about mis- or disinformation and fact checking, most people will connect them to mainstream media, but, in fact, this can actually affect many other industries, including the cybersecurity sector, as far as I know. So with the relatively recent exponential increase of mis- and disinformation online, how has it affected cybersecurity from a global perspective, or from your local perspective in Argentina, and is there anything that the cybersecurity community has been doing to combat this?
>> CARINA BIRARDA: Okay, let me answer. Give me five seconds. In the cybersecurity community, we work together to be stronger. I agree with Alice when she talks about how we have to work together for quicker fact checking activity. In cybersecurity, we are connected all the time through different methods, different messages, and different platforms, sharing information related to vulnerabilities and things like that. If one person detects something in another part of the world, the rest of us can know about that information, and this is the ideal for us, I think.
Maybe in fact checking, these good practices could be used.
>> MAN HEI SIU: Thank you, Ms. Birarda.
Now I want to direct a question to Mr. Mok. You have mentioned that there are certain stakeholders that would like to, you know, promote more misinformation and disinformation. So in your opinion, how can stakeholders that support misinformation be included in this multi-stakeholder fact checking process without ruining or affecting the integrity of the whole process?
>> CHARLES MOK: Okay. If I get your question right, let's say there are organizations that might be creating or purveying, trying to spread this misinformation and they might be doing it for different purposes. It might be political. It might be monetary.
It might be trying to cheat people out of money, and so on. Or to spread hatred.
How do you engage them into the conversation, is that your question, right?
>> MAN HEI SIU: Yes. Yes.
>> CHARLES MOK: That is your question? I don't know. That would be difficult. A lot of times, these organizations or these people tend to want to hide their tracks. And the problem, I go back to the question of the purpose of spreading this misinformation: they spread the misinformation because they want to achieve a purpose. Now, can we engage them and try to say, you know, don't do this, and be a nice boy, or whatever?
That might be difficult. I haven't really thought of that aspect of it. I don't want to rule it out completely. But it's like all of these problems we're facing right now in the world. The world is indeed becoming very, very divisive. If there's an election, maybe it will end up in a 50/50 split, like what we saw in the US. And so it's difficult in that situation to tell people to sit down together and then compromise.
So I'm afraid it's a very good question, in fact. We need to think more about it. At least try to, as I said originally, instill or reinforce the importance of objectivity, so that people, users, or even people who are trying to become their own KOLs, their own key opinion leaders, are going to be able to at least respect objectivity, and also for users to do the same.
I think that kind of education, and that reviewing of values, is important. Now, who can do it? Can governments do it? Can educational institutions do it? Can we reemphasize this importance? Maybe, but there are a lot of authoritarian regimes in the world that will go in the opposite direction, and that's difficult for some of these countries.
So it might be different for different jurisdictions, but it's a very good question. I need to think more about it.
>> MAN HEI SIU: Thank you, Mr. Mok. I'm wondering if Mr. Kaessner or Ms. Echtermann have anything to add, from the government and civil society perspectives respectively.
>> ALICE ECHTERMANN: Would you repeat the question?
>> MAN HEI SIU: So Mr. Mok has just mentioned that there are currently stakeholders in the misinformation space that are trying to promote misinformation and disinformation. So is it possible for us to include such stakeholders in this fact checking ecosystem, or model, while at the same time not affecting the integrity of the entire process?
>> JENS KAESSNER: I can answer first maybe. In the US, one institution that spreads disinformation has been accepted as a fact checker, and that has proven pretty difficult. I think to speak generally, we need the principles of journalism and of science to be respected by all parties. So if somebody does not go back to the sources of a message, and somebody does not compare a claim to reality, if somebody does not base their actions on science, then that is not acceptable.
We also need processes to be open enough so that others can control them. But in general, there are very clever actors that profit from lying and from spreading hate and violence. So it will be difficult to incorporate them. I think the platforms should not accept any calls to violence at all, and that could help in trying to incorporate all entities into the process, but it will be difficult. Thank you.
>> ALICE ECHTERMANN: Yeah, I also think that it's not about bringing everyone into the fact checking process. When I imagine a government actor doing fact checking, that is just against everything I believe in, even in Germany, where we don't experience threats from the government against journalists that keep them from doing their job properly or restrict freedom of expression. The problem, if you want to have trust in fact checking, is that from our experience, many people who believe false information and fall for disinformation don't trust the state or anything that is attached to it.
And so in the end, it would be completely wrong to have fact checking done by these actors. Even as completely independent fact checkers, we experience people writing us very furious messages saying we are funded by the government, which is not true. They think that because we are not supporting their world view, we have to be part of the evil system. In the end, you have to do everything to convince these people that you are a reputable source and that journalists are reliable sources when they work according to certain principles.
Many things that people consider fake news are actually just bad journalism, and so in the end you have to educate people about what journalism is and how they can recognize good journalism and see what bad journalism is, because actually, in Germany, we have many partisan media outlets that are spreading false information, but in a way that you cannot just call a very shady thing. They have professional structures. They have their own editorial staff. They have professional websites. They look like normal media outlets. So you cannot recognize at first sight that this is false information. You have to educate people about journalism, and you have to have journalists who really comply with journalistic ethics. It's a problem if journalists speculate or jump to conclusions too early. Yeah, that's not part of the job.
>> MAN HEI SIU: Thank you, Ms. Echtermann and Mr. Kaessner. So I guess we actually have more questions for you all, but due to time constraints, we only have ten minutes left.
We have to move on to the Q&A session. There are quite a lot of questions from the audience, so I will start with a question addressed to Mr. Charles Mok first.
So this is from a student from Hong Kong: there may be warning signs that a post contains inauthentic information, but due to our own barriers, we may tend to accept information with stances similar to our presuppositions, and the process of fact checking may complicate those facts. That makes people less inclined to check the information, so what can we do to get people to have the information checked?
>> CHARLES MOK: This happens to us every day. We are Internet users, and we use these social media platforms. It's human nature, so unfortunately sometimes our fingers are faster than our brains. So I guess there's nothing more than reminding ourselves, as average Internet users, that when we share something, it is very likely something that we like to share with others, and more likely than not, something that we agree with, right?
So I guess we just have to remind ourselves that, hey, you know, if it's too good to be true, maybe it's not true. And sometimes we just have to remind ourselves not to think, if I don't share it right away, I won't get as many likes as I might be able to get. You know, try to make sure that we are responsible users. I think from the user point of view, especially for young people like yourselves, it is probably about reminding yourselves of that importance. And like I said again and again, it is about values: reminding ourselves that it's important to keep the value of objectivity, and, like Jens was saying, respecting science and facts. Keep reminding ourselves of that when we are tempted by all of these temptations around social media. That's the only thing I can think of. There's ample information on the Internet, if you really look for it, to do our own little fact checking before we share. Obviously, if the platforms are more helpful, that will be good.
>> MAN HEI SIU: Thank you, Mr. Mok. There's another question, addressed to Mr. Obed Sindy. You said to consume information on the Internet as if it came from a stranger on the street. Initiatives like fact checking robots can seem very foreign to people. How can we make people trust journalists and initiatives that they don't know, as opposed to blindly trusting people online? Mr. Obed.
>> OBED SINDY: Thank you. Thank you.
So we have to educate users. We have to educate users. The journalist's initial sources are very, very important. We have many journalists in Haiti, but we have to verify, we have to check, before trusting all the journalists on the Internet. We see that everyone is a content creator now, because we have many influencers who post every day, and we have to be careful about the information they influence us with. Social media is a public sphere, a web sphere, so we have to know people before we trust them and verify their authenticity, because we can have serious problems. For example, we recently had a hate crime in Haiti where someone attacked a girl, and this girl was assassinated that day. After the investigation, they found that this girl had been contacted on Facebook. So we must only trust people when we are sure that they do not work for a boss on the information side, for example, for the government, for the private sector, or for someone in politics who wants to win popularity or win elections. So we have to be more and more careful on social media, because it's public; everyone has a chance to be popular, like the media, like the new media. So we have to be careful. We have to check the sources and get to know the journalists before trusting them. This is my answer.
>> MAN HEI SIU: Thank you, Mr. Obed. We can only ask one more question from the audience. It's not addressed to anyone in particular: how do we define what counts as violent material or wrong information? It's always difficult to draw such a gray line between freedom of expression and stopping wrong and hateful information, because views change as society changes.
Are there any panelists who would like to answer this ‑‑ who would like to answer this question?
>> ALICE ECHTERMANN: Just a very short answer. We think there are certain things that can be answered by objective facts. As we heard earlier, there are questions about opinion, questions that cannot be fact checked, but in the end, if somebody quotes a statistic or says something that is actually scientifically quite well investigated, you can answer these questions. And in the end, that's what I mentioned in my presentation: you always have to make sure that, of course, there are not only two options, correct and false. There's also, for example, unproven. You have to make it transparent if something has no source.
>> JENS KAESSNER: Thank you. I will agree with that, because some of the bad actors actually claim that there's no truth or that you have to question even more or that you have to do your own research. And, in fact, that is very misleading.
What you need to do is also trust the sources of truth that you have found. That leads our whole society much farther than if you tried to research everything yourself. I will personally never find a corona vaccine. I will have to trust the scientists who are specialists in this, and that's the way forward. There is a truth.
>> MAN HEI SIU: Ms. Birarda, do you have anything to add?
>> CARINA BIRARDA: Yes, I would like to make a simple comparison. If you were on the street with a person you don't know, would you believe everything that person says? I would doubt it. I would doubt and verify the information. It seems to me that the same can be transferred to the Internet: it's like being on the street, among strangers.
>> MAN HEI SIU: Thank you. Our 90 minutes have come to an end. Thank you so much to the panelists for joining us today, as well as to the audience. In the chat, you can see a link that was sent out earlier to a feedback form; it would help us if you filled it out after attending this session. Thank you very much for joining, and hopefully we'll see you next year in Poland in 2021.
Thank you, guys. Bye.