The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> We all live in a digital world. We all need it to be open and safe. We all want to trust.
>> And to be trusted.
>> We all despise control.
>> And desire freedom.
>> We are all united.
>> PETER KIMPIAN: Good morning. Good afternoon. Good evening. I think we have to start. Welcome to the IGF Open Forum 57, co‑organized by the Council of Europe and UNESCO on the role of regulation in a post‑pandemic context. My name is Peter Kimpian, and I will be the Moderator for this Open Forum.
We have five excellent speakers today who will present their views on the issues at stake. And we will give now the floor to the audience and reply to questions you might put in the chat or ask for the floor during the Q&A.
But what is really at stake here? The COVID‑19 pandemic prompted many governments and other state actors to use new, emerging technologies to find solutions to curb the propagation of the virus. Several digital tools, as you all may know, have emerged as part of national efforts to limit infections, like mobile applications for diagnosis, et cetera. In light of recent developments, these digital tools have focused on two main areas: setting up systems to store and provide health‑related data on COVID‑19, and creating information systems for the organization and monitoring of vaccination campaigns.
But many questions around these new tools and systems have arisen, such as: would regulation be enough to prevent harm or unnecessary and disproportionate interference with the human rights of individuals while deploying those systems? Will new regulations that restrict or limit human rights in exchange for the greater efficiency of new processing technologies become the norm? Are the current enforcement and redress avenues up to these challenges, and do they ensure that human rights can be fully exercised and, where restricted, that the restriction complies with international standards? So, these are the questions we have asked ourselves in advance, and the panelists will give their views on them.
And I have the pleasure to give the floor first to Patrick Penninckx from the Council of Europe, who will give his position, his insight on these important questions. Patrick, you have the floor.
>> PATRICK PENNINCKX: Thank you so much, Peter. I hope all is fine. Good morning, or good late night for some participants around the globe, I would say. Thank you for inviting me to this panel, and let me just start by saying a few words. Before I start, Peter, I would like to tell the audience something as well, and it speaks very much to the question of regulation, not only in a post‑pandemic context but also more globally: the organization's choice in favour of regulation.
Because last week, just a week ago, the Intergovernmental Expert Committee on Artificial Intelligence adopted a document on possible elements of a legal framework for artificial intelligence, based on the Council of Europe standards on human rights, democracy, and the rule of law. You will immediately see that this already answers some of the questions that the audience may have.
We know that the pandemic has been the reason invoked by a number of countries to temporarily suspend some provisions of the European Convention on Human Rights. Hence, the Secretary‑General of the Council of Europe published several guidance notes, already in April 2020 and later as well, in relation to the global crisis caused by COVID‑19, in order to prevent extraordinary measures, often underpinned by digital technology, from being used for unlawful purposes or for the control of the population or of specific groups, such as journalists.
In addition to these guidance notes, the Secretary‑General also took individual actions in relation to certain Member States that were not in full compliance with the standards established by the European Court of Human Rights or by the over 200 conventions when putting in place and implementing extraordinary measures to curb the propagation of the virus.
The Committee of Ministers has emphasized several times the need to ensure that human rights and fundamental freedoms apply equally offline and online. Therefore, the principles and rules of the Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, more commonly known to all of you as Convention 108, should apply in an online environment and during crises as well.
We had learned from previous experiences that during times of crisis, governments sometimes tend to support some measures which are not necessarily in full compliance with their commitments at an international level.
This Convention, Convention 108, now stretches to three regions of the world. It already has 55 parties and more than 30 observers, and it remains the only legally binding multilateral instrument on the protection of privacy and personal data. Among its principles and provisions, necessity, proportionality, purpose limitation, an appropriate legal basis, et cetera, have been applicable since the '80s, and some of the new provisions found in its modernized version, such as increased transparency, meaningful accountability, a new generation of data protection rights, enhanced data security, and strong oversight authorities, are becoming increasingly important in the current pandemic context, as you can imagine.
It becomes even more relevant for COVID‑19‑related exception measures, as its Article 11 creates a unique democratic framework which makes all provisions and rights of the Convention fully compatible with other important interests, such as public safety and security, which are also key to safeguard public health.
Let me now say a few words on the Cybercrime Convention, which becomes fundamental in the fight against the dramatic rise in the number of cyberattacks and crimes committed in relation to the pandemic. The Budapest Convention on Cybercrime provides consistent definitions of the conduct to be criminalized, procedural powers for criminal justice authorities, and provisions to cooperate effectively at the international level, with a specific focus on the conditions and safeguards to be ensured when exercising procedural powers. The Budapest Convention, and namely its Second Additional Protocol, adopted recently, a month ago, represents a global attempt at securing effective cybercrime investigations in full respect of the human rights and fundamental freedoms of individuals, and that is a crucial issue.
In the majority of the Member States, but also beyond the borders of the Council of Europe, since we're talking to an international audience, law enforcement authorities are already applying both conventions, Convention 108 and the Budapest Convention, and are reporting better results in preventing, investigating, and prosecuting online criminal behavior, such as ransomware, fraud, denial‑of‑service attacks, grooming, phishing, ID theft, election interference, which we are seeing in over 70 countries in the world, and other forms of misuse.
Now, in order to assist the parties to Convention 108 in addressing privacy and data protection issues when setting up and implementing measures in the fight against the COVID‑19 pandemic, two joint declarations were issued by the Chair of the Committee of Convention 108 and the Data Protection Commissioner of the Council of Europe. These declarations recall that general principles and rules of data protection are fully compatible and convergent with other fundamental rights and relevant public interests, such as public health. They stress that it is essential to ensure that data protection frameworks protect individuals and that the necessary privacy and data protection safeguards are to be incorporated in extraordinary schemes that are conceived to protect public health.
A report that we published some time ago on digital solutions to fight COVID‑19 ‑‑ it was published in October 2020 ‑‑ deals with the issue of how personal data are processed in the 55 State Parties to Convention 108 in relation to the pandemic. The report highlights commendable practices followed by State Parties, such as the use of privacy impact assessments and privacy‑by‑design principles in the shaping and implementation of digital solutions to support public health measures, as well as less commendable ones that needed to be improved or stopped, for example, the mandatory use of contact‑tracing applications for the whole population, or measures taken in a state of emergency without any time limits.
The Committee of Convention 108 also issued a statement on COVID‑19 vaccination attestations and data protection in which it acknowledges the usefulness of means such as vaccination passes that are considered or already developed by some states, as well as the legitimacy of people's wish to gain back some of the freedoms that were restricted due to the COVID‑19 pandemic and the needs for the economy.
It is evident that health‑related data are sensitive data; I don't need to explain that much more. They require additional guarantees when processed, and no discrimination can be justified based on them. It is also suggested that alternatives to the use of such digital tools need to be made available to the population and that their use cannot be made mandatory. Finally, it should be underlined that when setting up databases for monitoring the organization of vaccination campaigns, strict respect for data protection principles and rules needs to be observed.
I will leave it at that, dear moderator, and I'm, of course, ready to answer further questions afterwards. Thank you.
>> PETER KIMPIAN: Thank you, Patrick. Thank you very much. Now I will invite you to go to another continent, namely Africa, and to listen to Sanusi Drammeh from the Gambia. Sanusi, how did the Gambian government deal with the crisis? What were the most important measures it took, and how did they impact human rights, namely the right to privacy and personal data protection? And which measures did you and your organization try to put in place to mitigate the risks for individuals? Thank you so much. The floor is yours.
>> SANUSI DRAMMEH: Thank you very much, Peter. Good morning to my colleagues and good morning to all the other participants in this very important meeting. First of all, when COVID‑19 struck in 2019, the Government of the Gambia was already in the process of adopting a Data Protection Policy. However, at the time, there was no data protection law in place; the policy was still being processed for adoption. The policy came about through the support of the Council of Europe and was in line with some of the provisions of Convention 108+. So, that was one area where the government saw the need for us to have a policy that would address this particular issue.
On the other hand, the Government of the Gambia resorted to using online platforms in order to mitigate the spread of COVID‑19 and to conduct meetings, especially between public officials. So, we had these conference platforms used for meetings, and we also used them for coordinating projects and for staging consultation and validation workshops. However, this led to various risks, because protective mechanisms were not properly put in place to mitigate the risks of the use of technology at that stage.
So, Peter, do you have another question or should I continue?
>> PETER KIMPIAN: Yes, please carry on. Hello. Can you hear me?
>> SANUSI DRAMMEH: Yes, I can hear you now.
>> PETER KIMPIAN: Okay. No, no, I suggest that you go forward, yes.
>> SANUSI DRAMMEH: Okay. (Inaudible) Going forward, the government has introduced what we call the COVID Testing Platform, whereby individuals who want to get tested have their information, personal information and sensitive information, uploaded onto those systems, and it is also used for travel. I think it is very common in many countries.
But this raises the issue of the exposure of sensitive data, because QR codes are being used in order for authorities to verify the information regarding a negative or even positive COVID test. The challenge with the use of these QR codes is that encryption was not part of the process. So, without encryption of the QR codes, any person who has access to a QR code can see personal data, for example, the date of birth of an individual, and even the test result itself. These are considered personal data, except for entities who are authorized to see them.
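To illustrate the point Sanusi is making, here is a minimal Python sketch, with entirely hypothetical field names and values, of why a QR code that merely encodes data without encrypting it offers no confidentiality: base64 is a reversible encoding, not encryption, so anyone who scans the code can recover every field.

```python
import base64
import json

# Hypothetical payload resembling what an unencrypted test-result QR code
# might carry: JSON, base64-encoded so it fits in a QR symbol.
payload = {"name": "Jane Doe", "dob": "1990-05-01", "result": "negative"}
qr_data = base64.b64encode(json.dumps(payload).encode()).decode()

# Base64 can be reversed by anyone: a scanner app recovers the plaintext
# without holding any key or authorization.
leaked = json.loads(base64.b64decode(qr_data))
print(leaked["dob"], leaked["result"])  # date of birth and test result exposed
```

Note that merely signing such a payload would only prove its authenticity; keeping the fields confidential from unauthorized scanners would additionally require encrypting them so that only authorized verifiers can read them.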
So, now, as a process, the Government of the Gambia wants to move forward and enact what we call the Gambia Data Protection and Privacy Law. This was a bill that was formulated through a project supported by the Council of Europe in the fourth quarter of 2020. As we speak right now, the bill has been submitted to the Ministry of Justice for review. It has been there for a while now, and we are expecting it to come out, hopefully, early next year.
But the issue of contention here is this: as a government, we have conducted a study within the region and have tried to incorporate a particular practice from countries where the data protection authority is merged with the Right to Information Authority. So, now, where is the balance? The Right to Information Authority is there to ensure that people have access to data and information as part of their fundamental human rights. However, how is this balanced against the protection of people's information and privacy? I know the Right to Information, or Access to Information, Act that we have in the Gambia has an exception for individual and personal data, so there could be instances of conflict if the same entity is used to give access to information to requesting individuals or entities while, at the same time, trying to protect access to sensitive data of individuals. So, as a result, I have engaged with the Council of Europe to see whether there could be support for a diagnosis, before the Government of the Gambia adopts the Ghanaian model, to determine whether our bill, the Data Protection and Privacy Bill, and the Access to Information law clash or conflict in ways that would require us to review everything. So, these are some of the issues that are at hand.
So, there are a lot of points that I wanted to make. Now, for example, take people who are conversant with data protection issues: if they know that their information is being collected by the authorities without any form of encryption, for example via QR codes, how do they maintain trust in government, that this information will not fall into the wrong hands? And of course, government needs to step up. We need to step up and address this matter with the Minister of Health.
And also, the other issue is, when it comes to passports and identification cards, the data comes from (?), which is also fundamental to human rights. As for cross‑border data flows, we do not have laws that regulate them as I speak. However, the law we expect to be enacted would take care of that. At this stage, we only have a policy that talks briefly about transborder data. And the data protection law that we do have is very minimal. It is not very comprehensive, and it is part of what we call the (inaudible) of 2009. That is law, but it focuses more on the data that is collected by telecommunication operators, not looking at certain sectors like the banking sector and other players within the data processing and controlling domain. So, these are some of the issues which we think we need to solve.
As I speak, there is no study, at least not in Africa, so to say; individual African countries might have a study regarding the increased uptake of these technologies triggered by COVID, but it is apparent that in the Gambia, as in many African countries, that has been the case. Thank you very much.
>> PETER KIMPIAN: Thank you, Sanusi. Thank you. These are very important issues. I invite you to jump to yet another continent, namely, to the U.S., and I invite the bravest of our speakers, Hannah Mayer, to take the floor. Hannah, is it 3:30 at your place?
>> HANNAH MAYER: It is.
>> PETER KIMPIAN: I believe it's early. It's commendable that you joined the panel. Thank you very much for that. We are reading great news from the U.S. in terms of the vaccination of the population, but what about civil liberties, human rights, privacy, data protection? What have you come across during the last two years, I could say? Thank you, Hannah.
>> HANNAH MAYER: Thank you, Peter. And it's a pleasure to join everyone. Good morning, good afternoon, and good evening. My name is Hannah Mayer, and I'm an Attorney Advisor at the U.S. Department of Justice in the Office of Privacy and Civil Liberties. My office is responsible for protecting the privacy and civil liberties of individuals through review, oversight, and coordination of the Department's privacy program, and for advising department leadership on the privacy implications of legislative and regulatory proposals.
At my office, I also support the duties and responsibilities of the department's Chief Privacy and Civil Liberties Officer, who advises the Attorney General, as well as provide legal advice and guidance to the department and its components on privacy, legal and policy matters. So, I will primarily be speaking to what the Department of Justice has been doing at the federal level within the United States.
And just for some context‑setting, there are over 42 components at the Department of Justice, including the Federal Bureau of Investigation, the Drug Enforcement Administration, as well as our litigating components, such as the Criminal and Civil Divisions. We have over 165,000 employees, as well as contractors, task officers, and visitors. During the pandemic, DOJ has really taken an advisory role, given the stature of the Department of Justice in providing legal guidance and advice to the Executive Branch. And so, our office was working with the department and taking a lead in advising on navigating the pandemic, to approach these issues and to ensure that we strike the correct balance between health and safety and the protection of personal information.
So, in the United States, we have a different type of legal framework. So, we have a sectoral approach to federal privacy. We also have privacy enshrined in our constitution that is further opined on by our Courts, throughout the federal system as well as at the state level. So, what we really have today, what I want to focus on, is looking at the framework we have on the books and then how we have implemented that at the Department of Justice during COVID‑19.
So, really just starting with a Supreme Court case that is still good precedent, Jacobson v. Massachusetts. It's an interesting case in the United States, dealing with a group of anti‑vaccinators led by a man with the last name of Jacobson. He refused to get a vaccine that was required by a Massachusetts statute, and as a result, he was charged. There was a criminal trial, and he was found guilty. The case was appealed from the trial court and ultimately ended up in our Supreme Court.
The Supreme Court ‑‑ this case was decided in 1905, so over 116 years ago. However, it's still good precedent and really is still the law of the land, so to speak, when the Federal Government is approaching vaccines. Ultimately, the Court found that the statute in question related to health law and was enacted and exercised in a reasonable and proper manner of the police power of the state. In his opinion, Justice Harlan noted that under certain circumstances, the police power of a state might be exerted in so arbitrary and oppressive a manner as to justify the courts in intervening to prevent wrong and oppression; however, this wasn't one of those cases. So, again, we get to the issue of what is the appropriate amount of police power to protect health and safety, balanced against individuals' rights to make their own decisions and to control their information.
Ultimately, in that case, the Supreme Court upheld the decision of the trial court, and Jacobson had to pay a fine of $5, which was what the statute required. So, that is the precedent we have, in addition to the sectoral laws we also have within the United States. There are many of them that deal with personal information, and they really depend on the context you're in.
So, looking at the institutional response, specifically at the department, as we went through the pandemic: pre‑vaccination, it was a matter of the general handling of personal information, such as medical information of employees and contractors. Most individuals were teleworking, so there really wasn't a need to create any type of system, and we weren't conducting contact tracing by QR codes. No systems were set up, because there really wasn't a need for them. So, there really wasn't any substantial conflict; it was generally just applying the structure we already had.
And then, following the Food and Drug Administration approval of vaccinations in the United States, the Biden Administration issued two executive orders, which in the United States do have the force of law, so, again, adding to our regulatory and legal framework. The first executive order directed collection of data about vaccination status for employees, as well as directed a task force on how to handle information on contractors for the United States Government. And then the second executive order required the vaccination of all federal employees with exceptions for medical and religious accommodations.
Now, in order to collect this type of information, there was a requirement and a need to build out an information system into which everyone could actually input this type of information, including their vaccination status and a photo of the vaccine card, as well as any requests for medical or religious accommodations. And so we get to the question of how you build out such a system in a way that balances both the individual's rights related to their information and the associated privacy interests, and the health and safety of individuals and the general public.
So, for the DOJ and for the federal government in general, we already have privacy laws in place that apply to the collection of personal information about individuals, so that framework was already there. Under the Privacy Act of 1974, there are specific legal obligations about how to protect information, as well as the notices that we are obligated to provide to the public about the type of information we're collecting. The DOJ already had a System of Records Notice in place for public health emergencies, so, from a compliance perspective, the notice was already provided.
Also, when looking at this system, we had to conduct a Privacy Impact Assessment. This is an already established legal requirement in the United States for the collection of personal information by information technologies. So, our office worked with the department to go through and analyze what type of information was being collected, how it was being protected, what the risks to the information were, and how we were mitigating them as a department. There was also the need to create forms, which likewise have legal requirements from a privacy perspective, especially for requesting medical and religious accommodations under the First Amendment's guarantee of freedom of religion in the United States. Special statutory authority is required to collect religious information without consent, so we also analyzed what the implications were and how best to protect that information and ensure that we were in line with the law.
So, in addition to our compliance obligations that we were working with the department in order to stand up this system, we really were building the system with Privacy by Design principles in mind. So, we were at the table with the developers looking at the type of system and portal that we could develop at the department to ensure that we were building trust, ensuring that individuals had access, could input their information in an accessible way, but then also only collecting information that we needed, not collecting more than was authorized, and really applying the security controls, as well as the privacy controls from the ground up. In addition, also considering the type of information that we were collecting, for example, the religious and medical accommodations, and working towards building a system that really was baking in privacy from the beginning. And ultimately, we found that this was a very successful process.
And at the department, we have over 99% of the department attested to the requirements of the executive orders, which implies just by numbers that, you know, there was trust built into that system, individuals were providing their information, and ultimately, we have been able to use that information to show that over 90% of full‑time employees have been vaccinated and also break down the further percentages of the accommodations, just so that we can meet our regulatory obligations, but then also, you know, ensure that we are building systems that promote trust and that also are protected from a security and privacy perspective as well.
So, I will end there. Thank you, Peter.
>> PETER KIMPIAN: Thank you so much. That's very thorough and complex, I must say, but very interesting. Thank you, Hannah. And as my little daughter says, once we start jumping, let's never stop. So, let's jump to another continent, Soenke. You are based in New Delhi, but you also work a lot with UN agencies as a consultant. If I understand correctly, you work at the intersection of open data, the protection of individuals, and how to best use data. What have been your experiences? What have you seen during the pandemic? How can governments or international organizations use the research that you've been carrying out? Thank you, Soenke. It is your turn.
>> SOENKE ZIESCHE: Thank you very much for the introduction and the invitation, Peter. Good morning, good afternoon, and good evening to all the participants. Yes, it's correct, I'm a consultant for UNESCO, and recently I have drafted guidelines for the Member States on open data, and in particular, on open data for AI systems. Open data are data that can be freely used, modified, and shared by anyone for any purpose, including by AI systems. In other words, the data should be findable, accessible, interoperable, and reusable. These are the so‑called FAIR principles. And this is motivated by the fact that open data are considered a prerequisite for informed plans, decisions, and interventions, especially related to the SDGs, but now, most recently, also related to COVID‑19. So, I thought it would be a good idea to make a case study of the extent to which open data were used during the pandemic, and I'm talking about the outcome of this case study now.
To tackle the COVID‑19 pandemic from the onset, timely, relevant, and quality data were essential, but it also became clear that, in addition to these features, openness of data was critical. And indeed, a large amount of open data related to COVID‑19 was shared and had significant impact. And then again, a distinction has to be made as to whether the data are harnessed by humans or used to fuel AI systems. Just a few examples. Already in January 2020, scientists had issued a letter calling for the open sharing of COVID‑19 data. Then, in March 2020, the UNESCO Director‑General advocated for scientific cooperation and the integration of open science into research programs. And in the following month, April 2020, the Open COVID Pledge was launched, calling on organizations around the world to make their patents and copyrights freely available, and this received many signatories, including major corporations. So the private sector was involved as well.
And indeed, in the months that followed, now almost two years, open data as well as AI systems were used successfully for diagnosis, containment, and monitoring, for rapid vaccine development and treatments, as well as for forecasting during the pandemic. Just one example out of many: a particularly successful data‑driven AI model has been developed for the city of Valencia in Spain by a team of researchers. This model predicts COVID‑19 infection rates, prescribes non‑pharmaceutical intervention plans, and contributes to more evidence‑based policy‑making. But this is just one example; there have been many around the globe where open data were applied successfully during the pandemic.
But as you can imagine, despite the large number of open data initiatives related to COVID‑19, and their timeliness, challenges remained. There were overall significant data gaps, especially in developing countries and among at‑risk populations. There were also issues with data of poor quality and lacking disaggregation, especially disaggregation by sex, which it would be very desirable to fix. Among the reasons for these challenges are funding gaps, especially in low‑ and middle‑income countries, which were facing significant financial challenges in producing COVID‑19 data and statistics. And overall, the data used during the pandemic often did not fulfill the FAIR principles I introduced before; they weren't entirely findable, accessible, interoperable, and reusable.
And another challenge, and this is also very much related to this panel, is keeping the balance between open COVID‑19 data and the right to privacy; for instance, there have been instances of leaked data of COVID‑19 patients, which was an issue. Another issue, which arose especially during this pandemic but as a phenomenon existed before, is that the pandemic has been accompanied by disinformation and misinformation; UNESCO has called this the "disinfodemic", and this disinformation is, unfortunately, ongoing.
So, overall, the COVID‑19 pandemic has brought the world together to address this global challenge. Significant (?) were made owing to timely open data exchange and collaboration, in forecasting and diagnostic tools, as well as in vaccines provided in record time. However, it also has to be noted that most of these open data initiatives were ad hoc and not very well coordinated, because the world was obviously not well prepared for the pandemic. So, as a lesson learned, regulatory frameworks and data governance models should be developed, supported by sufficient infrastructure, human resources, funding, and institutional capabilities, to address the challenges related to open data and to be better prepared for similar situations in the future.
And again, related to AI, the relationship between open data and AI needs to be further specified, including what features are required so that data are truly AI‑ready.
Very briefly, I will now also talk in general, not related to the pandemic, about issues of open data. So, what are the pros and cons? There are quite a few pros related to open data. It is argued in favor of open data that there cannot actually be a copyright on factual data anyway, and that everyone should have the right to access data. It is considered a feature of democracy that the activities of governments are transparent, through full, open data, and there were many success stories before the pandemic as well. And in addition to accessing the data, the opportunity to reuse, rearrange, and combine them, and potentially gain new scientific insights from them, enables citizens' engagement, which is important, and also creates new innovative services and products; so open data can lead to social and commercial value.
And, obviously, there are opponents of open data, or there are arguments against it. One point is, again, that the data may violate the privacy of concerned individuals, whose right it is to control what is actively or passively collected about them and what is disclosed about them. Also, in many cases, the collection, cleaning, and dissemination of data are both labor- and cost‑intensive, and this may deserve financial compensation; but this compensation would be forgone if data were openly shared after such work and cost had been invested in them.
Another point is that data can be misused with malicious intentions, and here again AI is one concern: AI systems may misuse data, because AI systems are (?) use, and this makes open data a potential hazard.
So, in summary, the protection of individuals' personal data, especially at times of crisis, is a significant concern. Many valuable government data are about citizens, so the balance between releasing these data and protecting the privacy of citizens is an ongoing debate, especially in relation to AI, because if AI systems are involved, there are further challenges, such as bias and discrimination, but also potential malicious acts. So, in a nutshell, there have been quite a few benefits from opening up data, and I talked about the success stories, but we must keep in mind the challenges, especially those related to privacy. Thank you so much. Over to you, Peter.
>> PETER KIMPIAN: Thank you. Thank you, Soenke. That is very interesting, indeed. Turning to Rachel. Hello, Rachel. I'm very happy that you are here with us. You are representing UNESCO. And you told me that UNESCO put out a few propositions, including guidelines for judicial actors on how freedom of expression has to be respected, and that it also promoted a number of safeguards in relation to surveillance measures. Could you give us a more in‑depth view and explanation of those measures? Thank you, Rachel.
>> RACHEL POLLACK: Yeah. Thank you so much, Peter. So, my name is Rachel Pollack. I work in the Section for Freedom of Expression and Safety of Journalists at UNESCO. We have a mandate to promote freedom of expression and its corollaries of access to information and press freedom. So, I'll be speaking a little bit about the impact of COVID‑19 on these issues. We've heard a lot about privacy today, but there have also been major impacts on freedom of expression. I'd like to also thank Peter for inviting us to co‑organize this Open Forum. We've been strong partners for many years and are really happy to join in this effort.
I think when we proposed this session, it was a little bit too optimistic, because it was on regulation in the post‑COVID world. And I think, unfortunately, we see that we're not yet there, and I hope that next year we will be able to meet in person again, and it's really nice to see a lot of familiar faces in the Zoom chat.
So, just to give a little bit of context on the impact of COVID on freedom of expression. Many governments declared states of emergency, particularly last year, and some of these were extended for quite long periods. That had impacts on fundamental rights, including freedom of expression. To give some numbers: according to the International Press Institute, between February 2020 and May 2021, journalists covering COVID‑19 across the world were affected by more than 100 restrictions on access to information, 215 arrests or charges, 95 cases of censorship, and 238 verbal or physical attacks. The restrictions on freedom of movement also hindered the work of journalists. The former UN Special Rapporteur, David Kaye, issued a report in April 2020 in which he noted that while temporary constraints on freedom of movement are essential to beating COVID‑19, they must never be used as a pretext for cracking down on journalists' ability to do their work.
We also saw during this period an increase in laws meant to fight disinformation or so‑called fake news. As our last speaker mentioned, there was this wave of disinformation and misinformation that UNESCO has called the disinfodemic, echoing "infodemic" and "pandemic." But in response to it, we saw regulatory measures that were overly broad and did not have the necessary limits for restrictions on freedom of expression, and we observed that in the last few years at least 57 laws have been adopted across 44 countries that risk infringing freedom of expression online. This data will be included in a forthcoming publication we have on World Trends in Freedom of Expression and Media Development.
So, in this context, and in response to the growing legal challenges, UNESCO has issued a number of guidelines and resources. These include guidelines for judges and other judicial actors at the national and regional levels. There is a set of guidelines specifically within the context of emergencies and COVID‑19 that was developed together with the University of Oxford, and a series of massive open online courses. We also worked with Mila, the Quebec Institute for Artificial Intelligence, to develop and deploy an open-source, peer‑to‑peer approach to contact tracing apps, which we've heard about this morning.
We have also issued some publications and policy briefs. We had one early in the pandemic called "Journalism, Press Freedom, and COVID‑19" that outlines some of the challenges to media freedom and journalism. And we also put out a brief on the Right to Information in Times of Crisis, which has a number of recommendations, including that states have a positive obligation to proactively disclose emergency‑related health, budgetary, policy‑making, and procurement information.
We also saw, and this brief discusses, that there are some logistical barriers in processing requests for information, but states do have a duty to provide this information in a timely manner.
Again, there was a lot of discussion today about contact tracing, but I think we also have to keep in mind that the measures instituted during COVID are not unique to the public health context. In fact, all of the devices that we use, especially those of the social media platforms and others, also engage in this kind of surveillance, what has been dubbed "surveillance capitalism," for example, and that has a major impact, of course, on people's right to privacy, but also on freedom of expression.
The issue of surveillance is something that is increasingly on UNESCO's radar. It will be the topic of World Press Freedom Day 2022, which will take place in Uruguay under the theme "Journalism Under Surveillance." And just yesterday we released a call for proposals to take part in that conference, which I encourage you to take a look at.
And I'd like to end by mentioning that, for all of these measures, for protecting freedom of expression and having better access to information in the context of COVID and more generally, UNESCO advocates strongly for greater transparency related to digital technologies, and especially to the operations of the big Internet companies.
We put out a brief last spring called "Letting the Sunshine In: Transparency and Accountability in the Digital Age," which includes 26 high‑level principles on transparency, including several related to data protection and privacy. So, I think we have to be very vigilant about that and the practices that have been put in place during COVID and afterwards, but again, thinking about that in a larger context. Thank you very much.
>> PETER KIMPIAN: Thank you, Rachel. That's perfect. And thank you very much. Privacy and freedom of expression are sister and brother, right? One cannot exist without the other. So that is very pertinent as well. Thank you very much. We have seven minutes left. I haven't seen any questions in the chat, but if you have any, this is really the time.
But while waiting for the audience, I will turn to the panelists with one quick question each, requesting you to kindly answer in one minute. So, I'm turning to Sanusi first, because you mentioned the role that data protection law could play and how strategic it is in your country. I would like to ask you the following: during the pandemic, personal data have been exposed in many countries. How do you think an enforceable legal framework, including a comprehensive data protection law, could have prevented data breaches or protected individuals? One minute, Sanusi, if you don't mind.
>> SANUSI DRAMMEH: Yes, that is a very important question. First, it would create accountability, because a data protection law or bill has provisions for accountability, to hold data processors and controllers to account and ensure that they put certain safeguards in place. So, that would really help a lot. And also, Human Rights by Design: the applications, information systems, tools, or mechanisms that data processors and operators use have to meet certain requirements. So, the law would cater for that, and regulatory enforcement would ensure compliance.
>> PETER KIMPIAN: Thank you. Thank you very much. 45 seconds. Very good. Thank you so much. I'm turning back to Rachel. Rachel, we talked about how this is the first time in the era of digitalization that we have experienced a pandemic of such dimensions. What do you think have been the main mistakes made by governments when taking temporary regulatory measures, or have there been any? What could have been done differently to guarantee not only public health, but at the same time that only the minimum necessary interference with human rights occurs, and to guarantee the protection of individuals with respect to the right to privacy and freedom of expression? Rachel?
>> RACHEL POLLACK: Yeah, thank you, Peter. This is not an easy question or an easy three questions, I would say, to answer. I think, you know, there was a lot of discussion about how to develop technologies that were the least invasive possible from the perspective of privacy. So, for example, using Bluetooth and the application that UNESCO developed with Mila in Quebec I think is an example of that. So, there are ways to bake it in already at the design stage, Privacy by Design.
I think, you know, just to go back to my remarks earlier, I think mistakes could be seen as using the pandemic as an excuse to institute restrictions that were not necessary. And one could think very cynically that it was just opportunistic beyond what was really necessary, and I think we saw that especially in crackdowns on freedom of expression on journalists.
I think the issues related to access to information and what information commissions were able to provide, in many cases, they were simply overwhelmed. So, I don't think there was bad will then, but it shows that in the future, more capacity is needed. I think we have another speaker. I think that means it's time for someone else to come in on the question. So, we only have three minutes left. But anyway, very interesting questions and look forward to hearing what others have to say in the last two minutes. Thanks.
>> PETER KIMPIAN: Yes, thank you so much, Rachel. Turning to Soenke. Soenke, you spoke about the engagement with multinational private companies. Can you comment on concrete examples of whether any public or private organization has decided to apply open data, and what its effect on human rights has been, more concretely, and in one minute, if you don't mind? Thank you so much.
>> SOENKE ZIESCHE: Thank you so much. And yes, there were quite a few examples. But then again, many of the relevant data held by corporations were, in the case of COVID‑19, not related to human rights, because the corporations also had data about the development of pharmaceuticals, medications, and vaccines. It was, of course, very desirable for these data to be shared, and there were cases when that happened. And this is not necessarily related to human rights.
When it comes to human rights, of course, there are related data wherever the corporations did tests with humans; when they tested the vaccines with humans, the balance was more relevant. But I think the data I talked about first were more critical in the beginning, and it was desirable that they were shared. There were a few instances of that, and in those cases I think there were no human rights issues, so the sharing was appreciated, and there were examples. Thank you.
>> PETER KIMPIAN: Thank you so much. Hannah, you mentioned in your speech that regulating COVID and privacy‑related measures, and also, more broadly, civil liberties and human rights, must rely on trust. How do you think authorities can build trust among citizens? How can citizens be confident that their personal data will be protected after all? Thank you.
>> HANNAH MAYER: I think this is the underlying issue of privacy always, not just in the pandemic context: really ensuring that governments are doing what they say they're doing, and also understanding how systems are operating. As a privacy professional, as a privacy attorney, that means understanding technology, understanding data, and understanding how to balance your mission with protecting individuals' interests in that information. So, it's not just minimizing collection, or limiting collection to the purposes for which you need the information, but also, on the security side, protecting it. I do think the relationship between privacy and security is the crux of where trust lies, and data breaches are where we see a lot of mistrust and distrust; that is usually where the public starts asking questions. So, I think working as a team to build privacy in with the technologists is absolutely key to continuing to build trust in systems, especially with sensitive information. Thank you.
>> PETER KIMPIAN: Thank you. Thank you so much. I really like that. Turning to Patrick. Patrick, as the last speaker: in your speech, you talked about the measures the Council of Europe put in place. But are people really conscious of what temporary measures mean for human rights, specifically for private life and data protection? What could be done by an organization such as the Council of Europe to raise consciousness about this subject and avoid disinformation of citizens? Patrick, in one minute, if you can. Thank you.
>> PATRICK PENNINCKX: Thank you, Peter. I think rather than trying to stumble over my words in those 30 seconds that I've got left, I think it's more important for me to thank all participants on behalf of the Council of Europe for having accepted to take part in this wonderful panel. Peter, Sanusi, Hannah, Soenke, Rachel, thank you for waking up in the middle of the night just to speak to us. My full admiration to you and my full admiration to the colleagues from all over the world for having participated in this panel.
And I think, Peter, I would need to send the question back to you and to all of the participants. Yes, in theory, we can do things, but there is also the question of how we as individuals react when, for example, we are invited to install an application. Will we do it or not? And I think it's also a question of what the utility is for ourselves. Sometimes we put our principles to the side, and even if we're reluctant to do so, we will still do it. So, thank you very much for this.
>> PETER KIMPIAN: Thank you. Thank you so much, Patrick. Thank you to all of you. I will abuse my role of moderator a little bit and reply to your question, Patrick, in 30 seconds. I think that to raise awareness, the IGF is a wonderful initiative, and I think we have contributed to awareness‑raising on the issues at stake. You have been wonderful, really good. Thank you very much for all your comments and input. There is a lot of information here, a lot of food for thought. Please carry on, share information, work together, build trust, and safeguard human rights, the rule of law, and democracy. Thank you so much and have a good day. Have a good sleep.