The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> We all live in a digital world. We all need it to be open and safe. We all want to trust.
>> And to be trusted.
>> We all despise control.
>> And desire freedom.
>> We are all united.
>> They say we despise control? What is the meaning of despising control?
>> LORI SCHULMAN: Thank you. Perhaps we can address that in the roundtable; I think that is an element of human rights. Right? That is our entry cue, of course. I want to say hello to everyone. My name is Lori Schulman, and I'm welcoming you to A Common Bill of Digital Human Rights and Responsibilities. Many of you know me as Senior Director of Internet Policy for the International Trademark Association. Today I'm acting in my personal capacity with regard to a subject I care deeply about. The thoughts expressed are my own in my role as moderator and do not represent my employer.
This session will explore the role and function of a digital Bill of Rights, guided by fundamental universal human values that transcend our differences. Common principles and values are typically expressed as charters or bills of rights, and provide the framework for emerging governance mechanisms.
At the current stage of Internet governance, this process is in its infancy; arguably, it hasn't started at all. The rules of engagement and the rights and responsibilities of our digital residency as digital citizens remain unclear.
This session describes the role and function of the digital Bill of Rights and looks at the role of digital rights in the past, the present, and toward the future. We will set the stage with foundational constructs presented by Klaus Stoll and Professor Sam Lanfranco. Then we'll move on to a roundtable of experts that includes Bruna Martins dos Santos, Brian Beckham, and Stacey King.
Finally, Sam Lanfranco will sum up with a vision of building a bill of digital rights and a vision of the path forward.
Participants will be invited to join a follow‑up event held as an open CircleID forum for questions, answers, and discussion around digital rights and strategies. I will drop a link into the chat that will lead you to the post‑session discussion, open to the public but off the IGF site.
We are looking forward to the post‑session exchange, and to the next 60 minutes. I will introduce our first two speakers. Klaus Stoll is the co‑founder and CEO of the Internet Integrity Taskforce, IITF. Klaus has over 30 years of practical experience in Internet governance and in implementing ICTs for development and capacity building globally. He is a regular organizer of and speaker at events; an adviser to private, governmental, and Civil Society Organisations; a lecturer; a blogger; and an author. Along with his IITF responsibilities, Klaus provides consultancy services in the areas of Internet policy and strategic partnerships.
Sam Lanfranco is Professor Emeritus and Senior Scholar at York University. Sam is a development economist working on issues of innovation and sustainability in the Internet ecosystem. He has worked on the advancement of science in Africa, has worked in India, and works on global and engaged digital citizenship and digital integrity within the Internet ecosystem. And I'm going to hand off to Klaus.
>> KLAUS STOLL: Thank you very much, Lori. Thank you very much to everyone online, and especially those in Katowice at the end of a very long IGF. I'll just switch over to screen share to move into the PowerPoint. And, Lori, could you let me know if the screen is visible?
>> LORI SCHULMAN: Yes. We can see it.
>> KLAUS STOLL: Okay. That is wonderful. Let's simply talk about the basics of the bill of digital rights and responsibilities. I think everybody's talking about bills of rights. But I think responsibilities come with rights.
And the problem is ‑‑ it's not a problem, it's a reality: our lives today are basically a duality between the real world and virtual cyberspace. And as such, we are not only citizens of our own countries; we have also become citizens of cyberspace.
That, of course, creates tensions. And with the increasing impact of digital technologies, we are also torn by our lack of control over them. The reason for this is basically that sovereignty ends at the border. All the political constructs, like governments, where you could say, okay, here are my country's statutes, this is the law of the land, and this is how it works, are no longer valid in cyberspace. And as a result, digital governance is dominated more by special interests than by the common good.
It's basically about who can exert the most power. We are basically moving back to pre‑democracy times.
That lack of effective governance, national and global, has a very profound effect on us, because we don't know, as persons, as people, how to exercise our rights and responsibilities in cyberspace.
So the missing link, as the whole experience of history has shown us, is basically a construct like a digital Bill of Rights. We cannot and should not immediately answer all the questions regarding governance in cyberspace, because that is subject to a lot of dialogue, a lot of exchange, things we have to discuss that no single group or organisation can decide on. But we can say with certainty that we need a Bill of Rights, and that it has to protect us against the governmental and private sector (?) we already experience in cyberspace today.
So the role and fundamental function of a Bill of Rights is to remind those who are empowered to pay attention to the fundamental rights of those who are subject to their powers.
So the key question now is: what core values should inform that digital Bill of Rights? It's nice to say, okay, we need a digital Bill of Rights, but what are the values? Gratefully, after the two great World Wars we had the effort to create the Universal Declaration of Human Rights. And this is a document which basically expresses the fundamental values that are common to us all.
And it certainly creates trust, because where human rights are respected, trust exists. It basically provides a constitution for our global living together. And it is also a very important document on the economic side, because without fundamental human rights there is no economic development and sustainability.
So, for example, in the context of the UN, the Sustainable Development Goals and the UDHR are so closely connected that we really have to see them as two sides of the same coin.
So what happens when we extend and apply our fundamental human rights into cyberspace? What do we actually get right? We can restore trust. Trust cannot be created through engineering. Engineering can give us wonderful electronic tools, but it cannot create trust itself. So when people say, okay, let's engineer a system like blockchain, which is basically absolutely safe and foolproof, that is wonderful. That's important. But that doesn't generate trust. The bill will shift the viewpoint of the whole discussion, and of the current Internet discussion, away from limited interest groups and limited interests to a common human rights interest. So trust becomes a foundation of economic and social stability and of development in cyberspace.
The other example is quite simply that we know who we are again, and that our roles are defined. Digital users are able to exercise their rights and responsibilities in cyberspace. Governments will be able to accept that they have to share sovereignty in cyberspace. Governments will be able to accept that their authority ends at their border, but that they can extend their sovereignty through cooperation, international treaties, and so on, based on the UDHR or a Bill of Rights.
And the private sector will be able to accept that innovation comes with responsibility. Responsibilities become a source of economic prosperity in a new digital economy of trust. I think this is also a very important point, because without this underpinning, none of the governance mechanisms will work.
Another example is quite simply AI. We're frightened that machines will become better than people. But the real issue is not an engineering one. The real issue is not even an artificial intelligence one. The real issue is how to build algorithmic routines that observe codified rights, that observe a digital Bill of Rights. The point is that everybody thinks algorithms are neutral. No. Every algorithm is biased, because algorithms are human made. If we're looking at AI, we have to see how we can actually build digital rights and values in.
The same goes for the example of fake news and freedom of speech. The point is quite simply that a bill gives us a framework in which we can apply our rights and our responsibilities. Fake news and manipulation can actually be checked and evaluated. So the outcome is quite simply that a Bill of Rights would enable a period of healing and of restoring trust in cyberspace. We won't get there overnight; it will take time. To transform the Bill of Rights into experienced reality requires ongoing awareness and capacity building, stakeholder engagement, integrity‑based business models, and legitimate digital governance structures. Thank you very much. Sorry if I spoke too long. I'm open and ready for the open exchange question & answer session. Thank you very much.
>> LORI SCHULMAN: Thank you, Klaus. You're right on time. I'm literally keeping a timer given we have one hour. I'm going to quickly put the spotlight on Sam. Sam Lanfranco, please.
>> SAM LANFRANCO: Thank you, Lori. Let me get my share screen going. Klaus, you have to unshare first. Okay. Okay. Thank you. Just a second.
>> LORI SCHULMAN: While Sam's doing that, I want to note I dropped some links into the chat: a link to the post‑session discussion and a link to a YouTube recording of a pre‑event. And Carlos Alberto dropped in an article drafted by Klaus and Sam. Sam, take it away.
>> SAM LANFRANCO: Thank you, Lori. My presentation is about how we get to where we want to go. My preface is that we're going to do in virtual space what it took humans several millennia to do in literal space. We're trying to build structures that we spent hundreds of years building. I'm going to talk about how we get there, hopefully a little faster. What are today's challenges? We have weak protection for digital rights and a very poor understanding and notion of responsibilities. There's low integrity in our governance, business, and societal entities. Low trust. Our goal, of course, is wellbeing and dignity for all of us in a free and open society. I'm reminded of Yuval Noah Harari and his book "Sapiens." We are what we are and need to improve.
What do we need to understand? As Klaus pointed out, our virtual and literal spaces combine to construct our reality. We have social, religious, psychic, and digital spaces we live in. Good outcomes require integrity and trust. That requires digital citizenship.
A very important point to understand is that responsibilities are not about subservience to government. They're about mutual respect for the rights of others.
A quick history lesson. We knew what we should do after World War I, after that disastrous war, but the French and British competed to shape German reparations. The cost was the Great Depression, the Holocaust, and World War II. After World War II, we did a little better. Without understanding, the efforts we make are flawed and our will tends to be weak.
In today's Internet ecosystem, governments both defend and abuse freedoms. We have a mix of data privacy and limited data privacy; the EU has the GDPR. We have digital surveillance. We have suppression of expression and assembly. India closed down the Internet in Kashmir for half a year.
>> LORI SCHULMAN: I'm sorry to interrupt you. Your slides aren't advancing. If you'd like, I'm happy to share your slides on my screen and advance them for you or ask you to please advance them.
>> SAM LANFRANCO: Okay. I'm ‑‑ let me back up and ‑‑ I don't know why they're not advancing.
>> LORI SCHULMAN: They're definitely not. So you can keep going. But we just want to ‑‑
>> SAM LANFRANCO: Okay. See if you can catch up. Okay. I will present as though there are no screens.
>> LORI SCHULMAN: All right. Then please stop your share screen and I'll attempt to get us where we need to be for you. Okay? Just hit stop.
>> SAM LANFRANCO: I'm sorry.
>> LORI SCHULMAN: Sorry to knock you off the pace. It was mentioned in the chat a couple times. People are interested in the slides. I'll share them. I'll track you. Okay? So you follow along and I'll catch up.
>> SAM LANFRANCO: Okay. Just let me go to my ‑‑ So where are we today? Today we have governments that defend freedoms in the small and abuse them in the large. We also have, in the business sector, what I call digital colonialism. The FANG stocks act like the Belgian King in the Belgian Congo. I'm on slide 5 now.
>> LORI SCHULMAN: I will get us there.
>> SAM LANFRANCO: Okay. There's a digital colonialism taking place on the part of the large businesses. We are basically in digital servitude. We serve up our data in exchange for some social media access and being peppered with ads to buy things. The metaverse will be more complicated, because we'll have a persona or avatar that belongs to them or is controlled by them, and is used to exploit us. At the personal level, we're in digital servitude and trying to acquire digital rights. This is where we're trying to go: digital servitude and digital exploitation today; digital citizenship and democracy tomorrow.
Next slide. Okay. Future digital ‑‑
>> LORI SCHULMAN: No worries.
>> SAM LANFRANCO: All right. Future digital rights and responsibilities. Think of the principles as navigational aids. We need a global digital charter, and we need national bills. The global digital charter is like the Universal Declaration of Human Rights: it's the agreement between nation states. And the national bills are how digital citizenship operates in each country. As we know, digital citizenship varies widely and evolves over time. So the basic principles we need to develop are these: we need a global charter as a multilateral, multistakeholder project, and we need national bills as national multistakeholder projects. The important thing is involvement in the processes.
These are the goals attached to the path going forward: citizenship; integrity; building trust; imparting integrity and trust into society's digital and social fabric and its social contract; and addressing systemic issues through principles, not just symptomatic problems one at a time. Next. I don't have to say next because ‑‑ okay, Lori.
>> LORI SCHULMAN: Yeah. It's not forwarding. It's not forwarding. So just go ahead, Sam. I'll work on it.
>> SAM LANFRANCO: All right. So what is the work to be done? We can strengthen stakeholder education. Education is really important. For a social contract, for a social fabric, we all have to be more or less on the same page. We don't have to agree, but we have to agree on what the areas of discussion are. We aren't there yet. We have to address the weaknesses. We have to understand that digital citizenship is a right. It may vary by country and location, but it is a right. We have to shift to more systemic and less symptomatic approaches to things, and we have to attend to two things ignored in the last little while. We have to focus on the notion of responsibilities. If you look at the pandemic, people are declaring their rights in capital letters and talking about responsibilities as an invasion of their privacy by government. We have to see responsibilities as mutual, not imposed by government. And we have to extend the policy focus beyond government. Businesses have to have policies. Communities have to have policies. Citizens have to have policies.
Moving forward, this is important. I hope you can see the slide. We need more cooks in the kitchen. A global charter and national bills are not like finding a vaccine or baking a cake. We have to build education and consensus. We have to understand digital citizenship. We have to understand multisector governance. We have to understand rights and responsibilities. We have to understand the Internet as an ecosystem, not a vacation site. We need broad stakeholder engagement and dialogue. We need a multitude of campfires: literal and virtual working groups, symptomatic problem working groups, systemic issue working groups, multistakeholder working groups. We need a massive dialogue to build a global charter of digital rights and responsibilities, and national bills to enshrine digital citizenship at the national level. That's it. Sorry for the mess.
>> LORI SCHULMAN: Oh, it was not a mess, Sam. There was a good message here. These are just slides. It's the ideas that count. Not the slides. I want to thank you very much.
For this part of the session, we're going to go to a roundtable discussion. I'm going to ask each of the roundtable participants to react to what they've heard from Sam and Klaus. Each will make a statement of four to five minutes. If you need less, that's fine; that means there's more time for discussion. After the three statements, we'll get into the discussion and wrap up in about 30 minutes.
Our three roundtable participants come from different parts of the Internet ecosystem. I'm pleased to introduce Bruna Martins dos Santos, a German Chancellor Fellow at the Alexander von Humboldt Foundation. She holds a Bachelor's degree in Law from a Brasilian university, and her work focuses on personal data protection and human rights in the digital age.
Brian Beckham is at the World Intellectual Property Organisation, known as WIPO. He's responsible for the day‑to‑day management and oversight of all WIPO domain name operations, including human resources, administration, finance, and Internet technology, as well as related IP and DNS policy activities.
Prior to joining WIPO, Brian was employed in private legal practice, representing clients in trademark, telecommunications, and non‑profit matters. He provided strategic advice and legal implementation in relation to new gTLD applications. He's deeply engaged in the policy development process at ICANN.
Following Brian, it's my pleasure to introduce Stacey King, Policy Fellow at the Oxford Internet Institute. Stacey's research is focused on the interplay between national and international regulatory structures and policies and the emerging development and use of artificial intelligence, particularly as they relate to intellectual property, the public domain, the commons, and data. Stacey works for a U.S.‑based tech company, previously served as Senior Digital and IP Counsel in London for a European luxury goods company, and has worked at several U.S. law firms. She's a member of the Advisory Board of the UK‑based Centre for Democracy and Peace Building. Stacey is participating on the panel in her personal capacity and is not representing the views of any employers, past or present.
I'm going to start with Bruna. Please share your reaction to what you heard from Klaus and Sam.
>> BRUNA SANTOS: Hi, Lori. Hi, everyone, in the room. We have a group here as well just so you know. Just about 10 to 20 people. 20 people I guess. So, hello. Thanks again for the introduction, Lori.
And starting with some reactions to the earlier presentations: I would say that maybe, from a very high‑level position, you could say that citizens don't fully understand their digital rights. But I would also say that when we live in a world in which governments continue to pose threats to whatever rights we have, it diminishes our understanding of what those rights can be, what we need, and what rights we do, in fact, have.
So, starting with Brasil: despite the moment we're living through, with continuous institutional and governmental threats and attacks on the current state of digital and human rights in the country, it's interesting to see how some things still come up, to our minds and to society as well. Brasil, luckily enough, has created a good tradition of multistakeholder participation, through processes such as the Internet Steering Committee, the set of principles for Internet governance in Brasil, and also the construction of legislation such as the Data Protection Bill. Those were the two main processes that allowed everyone to be at the table with the same level of parity with legislators, the private sector, and anyone else who just wanted to participate. In this process, the commitment of the main actors behind them was the key part of it all. We were lucky enough to have (?) at the time who made sure that everybody was going to have something to say.
And throughout this process, I guess Civil Society and many of the sectors in Brasil got to be educated in all of this: what multistakeholder participation is, what lawmaking processes are, what the stakes are, and how we can better negotiate and talk to more people at the same time.
So that is one thing, and that's pretty much an issue for everyone that's here: we need to continue including more and more people. Not just the policy wonks. Not just us. Whoever else wants to join the Internet governance and regulation conversation.
And just to wrap up these initial points: when we start to speak of a common set of principles or a bill of digital human rights and responsibilities, we need to discuss who will be the ones responsible for implementing it. We have seen in the past years some very interesting initiatives, such as the Internet Rights and Principles Charter, the Contract for the Web, and the most recent initiatives from the UNSG, the Roadmap for Digital Cooperation and the Global Digital Compact, which was just launched.
But if we are talking in a space that's very keen on multistakeholderism, and yet discussing a set of principles that's going to be negotiated, applied, and deployed in a multilateral or even top‑down way, that's not necessarily the way forward.
I would also point out that it's important in this conversation to try to understand who we are aiming these initiatives at. Who are we talking to? How can we better engage other sectors and people in the deployment of all of these ideas and principles and sets of rights? Because some of them might not have worked, and some of them are still in the making. And I'm honestly not yet sure if we need a new set of principles, a new set of rights, or just to reemphasize the ones we have, if governments are just going to continue to refuse to uphold human rights and digital rights all over the world.
So I guess I'll stop here because I might have gone over time. So thank you, Lori.
>> LORI SCHULMAN: Thank you very much, Bruna. No, you were right on time. No worries. Next, Brian. Brian Beckham.
>> BRIAN BECKHAM: Good morning. Thanks for the invitation. I should start by giving a slightly boring caveat that I'm speaking in a personal capacity.
I would start by saying I completely agree with what's been said before. What I wanted to focus on is this: let's say we've accepted universally a certain set of propositions in terms of, you know, rights and responsibilities online; I'm already looking ahead at next steps. If you look, for example, at the UDHR, you have concepts of security and order, rights against infringement, and free participation in society and life. So far, a lot of the focus has been on what's done with one's data online. What's done with your data by intermediaries. By platforms. Et cetera. And that's a very important part of the conversation. And it presents a really interesting dilemma in terms of privacy questions.
And one of the things that I've noticed, especially this holiday season shopping online, is that there's a bit of a free‑for‑all going on. It's really difficult to know with whom you're transacting business. And one of the things that I've seen emerging is a desire from legislators and policymakers to address that. The USPTO, for example, recently rolled out an electronic identity verification process, so that applicants and registrants for trademarks in the United States have to identify themselves. And in the Digital Services Act, there's discussion of removing exemptions from liability if platforms don't provide contact information for the vendors on whose behalf they're acting.
So this really keys up the tension between, on the one hand, privacy and control over what's happening with one's information online as, let's say, an ordinary user, and, on the other, that same concept of privacy and anonymity applied in a commercial transaction sense.
And so one question is whether there's a way to address this tension. On one superficial level, of course, there's the question of whether there can be a dichotomy between anonymity for private use and a lack of anonymity, or information about vendors, for commercial and public uses. It also touches on free speech issues.
So, looking at this through the lens through which I operate in my daily work: we run what's called the UDRP at WIPO. That's a way for rights holders to tackle cybersquatting or infringement of their brands online. It's a process that was developed about 20 years ago and has been working very well since.
One of the questions that came up and was discussed at a conference on the 20th anniversary of the UDRP is whether there are lessons from the UDRP process that can be applied to platforms. When you go online, for example, if I scroll down my Facebook feed or I'm shopping on one of the big platforms, you know, let's be honest, there are a lot of infringing and fake products out there. And there are, of course, possibilities to flag those up to the platforms. But what happens after that tends to be a bit of a black box. As the reporter, I don't know if the good has been taken down or if the person has reacted to my notice to the platform. And if I were the vendor, I'm not sure what the process is for dealing with that.
So what we hear from rights holders is that dealing with disputes on the platforms is still today seen as a bit of a black box. There's not really any clarity around the process: who the decision‑makers are, whether there are counter‑notice possibilities, et cetera.
One of the things we looked at from the UDRP perspective is whether there are aspects of this global system for addressing trademark abuse that can be applied to platforms, to identities, to speech, to the various issues that come up at this intersection of the desire for privacy and control over information on the one hand, and transparency for digital transactions on the other.
And what we kind of arrived at, in terms of the lessons learned from the UDRP, if we accept the premise that there's a foundational need to address these issues, is that the core principle is clarity on process. In the UDRP, you have several legal criteria that must be met. They're listed online in black and white for both the accuser and the defender to see. So there's clarity on both the legal and the substantive process. In the case of the UDRP, there's a human review of the complaint. Obviously, at Internet scale, that's not possible. So what kind of emerged was that you need some sort of automation, and I know Stacey is going to talk next about AI issues. But even with automation, you need agreement on the principles: the legal standards, the due process standards, et cetera.
So if you are able to deploy some sort of automation for disputes that arise, whether they revolve around speech or transactions online, there is then the question of whether there's a need for an escalation process, where the decision is plucked out of the automation and turned over to a neutral decision‑maker.
Then a core question, and this actually came up in an article published earlier this year in the "Financial Times" on the question of due process, is whether there's also a benefit to publication of decisions. Certainly, we've seen this to be the case in the UDRP.
Again, as I mentioned, if I see a fake set of something being sold on a platform and I report that to, you know, Facebook or Instagram, it's not clear to me what happens to that complaint. In the context of the UDRP, by contrast, all of the now 55‑some‑odd‑thousand decisions that we've published at WIPO are available online for everyone to see. So you can look back and see what the decision‑making process was, how the result was arrived at, and what the result is. And that helps to inform future decision‑makers and future complaints processes.
In fact, in our case, we were able to produce a body of jurisprudence around that. But the core concept is clarity on the process throughout, from the complaint to the decision‑making process, to help inform people what the rules of the road are when we talk about rights and responsibilities in terms of digital transactions, questions of free speech moderation, and so on.
So with that, thank you very much, Lori. I'll turn back over to you and look forward to the discussion.
>> LORI SCHULMAN: Thank you. I'm going to move now to Stacey King. I'll note we have a question in the chat. When Stacey is finished, I'll pose the question to the group.
>> STACEY KING: Thank you, Lori. I want to do a quick callout: today is Human Rights Day, so this discussion is coming at a great time. 73 years ago, the UN General Assembly adopted the Universal Declaration of Human Rights. Lots has changed since 1948. One of the fundamental changes we've seen is the transference of the individual from a mostly real‑world environment to a split residence between the real world and the digital world. And these are all the digital spaces that we live part of our lives in, whether it's websites or social media, mobile, connected devices, whatever it may be. Work, now.
When the Internet was originally envisioned, it was seen as a way for academics to connect and to share content, or in some cases for governments to keep information more secure. But it has clearly moved way beyond that way of seeing the Internet. It's now seen as something that moves markets, and as a way to position countries in terms of global political power.
And as we've built these layers on top of the Internet that give governments and companies and other people insight into what has traditionally been considered part of our private lives and our physical beings, we're also seeing some level of control, knowingly or not, transferring with that.
The people who built the worldwide web, the people who built the Internet, or the devices and businesses that operate off it, didn't have nefarious intent. Technology has been built on this concept of can we do something. It's a puzzle. It's curiosity. It's built off of development and innovation and the desire to change the status quo. They didn't have dreams at the time of taking down democratic systems, or of monetizing human weaknesses for profit. These same companies and individuals, again, approached it as a can‑we‑do‑it problem. Did they seek profit? Of course. Companies aren't doing these things for free. But did they seek profit at the expense of human rights? No. I don't believe that.
Until recently, many people didn't think of something like the UDHR as being applicable to a company's business model, or to engineering architecture, or to the digital spaces they use. They saw it as more applicable to governments who oversee communities, people, and their lives. More and more, though, these lives are being lived online, in digital and borderless spaces regulated by the companies that run them. And there are many who understand the loopholes, the weaknesses, and the ulterior uses of these technologies and systems, and who do use them at the expense of human rights.
For the average person, and that can be an individual, a regulator, an engineer working at one of these companies, or a CEO at a startup, how and where we lose our individual rights in this space is really murky. It's hard to understand the technology itself, what it can and cannot do. And we are moving into worlds like the metaverse you're hearing about now, where a person takes a truly digital form in a digital world but with real‑world implications, and toward this idea of general artificial intelligence, truly intelligent machines. We may one day turn over systems or operations to a machine that makes military decisions, or that ranks citizens by their value to a given society by some algorithm. These systems, who builds them, what they're built on, what they collect, how they're used, by whom, and the safety nets against misuse by those with ulterior motives, become really, really important.
The technologies themselves have huge potential benefits. These things can do really great things for humans. But as the worldwide web has maybe taught us, they also have the potential to be destructive if not developed and used properly. This doesn't mean we should prevent the development and use of new technologies, but rather that we should develop and use them in a manner guided by those pillars that the UN holds up: peace and stability, development, and human rights. Which seems, you know, a given. But this is where it becomes difficult.
To do this, we have to actually inject a new cultural mindset, different from the one that's developed over the past three decades. We need to move away from the question of can we do something to the question of should we do something. Should we build these systems or use them in a certain way? There needs to be more proactive thought going into these developments.
Human rights are critical to this analysis. We've already seen examples of human rights violations linked to AI, for example, and these are the early days of AI. Mostly this is in the way AI has been employed, such as use by governments to identify protesters through facial recognition, or in the underlying data used to construct algorithms. And we know what the encroachments powered by digitization and big data have brought: I don't think it's an exaggeration to say that many democratic societies are being tested to their limits, in part by the misuse of data to target individuals with misinformation.
So it's impacting our social structures, our markets, and us as individuals. How do we address these issues and make sure human rights become a fundamental piece of development, business models, and digital systems? That's the big question. It's not something that can be solved by one group; Sam pointed out that need for collaboration. It isn't as easy as simply regulating the space. It's not as simple as stating, well, we shouldn't allow this technology to exist. It's not as simple as pointing at corporate profit or power as an evildoer. Nor, on the flip side, is self‑regulation the answer. Power can corrupt. We can't just leave it up to the people to read the policy and acquiesce. People have a base level of rights even in spaces they don't understand.
We're in an unfortunate place of mistrust right now. What we need is a combination, a collaboration, of all the groups involved, to sit down and agree there is a baseline we want to protect in these digital spaces, and to brainstorm on how we do that. How do we build this in? It's a baseline of rights an individual has regardless of where they are and what they're doing. This is a tough thing to do. Everyone has gone to their corners and is insisting their way is the only way. And let's be honest, some stakeholders have made better attempts at collaboration than others, and the mistrust can be for good reasons. We need to find ways to get past that. If we can't get those who fundamentally understand these technologies and how they're built to the table, we're not going to be able to do this.
Meanwhile, the systems that the UDHR helps to hold up, peace and development and stability, are eroding. It's critical that we take an honest look across all of our stakeholders. Not at how much money we want to make, or how strong we want our militaries to be, or a philosophical theory of what the individual should want, or an amazing solution that's not technically implementable. We need to take a look at the basic human rights that help secure ourselves and our societies, and translate them to the digital persona. It's going to take all of us agreeing on a base level of playing rules as these things develop. Disrupt, innovate, all those things we like. Ideally, to make our lives better.
>> LORI SCHULMAN: Stacey, thank you. That was very profound in looking at the different aspects of the question.
We do have a question from the participants. I'd like to throw it out to the entire roundtable, and I'll ask Bruna to go first if she has thoughts about it. The question comes from Raquel Renno Nunes. Given the aspects mentioned here, do you think a digital Bill of Rights could help shift the understanding of universal connectivity from a mere development and economic aspect to a more freedom‑of‑expression aspect? The right to access information, the right to education, and currently even the right to health.
>> BRUNA SANTOS: Thank you, Lori. Thank you, Raquel, as well, for the question. Well, I think this has been pretty much on the agenda for the past years. If you look at the UNSG's Digital Cooperation Roadmap, there has also been a multistakeholder roundtable aiming to look into connectivity issues and how they speak to the other things Raquel mentioned: access to information, rights, education. And there is at least a streamlined position that these things are all intertwined. But to me there's still the implementation side. Not just implementation, but what are governments, in fact, doing apart from just acknowledging that this is a right, that it's connected to freedom of expression, to access to information, and to everything else?
So I don't know how to answer this question fully. But I do think this is a topic that has indeed been on a lot of our agendas and in a lot of initiatives for the past years. And I do also happen to think this is an issue of implementation, beyond just a general acknowledgement that we need to improve connectivity and that it is also a fundamental right, and everything else. So I guess I would put it like that for now.
>> LORI SCHULMAN: Thank you, Bruna. Brian or Stacey, do you have anything else to add?
>> BRIAN BECKHAM: Nothing to add from my perspective except to agree, the short answer is, absolutely. More and more, the Internet is integrated into our daily lives. Absolutely, yes would be my reaction.
>> LORI SCHULMAN: Thank you, Brian. Stacey?
>> STACEY KING: Yeah. I would agree with all of that. You know, as more and more regions really come online, whether it is through a laptop or through a mobile device or some other means, and as systems are being deployed more and more behind the scenes that impact people's lives, that connectivity piece and getting people access is absolutely critical.
I would encourage as much as possible that we look at how we enable governments to manage this versus necessarily relying on the private sector in a lot of these spaces. How do we, again, collaborate? I mean, it keeps coming back to collaboration. How do we collaborate to try and get that connectivity out there?
>> LORI SCHULMAN: Thank you. Stacey, you said something that intrigued me: people have rights, even in spaces that they don't understand. So one of the challenges is not just the end user not understanding, but governments themselves not understanding how the Internet works. We've seen testimony in the United States where U.S. representatives have asked tech industry representatives questions, and those questions, unfortunately, sometimes conveyed ignorance of how stuff works. And I want the group to talk a little bit about that. We're talking about very high‑level concepts of human rights. How do we connect how stuff works with these very high‑level concepts in a way that would translate this idea that, well, even if you don't know what your rights are, you have them?
>> STACEY KING: Yeah. This, to me, is the pressing question that we need to work out quickly. Because when you look at what we see today in the digital space, you're right: people don't understand it. There's a small part of the population who understands how these things operate, how they connect, what they're built off of. Even within a lot of the companies, people don't fully understand how it all works. And it can be really technical.
As we move into things like a metaverse, which isn't likely to be built by one company but by a wide variety of companies and individuals off of a variety of technologies, and as you move into things like AI, it is very, very easy in a short period of time to lose a sense of how things were built, what they were built off of, and the bias that went into them. You know, there is bias that goes into these systems. It's not all harmful bias. But that's the key: how do you track down those harmful biases? Trying to explain that to a group of people gets more and more difficult. And as we build these systems up, it's going to be harder for us to document and track.
And that discussion right now is critical. How do we educate people with a base level of knowledge to really understand what they are giving up and what they're gaining when they use these technologies? And when they're built, how do we build them in a way that is responsible? If we can't get that right, and get it right soon, I think we're going to see some real problems.
>> LORI SCHULMAN: I'm going to interject with a thought. Very often, capacity building is discussed in terms of end users. I think there's a huge need for capacity building at the governmental level, at the regulatory level: having those who actually legislate rules understand what they're legislating. You can fall into traps otherwise; we've seen it already with some of the privacy laws and with some of the questions being raised about the European Union's proposed AI regulation. I'm going to ask Brian or Bruna if they have additional thoughts, anything else they'd like to chime in about.
>> BRIAN BECKHAM: Thanks, Lori. One thing. What Stacey mentioned gives a lot of interesting food for thought. One of the things that strikes me, just looking at the news these days in terms of collaborating and shared values: unfortunately, I guess I would push back and say I'm not sure that we're even there. And so that kind of begs the question. You know, one person's free speech is offensive to another person. So how do you deal with that, especially when the lines between what happens online and offline are increasingly blurred?
You know, if you look at GDPR Recital 4, it talks about the fact that privacy is not an absolute right; it has to function in relation to other rights in society. In the U.S., we have this concept that you can't yell fire in a crowded theater. I think maybe it's high time for people, for legislators, to put down some markers on what the boundaries of acceptable conduct are. And that might help frame the discussion going forward. Thanks.
>> LORI SCHULMAN: Thank you, Brian. Bruna?
>> BRUNA SANTOS: Yeah, maybe just building up from there. I think in Brasil, and not just in Brasil, I guess all over the world, we're kind of seeing this moment of legislators setting some boundaries. When we look at things such as the whole discussion about platform governance and intermediary liability and everything related to that, we have been changing a lot in the ways we all frame this discussion. You can say that the majority of the debate around platform governance started around Section 230 and the e‑commerce directive, around how we didn't want to be too heavy‑handed with the Internet environment and companies and such. But looking at the Internet we have in 2021, we can also look back at legislation such as the Brasilian one and say, well, that was a good compromise for the time. Now there's this increasing need for us to continue developing boundaries or lines, still with a human‑rights‑centered approach, and still upholding the multistakeholder participation we speak so highly of.
There is this change, kind of a new tone to the question of how we can, indeed, regulate the Internet. That is also something that's very interesting to see right now. I mean, even speaking about very bad regulations and must‑carry obligations, such as the ones Bolsonaro tried to implement in Brasil, I do agree there's kind of a change in the tone.
>> LORI SCHULMAN: Thank you, Bruna. We have three minutes left, so I'd like to give the room an opportunity to ask a question, and then we'll give the final minute to Professor Lanfranco. Is there anybody in the room ‑‑ we're so pleased to see so many in the room, to be honest. Some of the rooms I've clicked into have had one or two people. It's great to see so many people. Thank you. If you have any questions, please physically raise your hand. Bruna will call on you. Yes.
>> BRUNA SANTOS: We have Yari here.
>> LORI SCHULMAN: Thank you, Yari. Please go ahead.
>> YARI: Hello. Yes. I just want to ask how you could address questions of responsibility when it comes to, for example, automated decision‑making processes that need to, for example, attend to people (?) in health, or that need to make a decision in some process, where there's a person, a patient, or someone to whom they are giving the service. So I would like to know how you could relate that to human rights violations, and how responsibility could be addressed in these kinds of cases. Thank you.
>> LORI SCHULMAN: Does anybody want to take that question? Sam, you can take it, too, if you choose; it's not limited to the roundtable. Anybody on the panel? Klaus? If you'd like us to follow up with an answer, we can do that as well. Klaus? I'm sorry.
>> STACEY KING: I would say that's the big question, not just in terms of medical use but for all of these technologies as they become more and more interwoven into what we have traditionally seen as human activities, medical advice, those types of things. Those are the places where we really need to sit down and say, okay, how do we take the basic human rights and basic regulations that we already have for those human activities and carry them over into that digital space? It can't be a free‑for‑all, because it is interwoven with our lives now, in fundamental ways that we don't necessarily see.
When you do a Google search, or any search engine search, right now, you're proactively putting information out there. You know what you're typing in, you get some results, and you can choose from them. But when you're getting a medical opinion on something, you may not know that in the background it's an algorithm that reviewed the scan of your lungs, or whatever it may be.
And so we do need to look at our laws and regulations and industry ethical guidelines and all of those things: are they fit for purpose? And if they're not, and if they can't be extended to cover this, how do we remedy that? What do we value?
>> LORI SCHULMAN: Thank you, Stacey. Colin Hayhurst has a question. If you can ask it quickly, we'll close out the discussion, and I'll remind everyone that the session will continue: Klaus dropped a link into the chat, and you can go over to that link. Please go ahead.
>> COLIN HAYHURST: I have many questions. I'll join the chat afterwards. A quick one, I guess, would be picking up on Klaus' point about a digital Bill of Rights. I would suggest, you know, that things are happening too fast. We can't wait another decade for changes. We need to have companies, special interests, respect the rights that are already there, which are being broken.
So I guess my question would be: Klaus mentioned that digital governance is dominated by special interests, not the common good. How would you propose to bring those special interests into that process, and have them participate constructively?
>> KLAUS STOLL: I want to answer that question from the point of view of (?) of the last two years, basically writing the series on the UDHR and Internet governance and looking at each article of the UDHR. And to be absolutely honest, I began that series of articles thinking, okay, we need to rewrite the UDHR somehow to fit the digital age. And the more we looked at it, the more we were gobsmacked, and in awe of how applicable the UDHR actually is to cyberspace. It's not a question of writing something new. It's a question of translating one into the other: how to extend the UDHR into cyberspace.
That brings me to the central question which Bruna asked right at the beginning. Who's implementing it? Who are we entrusting it to? That's a good question.
What we found out, and there was a question before about this, is how can we get these governments, these private sector entities, and Civil Society to take and respect the UDHR as the ultimate standard, and then act based on that. It's not about who. I think the players are in place. And special interests have an absolute right to exist; I respect special interests, as long as they are exercised inside the framework of fundamental human rights. Simple as that. There's nothing ‑‑
>> LORI SCHULMAN: Klaus, I'm going to stop you there. One more sentence. Because we're well over time.
>> KLAUS STOLL: It's okay. Thank you.
>> LORI SCHULMAN: Your final thought. I'm sorry. The IGF is running on a tight schedule. They will just cut the link, and that would be a pity.
So I want to thank everybody. I want to encourage the people to go to the Zoom room that we posted in the chat.
>> KLAUS STOLL: Just give me a minute to open it. Be patient.
>> LORI SCHULMAN: Yeah. Okay. Klaus will open it. Klaus, you can sign off and open it; that would be the easiest for you. I want to thank everybody. I'm very much looking forward to this discussion in the next few minutes, and over the course of the next year and onward.
We want to thank the Government of Poland. The MAG. The organizers of the IGF. This has been really a wonderful opportunity to discuss some fundamental issues that people have been thinking about and will continue to think about moving forward.
So have a great rest of your day at the conference. We look forward to seeing you next year when hopefully we have a Part 2. Thank you.