The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> AMINE HACHA: Good morning. Can we try it with you? You hear us, but we don't hear you.
>> PEDRO de PERDIGAO LANA: Still nothing? Testing, one, two, three.
(Pause).
>> PEDRO de PERDIGAO LANA: Hello, Amine, one, two, three. Testing.
>> AMINE HACHA: Okay.
>> PEDRO de PERDIGAO LANA: All right, you can hear us?
>> AMINE HACHA: Good morning. You hear us, and can you speak? Try your voice.
>> PEDRO de PERDIGAO LANA: Yes. Are you hearing us correctly?
>> AMINE HACHA: Pedro.
>> AMINE HACHA: Yes.
>> VINT CERF: It's Vint Cerf, can you hear me?
>> PEDRO de PERDIGAO LANA: Can you give us a positive sign if you're hearing us.
>> VINT CERF: We can hear you online.
(Pause).
>> I think now we can hear you guys, perhaps.
>> VINT CERF: Terrific, we're glad to hear that.
>> Good morning, Vint.
>> VINT CERF: Good morning.
>> FARZANEH BADII: Well thank you, Vint, for making the time. I did not know you were not on the ground. It must be late there.
>> VINT CERF: It's about 1:30 in the morning here in Washington, D.C. Unfortunately, I wasn't able to travel; at the last minute there were some problems that arose. But I'm happy to join you this way. So thanks for inviting me to participate.
>> I appreciate it a lot. Also for the two of you; Farzaneh, for you it must be somewhere between good evening and good morning.
>> VINT CERF: Yes, that's why I have my coffee with me.
>> AMINE HACHA: Good morning again, and sorry for the technical problems, but we have fixed them now. First, we'll try with Zoom. Can you hear us?
>> FARZANEH BADII: Yes, I can hear you and my audio's okay.
>> AMINE HACHA: Okay, we will start.
Good morning, good afternoon, and good evening to everyone. Welcome to the session titled Noncommercial Users Constituency Role in a Safe Internet. It is part of IGF 2024, which focuses on advancing human rights and an inclusive Internet in the digital age. It is organized by [?]. My name is Amine Hacha, and I'm happy to moderate the session. I'll hand over to Pedro de Perdigao Lana, who is the online moderator.
>> PEDRO de PERDIGAO LANA: Hello, everyone. I will be moderating online, so please feel free to ask questions and make comments in the chat if you want them to be read aloud, just indicate it in the message and I hope we all have a good session.
Amine, back to you.
>> AMINE HACHA: Thank you, Pedro.
This session focuses on the essential role of noncommercial users in a safe Internet. Our speakers will guide us through the conversation, and together we'll explore the challenges and successes of noncommercial users' advocacy, share cases, and try to identify actionable strategies to create and keep a safer and more inclusive Internet.
I encourage all of you both in the room and online to actively engage with our panelists and bring your experience to this conversation.
Let us use this opportunity to connect, collaborate, and amplify the voice of noncommercial users as we work together toward a truly inclusive digital future.
Let's start with our first speaker, Vint Cerf. I want to thank all of the speakers, especially those joining at such an early hour in their time zones.
Vint Cerf is Vice President and Chief Internet Evangelist at Google. I will give you the floor, please.
>> VINT CERF: Well first of all, thank you so much for inviting me to join you today. As I think about noncommercial users, I think about all the other users and I think much of what you want is what everyone else wants too. You want a safer Internet, among other things.
I bring you greetings from the Leadership Panel. I can tell you that it is our hope that the IGF will continue beyond 2025; as you know, the next meeting is in June in Oslo, Norway. We have communicated to the Secretary-General of the UN our hope that the World Summit on the Information Society +20 review will renew the IGF's mandate for another term.
I also would say that it's our belief that the IGF can be an important element of the Global Digital Compact, in the sense that for the last 19 years we have been evaluating access to the Internet and its utility and reliability. And the Global Digital Compact will need to be scrutinized for its successful implementation.
There is no better place than the Internet Governance Forum for that evaluation and analysis.
I will start out by saying that as I look at safety in the Internet, I've come to believe that there are two very important elements that need to be part of our universe. One of them is accountability, and the other one is agency.
Let me explain. If we're going to have a safe Internet, it is vital that the parties who are using the Internet and providing Internet services and applications be accountable for their behavior. That's true of individuals, it's true of organizations, and it's even true of countries.
And so I believe that we have to incorporate into the way we operate elements of accountability so that parties who are doing harmful things can be held to account.
By the way, that requires international cooperation, because the Internet is global in scope and all of you know that the packets that flow through the Internet are insensitive to international boundaries. It was designed that way deliberately. Which means that victims can be in one jurisdiction and the perpetrators could be in another. And we need international cooperation, again, to achieve accountability.
By agency, I simply mean that we, as users, should have the ability to protect ourselves and to call for help if we need it. And so those two things, accountability and agency, are very important elements.
On top of all that, especially for the noncommercial users, affordability is a very important item. Because we're noncommercial, that means that we don't always have the same facility to pay for access to the Internet and its applications. And so that's an important element that needs to be part of our consideration.
Some of you will have heard my story that I sometimes think we should invent an Internet driver's license. I don't really mean that you have to be issued a license to use the Internet, but many of you will know that before we give young people keys to the car, in most countries there's some training that's required so that these young people can show us that they understand the rules of the road.
And in a sense, we should be training people to know what the rules of the Internet road are and be assured that they know how to behave properly and safely in an online environment.
And that the analogy here is an Internet driver's license.
The broader objective for us, of course, is human rights protection in the online space as well as in the physical space. Those two protections should be essentially the same. We should protect human rights wherever they are potentially threatened.
Among the things that can help us achieve these objectives, I would say that strong authentication and provenance are two very important elements.
What do I mean by that?
We need to be able to authenticate sources. We need to be able to authenticate parties in a strong way so that we know with whom we are dealing, so that we know where information has come from. That helps us evaluate its quality and its utility.
So tools that allow us to achieve strong authentication and provenance would be very helpful in the context of noncommercial user access to the Internet.
There are technologies that can help us in this mission. The secure version of the hypertext transfer protocol (HTTPS), the Domain Name System Security Extensions (DNSSEC), and secure BGP routing are among the various mechanisms that can help us achieve safer and more secure environments.
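As a small sketch of what this kind of transport-layer authentication looks like in practice, Python's standard library verifies both the certificate chain and the hostname by default when a TLS connection is opened. The hostname here is only a placeholder for illustration; this is not part of the session, just an example of the mechanism Vint describes.

```python
# Sketch: HTTPS-style server authentication with Python's stdlib ssl module.
# With default settings, connections fail unless the server proves its
# identity via a valid certificate chain and a matching hostname.
import socket
import ssl

def open_authenticated_channel(hostname: str, port: int = 443) -> ssl.SSLSocket:
    """Return a TLS-wrapped socket to hostname, raising
    ssl.SSLCertVerificationError if the server cannot prove its identity."""
    context = ssl.create_default_context()  # CA verification + hostname check
    sock = socket.create_connection((hostname, port), timeout=10)
    return context.wrap_socket(sock, server_hostname=hostname)

# The defaults are what make this "strong" authentication:
default_context = ssl.create_default_context()
print(default_context.verify_mode == ssl.CERT_REQUIRED)  # True
print(default_context.check_hostname)                    # True
```

Disabling either default (as some quick tutorials suggest) trades away exactly the authentication property being discussed.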
By the way, I'd like to draw a distinction between security and safety. Just because something is secure doesn't necessarily mean it's safe. A classic example of this would be a highly secure electronic mail system which encrypts the traffic while it's in transit, it gets decrypted at the end and there's a phishing message waiting for you to trap you.
So we need to tend to both of them: privacy and security, as well as safety.
But one of the ways to make ourselves safer is to take account of where software comes from. After all, this entire system is based on software. And people who are developing products and services, whether IoT (Internet of Things) devices or applications on the World Wide Web, will often turn to open source software. For one reason, it's inexpensive, essentially free, and that makes it convenient to use.
The problem is that sometimes open source software is not very well supported. There might be just one individual who's contributed his or her software, but isn't in a position to maintain it over a long period of time.
Why is that an issue? Well, often the fact that it's open source doesn't necessarily mean that it's bug free. And in fact, often open source has vulnerabilities in it. So one of the things that we need to remember is that when we're using open source, we should scrutinize it for potential vulnerabilities and evaluate it with that in mind.
So let us not use open source blindly, let's use it with some degree of care.
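The kind of scrutiny Vint asks for can be as simple as checking pinned dependency versions against known advisories. The sketch below uses invented package names and advisory data purely for illustration; a real project would query a feed such as OSV or run a tool like pip-audit instead.

```python
# Illustrative sketch only: flag pinned dependencies whose exact version
# appears in a vulnerability advisory. The ADVISORIES data is hypothetical.
ADVISORIES = {
    "examplelib": {"1.0.0", "1.0.1"},  # invented vulnerable versions
    "otherlib": {"2.3.0"},
}

def audit(pinned: dict[str, str]) -> list[str]:
    """Return names of packages whose pinned version is in an advisory."""
    return [name for name, version in pinned.items()
            if version in ADVISORIES.get(name, set())]

flagged = audit({"examplelib": "1.0.1", "otherlib": "2.4.0"})
print(flagged)  # ['examplelib']
```

The point of the exercise is the habit, not the tool: every dependency is a trust decision, and an unmaintained open source component deserves a second look before it ships.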
With regard to assessing the general utility of the Internet, yesterday I participated in a UNESCO meeting here at the IGF, and they drew attention to what they called the Internet Universality Indicators, or IUIs. They started out with a lengthy list of metrics and compressed them to make it easier for countries or parties to evaluate Internet access.
I would say that the noncommercial users group should look to those Internet Universality Indicators and consider exercising them to evaluate the Internet service that we are all getting.
I can't overemphasize reliability as an element of utility in the Internet. Any of you who are heavy users of the Internet will know that when it doesn't work, you have potential cascade failures. For example, you want to log in to get to your email but your mobile has to give you a second factor for authentication. That's security at work. Except your mobile isn't working because the battery is dead or broken or maybe you didn't get a signal, so you can't get logged into your email and that means you can't get the message that you needed in order to keep your company going and suddenly you go bankrupt. That's an extreme case of cascade failure, but we are increasingly dependent on Internet services for our daily work and living and reliability becomes increasingly important.
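The cascade-failure anecdote hinges on a second-factor code generated on a phone. As an aside, such a code is typically a time-based one-time password (TOTP, RFC 6238), which can be derived with the standard library alone; the key below is the RFC's published test key, not a real secret.

```python
# Sketch of a TOTP (RFC 6238) second-factor code, stdlib only.
import base64
import hmac
import struct

def totp(secret_b32: str, at_time: int, digits: int = 6, step: int = 30) -> str:
    """Derive the one-time code for the given Unix time."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", at_time // step)       # 30-second window
    digest = hmac.new(key, counter, "sha1").digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

# RFC 6238 test vector: this key at time 59 yields "94287082" (8 digits).
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", 59, digits=8))  # 94287082
```

In live use you would pass `int(time.time())`. The dependence on the current time and on the device holding the key is precisely why a dead phone battery can lock you out: the code exists only where the secret and a clock are both available.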
Finally, I just came away from the Policy Network on meaningful access, and I would like to urge the noncommercial users to participate with and collaborate with the PNMA. Their work goes on during the course of the year, and I think it's very important for IGF to have a continued activity during the course of the year leading up to the annual meetings and leading up to the national and regional meetings.
So let me stop there. I hope I haven't bored you to death. I just want to emphasize how important your work is to all of us who care about Internet as a useful tool for our daily living and working. So Pedro, I'm returning the floor to you, I think.
>> PEDRO de PERDIGAO LANA: Thanks, Vint. Just let me check here. First of all, thanks for the overview of the safety issues and the priorities that the noncommercials should have as a stakeholder group.
I think, to maintain the alternation between the online and on-site audiences, we can go to our next speaker. Sorry, we can go back to Bruna Santos. Would you like to take up the mic?
>> BRUNA MARTINS dos SANTOS: Yes, Pedro, thank you. Just as a note, I'm joining this session as part of the digital rights community, as a member of the NCSG, but also a member of the MAG. I have been working for the IGF, shaping messages around the broader next steps around the [?] review, renewal, and many other things.
Going to the first question, I think I would start by saying that the main role of noncommercial entities or speakers is to make noise, right?
As good as our cases, topics, and themes are, I do believe that we're looking at a stakeholder group that unfortunately sometimes needs to shout even louder to be heard, and we need to shout in a very precise and direct way about the harms and the issues affecting us.
And that's because we don't hold business interests, we don't hold original competence for policymaking; we're participants in the process, right? And the shouting-loud part also entails convincing governments to listen to Civil Society voices. It entails talking to business and the private sector about our concerns around product development and the issues that might arise from the implementation of new technologies.
And you know, we need to keep on doing that so that the problems get solved, and so on.
Beyond shouting loud, I do see Civil Society and noncommercial entities in a very good and relevant oversight role, in which we help each and every stakeholder keep in check with their mission and make sure that no product or policy developed in this space, especially for ensuring a safe Internet, results in human rights violations, Internet fragmentation, or fragmentation of the user experience in general.
So I'll stop now, just for us to keep the [?] going. But to summarize it that way: shouting loud and making sure every single stakeholder is in check with their mission.
Thanks.
>> AMINE HACHA: Thank you, Vint, for your speech. And thank you, Bruna, for your comment. Our next speaker is a leader with over 27 years of expertise in public policy, governance, and digital transformation. She works to enhance transparency and public resource management. She has also held a senior role at the Lebanese Ministry of Finance. Her contributions extend to Saudi Arabia, where she implemented advanced automated systems to improve governance and sustainability.
Her dedication makes her a voice for transformation.
>> Thank you, Amine, I'm glad to speak here at the IGF and to participate. I thank you for the invitation.
I will start where Vint Cerf ended. He mentioned the Internet driver's license, and I like the ideas about human rights protection, reliability, and mainly accountability. The main issue is that every one of us is accountable for what we receive and what we share. Sharing is also a responsibility, not only writing our own posts. The business model has changed: it is no longer about receiving content, like TV, newspapers, or radio. Now we are participating in the content. We are creating value in the content. So everything we do will influence the community and the Internet world.
So readers now interact; they post comments, likes, and reviews, and they can definitely influence a program, a journalist, an idea, a media output. They can also influence the visibility of these outlets.
Drawing from my experience as Minister of Information in Lebanon, I have witnessed the transformative power of digital technology in addressing some of the most pressing challenges of our time. At that time we had the major crises that hit Lebanon: COVID-19, the Beirut explosion, and the economic crisis. These were unprecedented events and crises, and in those moments we turned to digital platforms.
What we could not do before, we managed to do in a few months. This is important: we fostered public participation and transparency, and we tried to deliver wide-reaching information to citizens.
Also, and this is very interesting, in collaboration with the UNDP, WHO, and UNICEF, we had a hashtag: take care before you share. This was mainly with the UNDP. This is being accountable not only for receiving content but also for spreading content. We are all influencers, so if anyone shares, for example, a WhatsApp message with someone else, those people will trust the message because it's coming from you, not because [?]
So sometimes they share the message from you without reading it. And here we can create some issues if we are not responsible.
I will conclude with three critical areas where users, and here I'm talking about noncommercial users, should engage meaningfully in the world of the Internet. First of all, awareness and education. Knowledge is power. Users must educate themselves on safety and on the impact of misinformation and disinformation, and you should know the difference between misinformation, which is unintentional, and disinformation, where the news is deliberately manipulated.
The second point where commercial or noncommercial users engage is advocacy and collective action. Communities can unite to advocate for stronger policies: protecting privacy, defending against cyberthreats, and ensuring net neutrality. And we all know the lobbying power we have through, for example, digital petitions or grassroots campaigns. These can also amplify the voice of noncommercial users.
The third critical issue is users' engagement and participation in governance. Traditionally, governance came from one side: from the government, from the cabinet, from the Parliament, and nobody else was engaged. Now we believe that people have the power to be engaged in public policy, so they can contribute to the policy discussion and ensure digital policies reflect diverse needs, not just those of the government or of commercial entities.
And beyond Lebanon, I have experience with big projects that I could also share, if we have time. Let me conclude by emphasizing that a safe and empowering Internet requires the active participation of everyone, especially noncommercial users, because these users, as I mentioned, are very accountable and responsible nowadays.
Their voices and actions are critical in championing an Internet that is secure and inclusive. Above all, we have to protect vulnerable people by bridging the digital gap, so that the Internet can be good for every one of us. Thank you again for this opportunity. Thank you, Amine, and back to you.
>> AMINE HACHA: Thank you for your speech. Our next speaker is Farzaneh Badii. She is the founder of [?]. I don't know what to say about Farzaneh; she is very active in digital rights and Internet policy. I follow her in many working groups, and we have worked together in ICANN and elsewhere. I will give the floor to Farzaneh to tell us more.
>> FARZANEH BADII: Thank you, Amine. Thank you so much for the invitation. As Amine mentioned, I have been involved with the Internet's infrastructure and its governance aspects, as well as human rights and digital rights, for over 15 years. I have also worked a lot on issues such as digital trust and safety.
I can call myself an ICANN veteran now. I don't know if that's necessarily a good thing, but I've been involved with the process since I was young, as a member of the Noncommercial Stakeholder Group and the Noncommercial Users Constituency.
I can tell you a little bit about the evolution of how noncommercial users' interests and participation were affected, how they were empowered or disempowered during these processes, and why it's so important to have this voice, especially in multistakeholder processes.
We appreciate Dr. Milad's and also Vint's and Bruna's interventions, and it is very refreshing to hear about the importance of digital literacy these days, when we are just going down the lane of regulation after regulation and leaving people behind.
We argue that the Internet is for everyone, and so should safety be. When we provide meaningful connectivity for people, we should also keep them safe online. And that is, of course, a challenge.
But the means that we use to keep them safe should not affect and diminish their online presence. They should not affect their human rights. In order to keep people safe, we should not give up encryption, and we should not have unaccountable processes.
One role that noncommercial users have is a very multifaceted approach to safety and access to the Internet. They use various frameworks for their advocacy; for example, they use human rights approaches to evaluate whether governance processes on the Internet work and whether they affect the safety and accessibility of the Internet.
Also, unlike that of governments and nation states, Civil Society's work is usually global and transcends borders. It doesn't matter if you're a user sitting in Kabul or in Brussels; your safety and your access to the Internet are a matter of importance to the noncommercial users.
And noncommercial users, I should mention, usually consist of Civil Society organizations, but we also have technical operators that truly care about human rights and meaningful connectivity; they are not for profit, and they join us in these efforts.
Also, any process that claims to be multistakeholder must have noncommercial and Civil Society organizations in it. Otherwise, it wouldn't be that much different from bilateral processes or public-private partnerships.
To some extent, Civil Society groups and noncommercial users might not use the tried and tested punitive measures that law enforcement agencies and others have used to bring public safety but that affected human rights.
Such punitive measures can have devastating effects on various aspects of digital governance, including access to the Internet and its security and safety. I can tell you a little bit about what NCUC does and our successes, but I feel that could feed into your final question, so I'll just stop now. We can discuss that later.
>> AMINE HACHA: Thank you, Farzaneh.
And we'll move now to our next speaker, Dr. Bali, managing director of a global learning company and a leader in eLearning solutions. He has been a keynote speaker and has been changing the face of virtual education as we know it today in the Middle East. He is an international figure in the areas of AI, robotics, digital transformation, and technology-enabled learning. And the fingerprint of [?] was, and continues to be, visible in Saudi Arabia, across the region, and internationally.
I will give you the floor, Doctor.
>> Thank you very much, and good morning to everyone. In addition to what has been covered by the colleagues, I would like to shed light on the local aspects of this global trend of policy development. As Farzaneh just mentioned, policies that are developed globally can benefit people in other places, but that is not always the case. There are three aspects that I would like to highlight, based on our actual problems over the past 25 years in implementing solutions that rely on the Internet, specifically in education, human resource development, upskilling, and so on, covering different types of noncommercial users, including marginalized people and even refugees.
We have had to face a lot of issues over the past 15 years or more in different Arab countries, and the tendency was that underprivileged people should not have a solution based on technology. Our hypothesis over 20 years has been that technology is not only for more advanced countries, but can also provide efficient solutions in different (broken audio).
Including in places with (broken audio).
So what I would like to highlight, and later we can go into more detail, are three areas; some have been covered, but I would like to talk about them from a different angle: privacy, equity, and accountability. Accountability was covered from a certain angle; I would like to cover it from the private as well as the governmental side.
But first let me start with privacy, where, of course, we need stronger privacy policies. Governments have been jumping into all kinds of new technologies and regulations without having the right policies in place. Many times, policies have been copied and pasted.
(Lost audio).
Okay, so maybe now this is better. Proper privacy laws in different countries are badly needed. There are some international standards, like the GDPR, but we don't have such things in many places in the world, and we have to take into consideration local cultural sensitivities when we do that.
Also, regarding privacy tools and the different types of available tools: as was mentioned before, many people turn to open source because of affordability, but many times this is not 100% safe.
Also, surveillance. We have seen a lot of surveillance efforts in the region by governments or even by Internet providers, going all the way to using spyware like Pegasus to see what people are doing. This affects human rights and sometimes freedom of speech.
Also, as Dr. Milad mentioned, we need to educate people about their digital rights. Sometimes we take it for granted that people know what they want, and therefore we ask noncommercial users to have a major role in developing or drafting new policies while they are way behind. Many people, especially in underprivileged places and in different countries, are way behind what we expect them to be doing. So capacity building and education is a major point.
Another very important point is equity. The digital divide is not only about accessibility of the Internet, which is still a very important point, especially in areas like war zones or refugee settings. In the region we had millions of refugees, and when we started talking about educating refugees, the only method UNICEF knew was to bring them to physical places and teach them in conventional ways. When we said let's go for some kind of blended, online, or AI-based modern technology to reach these people and give them individualized solutions, it was new even for international organizations.
So many times the local development or innovation is by far more advanced and flexible than archaic international solutions that are repeated everywhere, copied and pasted from one region to another.
This is another area where, when we talk about policy, you have to take into consideration that what has been decided in advanced countries is not necessarily the policy that goes everywhere. The local level is more important in many areas, especially when we get to multilingual access or freedom of expression, and the access of end users, even students or other underprivileged people, to different types of content. It's not only about blocking some kinds of content for cultural sensitivities, but also about allowing, as Dr. Milad said, the amplification of opinion, innovation, and creativity produced at the local level by such noncommercial users.
On accountability, I will just highlight a few things. One is that transparency reports are lacking in many corporations and governments in the region: how data is collected, where it is stored, how it is shared, and who has the right to access it.
There is a lack of independent audits; the policies sometimes mention this and sometimes they don't. And in the last two or three years, as AI has become a major tool in digital transformation, and digital literacy has become AI literacy as well, there is a need for local AI ethics frameworks. Not just copying the UNESCO framework, which means little in many of our countries; you need to develop a local flavor, taking into consideration cultural sensitivities as well as the actual situation on the ground in terms of infrastructure, access, and so on.
And finally, there is a need for surveillance accountability from corporations and governments who conduct different types of surveillance that we don't know about, until suddenly, after some time, we discover that hundreds of millions of dollars have been spent to control or gauge what people say, including on social media.
So this is a brief summary of our experience. I can talk more about the educational part or our public experience if we have time.
>> AMINE HACHA: Thank you.
When Bruna started to talk, we did not introduce her very well, I think. We work together with Bruna; she's an ICANN councillor and IGF [?], and Bruna has done a lot of active work in Internet policy. I have a question for Bruna.
It's about how the NCUC and its partners can advocate for equitable and inclusive Internet Governance that respects freedom of expression while countering the overreach of both state and non-state actors in cyberspace.
>> BRUNA MARTINS dos SANTOS: Thank you. And I'm appalled by the conversation in the chat. I don't think that's the case but I wanted to put this on the record.
Going back to the question, I'd like to start by stating that we are living in unprecedented times, where not just the discussions about Internet Governance but also our spaces, the IGF, ICANN, and others, have been challenged by governments and their interests in this conversation, by other stakeholders, and by other processes taking place around the globe.
In 2024, we have seen challenging processes, like elections in more than 80 countries, and also some threats regarding the duplication or fragmentation of the spaces where Internet Governance discussions take place.
And this has all been shifting toward different approaches or different ideas about what the ideal approach to Internet regulation is, what the ideal approach to policymaking is. And I'm very glad to hear the interventions on this point.
In light of that, some of the things NCUC can work on would be, first of all, promoting multistakeholder governance frameworks. It's our role to engage in these spaces and make sure all voices are heard. I'm not just talking about ICANN; I'm talking about [?] with its guidelines, about engaging in the IETF, the ITU, and any kind of space that addresses Internet Governance in any way.
And by doing so, I do believe that noncommercial users need to also advocate for more transparency and accountability, and for our participation in these fora to be at the same level as that of other stakeholders, or at least at an acknowledged level.
What we saw in the GDC process, with the sidetracking of technical community and Civil Society voices, was definitely concerning. And that happened despite our concerns and points about the process.
Another thing noncommercial users can do, and NCUC has been doing a lot, is to challenge overreach in content regulation. We are really strong advocates, Farzaneh is one of them, in ICANN mostly, where every single time stakeholder groups within the space try to advocate for more content regulation, we're the ones who keep raising the flag of "we don't do content in this space."
But beyond ICANN, it's really relevant that any sort of policy approach is human-centered, transparent, and accountable, and that state actions are proportional, keeping Civil Society and academia concerns in mind.
Other than that, I would say advocating for corporate accountability is a high priority and should continue. That means pressuring private intermediaries and registrars within ICANN to adopt due process mechanisms, human rights based approaches and assessments, or anything that can guide the type of policymaking they do.
That goes hand in hand with defending freedom of expression in the DNS ecosystem, something we have been really keen on. But also, going back to the points addressed by other panelists, I'd say that supporting digital inclusion and equity, and educating and empowering stakeholders, is a key and core aspect of our mission, because there's a lot of capacity building Civil Society folks and groups can do, opening our spaces for other organizations to come in and become more experienced within ICANN.
And I would maybe just close by saying that there's definitely no one size fits all for these measures, as we face discrepancies in how states approach the broader idea of safeguarding users and ensuring human rights are upheld.
Unfortunately, we don't yet live in a world where we can simply ask for regulation, because it goes both ways, right? In my country, Brazil, regulation comes in order to safeguard users and make sure everyone has access to the Internet, but in some regions of the world it means using transparency to surveil users or asking companies to share data with governments through unlawful or abusive requests.
But I do believe that the power of advocacy and Civil Society can be a main tool for making sure we reach a safer online space and for hearing the voices of users. This conversation doesn't go anywhere, though, if we don't address the voices of the victims. They are the ones who get their rights violated, the ones who have, you know, their private lives shared on social media, who are victims of deepfakes or anything like that.
I do believe we have a lot of work to do in including more and more voices, but above all, the victims are also relevant in this conversation.
So I'll stop here. Thanks.
>> AMINE HACHA: Thank you, Bruna.
I will give the time now to Pedro to see if there is someone online or any comment. I see comments now.
>> PEDRO de PERDIGAO LANA: We don't have exactly a comment, we have a conversation going on about encryption, privacy, and the possibility of attenuating privacy mechanisms and encryption mechanisms so as to make it easier to prosecute CSAM or other [?] online.
And considering that, I think [?] would like to talk a little bit about this issue regarding lower or higher levels of encryption and how noncommercial interests are involved in this kind of conversation.
So please, you have the floor.
>> FARZANEH BADII: Pedro, this is a topic that we've been discussing for a very long time, and I don't think time would allow us to cover it fully. But let me tell you a little bit about what we do for privacy at the Noncommercial Users Constituency.
As you know, the Noncommercial Users Constituency is a part of ICANN governance. And for a long time, domain name registrants' sensitive and private data was out in the public, and anybody could have access to it through WHOIS.
So as NCUC, we were advocating for privacy. But we were not advocating for just redacting the data or not collecting the data; we were saying, don't make this public. If you need access, have accountable disclosure processes in place in order to get access to it.
We were not too successful in bringing privacy to domain name registrants until the GDPR happened. After that, we managed to advocate for privacy for every domain name registrant in the world, which to some extent happened, and we claim that as a win and a success for us.
But another thing we should be cognizant of is that many of our members, many of the noncommercial users and organizations, work on these safety issues. They work on online safety. When they advocate for privacy, for freedom of expression, for strong encryption, it does not mean that they don't work on trust and safety problems, on information integrity, and on other issues being raised online.
Our position is that we need privacy globally for everybody, and we need strong encryption. And when we want to bring safety, those principles should not be undermined, because then we are not bringing safety to the Internet and Internet users.
I'm sorry, I didn't address the whole encryption thing. But I think that can be a segue to talking about how we can avoid undermining the privacy and security of users online while bringing safety as well. And I think one way is by coming up with alternative solutions.
Alternative solutions, not to encryption, but alternative solutions to how we are going to combat CSAM, and how we are going to combat content moderation issues.
Thank you.
>> PEDRO de PERDIGAO LANA: We have a question to, I think, all panelists: whether the panelists would consider that encryption is a leading cause of a less safe Internet.
I would also like, since I'm seeing that there are some contributions in the chat, Vint, about the encryption challenges and other challenges regarding noncommercial interests that may arise globally and that we need to deal with to ensure a safe Internet, to hear your contribution on that issue as well, be it encryption or the other challenges we're facing nowadays that are important on a global level.
So Vint, I would like to give the floor back to you.
>> VINT CERF: Thank you. I've just been trying to track the conversation in the chat as well as the online audible discussion.
It seems to me that encryption is a really important tool, and for people and organizations that need to protect information, I think we can't escape from the utility and value of cryptography.
At the same time, though, I think we also have to remember that sometimes we have to be able to pierce the veil of anonymity in order to ensure accountability.
As for law enforcement, although some members of that community say that encryption is somehow harmful to safety, many of them seem to have found ways of dealing with bad behaviors despite the existence of cryptography.
And so I would encourage an argument along the lines of alternatives to things like backdoors and so on in order to deal with accountability and law enforcement.
Sometimes cooperation helps. Sometimes, frankly, penetration of some of the groups that are doing harmful things online, like ransomware, turns out to be more effective than trying to eliminate cryptography, which has all kinds of side effects with regard to safety and security.
I think there are others who are online right now that might speak to this as well, but I leave that to your discretion.
>> PEDRO de PERDIGAO LANA: Amine, I'll give it back to you. I would like to read two items, one comment and one question from Benjamin. Benjamin concurred with Dr. Milad's answer about having frameworks and the necessity of having local approaches in different countries, which may sometimes contribute to global frameworks.
And there was a specific question on how we can have meaningful noncommercial participation if there is not enough capacity in the noncommercial community.
But before going back to you, I see that we have a hand up, so I would like to give the floor back to her and then go back on site.
>> FARZANEH BADII: Thank you, Pedro. I just wanted to quickly mention that anonymity is very important for some vulnerable communities. This is how they got their voices heard in the beginning, when the Internet was a space where they could, without having to mention their religion, their identity, their name, go and join online communities.
And I truly believe that the pioneers of the Internet, including Vint, had this vision for everyone to have access to the Internet, regardless of where they're from.
And that's what made the Internet this powerful tool. I believe that we should still preserve anonymity. And in our conversations, instead of looking at identification of users, I think we should also look at holding law enforcement accountable. Sometimes law enforcement is a human rights violator.
And one of the things that I wanted to tell you briefly about what NCUC and NCSG have done, it's a small role, but we have actually worked with the Governmental Advisory Committee to discuss how we can come up with authentication for law enforcement, and also how we can report on requests for access to people's registration data, how we can bring that transparency so that we can actually provide some sort of accountability for law enforcement as well.
This is why the voice of noncommercial users is so important, because they can bring these nuances: that, yes, law enforcement is great, but some law enforcement agencies are actually intent on violating human rights.
Thank you. And I see Vint's hand is up. I was not trying to misframe anything, Vint, just to be clear.
>> VINT CERF: Look, it's Vint again. There are lots of arguments on either side of this equation. Absolute anonymity, however, feels to me like overreach. I absolutely understand and agree with the points made that for some people, being able to speak and to be anonymous in that speech is necessary, because without that they face potentially serious consequences.
However, I believe we still have to maintain that the veil of anonymity has to be pierceable. You'll notice that where anonymous reports are accepted, sometimes the party to whom the report is made needs to know whether or not the reporting party is legitimate, but is held to account to protect the anonymity of that reporting party.
You'll find reporters, for example, who even go to jail in order to protect their sources. But in order for them to know whether the source is legitimate, sometimes they have to know who that source is.
It's not a trivial matter, it's not an easy matter, and there are tensions in both directions. But I'm now finding myself leaning towards accountability, because in its absence, I don't know how to protect people from harm.
>> PEDRO de PERDIGAO LANA: Thanks for the direction. Amine, there is the question from Benjamin, just let me get back to it, about the participation of noncommercial users when there isn't enough space, enough capacity, for noncommercial users in these spaces.
But I would believe that this would be a question for on site speakers and it would be also be nice to collect some questions from the on site audience. Amine, back to you to speak on site.
>> AMINE HACHA: Thank you, Pedro. Before I come back to our speaker here, I see a lot of colleagues interested in adding their valuable input. I will start with [?].
>> Thank you, Amine, thank you for having us as commercial users with you. I'm from the International Trademark Association. Let me start with what Vint mentioned when he started talking.
He said that he's talking about all Internet users, and this is very important. And going back to Farzaneh, when she mentioned that it is important to have privacy, I agree with her. And of course I liked when she said we need a procedure to know who's behind it.
And as Dr. Milad and Vint mentioned, sometimes total anonymity is not really good, because you cannot know if there is someone, let's say, criminal behind it, or whether it's related to disinformation, as Dr. Milad mentioned.
So this is why I think it's important to cover it from many sides, especially when it comes to ICANN's WHOIS, for example. It was a good move to have, let's say, the private information hidden. But I don't think it's important to hide, for example, which company is responsible for a domain or something like that.
Anyway, what I'm trying to say is that this conversation for sure needs a lot of talk, but it needs the different views from all the stakeholders to be sure of what's needed for everyone.
Thank you.
>> AMINE HACHA: Thank you. Next, please.
>> Hi, good morning. Thank you. I'm Andrew, mainly in the tech community, but I'm also trustee of the Internet Watch Foundation, which is in Civil Society.
I completely agree with Vint's point about piercing anonymity in appropriate circumstances. Dare I say, absolute anonymity and absolute privacy take us to the dark web, and we only have to look there to see why we don't want to go there. That's not the right destination. But it seems to be the path that some of us have set us on if we're not careful.
But if I may pose just two brief questions. First, how can we avoid continuing the weaponization of privacy, which in reality is mainly the privacy of adults, where a qualified right ends up overriding all of the human rights of children, leading to an explosion in child sexual abuse material, exploitation, et cetera? And second, how can we ensure that registration data reflects the real owners, data which can be stored privately and accessed only by law enforcement? We need significant reductions in abuse, and if we don't know who the owners of a domain name are, we do see significant abuse arising from that.
So I don't know if any of the panel would like to respond to either of those points. Thank you.
>> AMINE HACHA: Thank you.
>> PEDRO de PERDIGAO LANA: Amine, Bruna has her hand up.
>> AMINE HACHA: Okay.
>> BRUNA MARTINS dos SANTOS: Just very briefly, I think it's really dangerous to frame the discussion about privacy as the weaponization of privacy, especially when you have journalists getting arrested for exercising their, you know, profession, when you have Civil Society activists getting killed and murdered in a lot of parts of the world, when you have female journalists getting attacked on X and many other platforms. I come from Brazil, and Brazil has been skyrocketing in terms of gender based attacks on female journalists, male politicians, queer politicians. And in that sense, I do see privacy as a core aspect of a safe Internet and as a core aspect of these discussions.
When we address privacy as a right, I'm not saying by any means that I want criminals to have privacy, or that I want criminals to, you know, use aliases on the Internet. I'm just saying a lot of users rely on this as a tool to exercise their jobs, to feel safer online, or simply to work on the details of a protest against [?]. Because that's what happened in my country, right?
And the rights to protest and to, you know, freedom of expression, they need to be balanced with the right to privacy and with the needs to fight crime or the needs of law enforcement agencies to work. It's not that one side is more important than the other. We need to advocate for all of it more broadly, and not just for law enforcement agencies to work. I need my rights, to be able to go to a protest and feel safe, or to be able to go anywhere and feel safe in that sense.
I just wanted to reply on that point, because honestly, as a proper civil rights advocate, I just dislike the idea of privacy weaponization.
>> AMINE HACHA: Thank you. Next, please.
>> Hello. My name is Robert Carolina, I'm general counsel with Internet Systems Consortium; we're a root server operator. I'm also a senior teaching fellow in cybersecurity with the University of London. I don't think there's anything I can add to the conversation: there's been a great deal of ventilation about the tension between accountability and privacy, so I won't pile on to that, as much as I would enjoy it.
I think instead I'd like to ask a question about Vint Cerf's other key challenge that he led with, which is agency. Specifically, I'm wondering to what extent we can or should focus more on protecting vulnerable populations. And in this context I should be much more clear about what I mean by vulnerable populations.
I'm not speaking here about people who are vulnerable by reason of their political or sexual or religious identity, I'm speaking here about people who have become vulnerable because of an ongoing diminishment of their ability to think clearly.
I'm thinking here about aging population or people who have come under some type of mental cloud because of health issues.
And what we've seen is an explosion of predation upon that type of vulnerable population through phishing campaigns and similar online crimes. And when thinking about that population, people who've had life savings stolen, who've had significant amounts of their retirement taken from them through criminal activity, I think it's extremely difficult to solve problems like that through the normal tools that we use to instill agency such as better education. I think that's a false horizon.
I'm wondering what else can be done to assist that group, to reduce that kind of predation. And now I am going to be quite difficult by suggesting that, whatever the answer is, it is not increased law enforcement cooperation on a multinational basis.
And the reason I suggest that is because, if we look at where the criminals are often located, and ask the law enforcement officials of, let's say, a less developed economy, whose primary job is to solve crimes that have local victims, I think it's unlikely that one could ever get the type of cooperation that would be effective at a mass level.
So if that's off the table, and perhaps you'll disagree, then what else is there that we can do to assist or to try to interdict that kind of predation on that kind of vulnerable community?
>> VINT CERF: On the presumption that question might have been aimed at me, among others, let me say that accountability does loom large in my mind as you describe that scenario.
If we're unable to identify the source of a phishing attack, I don't quite know how we will deal with the bad behavior.
At Google, as you might know, we try very hard to filter as much as we can of the incoming email if we can detect that it's a phishing attack, or some other malicious content, or that it has malware as an attachment, for instance.
The only reason we can do that is that we have such a large population of users that we can see a lot of the phishing content flowing into the system and we can identify it and try to filter it out into the spam folders.
That's one kind of response, which is just to try to detect and remove it. But if you want to go after the parties who are generating these attacks, I don't know how you could do that without some ability to pierce anonymity. And frankly, trying to get international cooperation is what the Budapest Convention on Cybercrime tried to facilitate. I don't disagree with your point that it may not be effective if the law enforcement component doesn't have the capacity to respond.
I don't really have a comfortable answer to your question, nor a pointed response. But I think we have to keep in the back of our minds that accountability is our only real tool for dealing with bad behavior.
>> Thank you, Robert, I think you raised very important issues about vulnerability and how we protect vulnerable people.
In my point of view, technology should definitely empower more than marginalize. And here, if we would not go international, let's go local. Locally, I think the major issue is to empower people and educate people so that they know what they are reading and how they are acting. And the main action that everybody should take, and specifically the policymakers, is to collaborate and cooperate.
Collaborate not only with regulators. Regulators should cooperate with Google, with Facebook, with every company or organization that is engaged in Internet Governance. We should collaborate also with NGOs to spread awareness.
I partnered with Twitter, with X, with Facebook, with every platform, so that we ensure that we talk in a language that's acceptable. And the most important issue: we have to spread awareness and literacy based on the country we are working in, because different countries have different cultures and different languages. So we definitely have to take into consideration the culture of the country.
I have to tell Facebook and Google and these platforms that we have our own culture, and we have to protect our culture through certain standards as well, not only your own standards. So this is very important: cooperation and collaboration, and education comes on top.
Thank you.
>> I would like to tell you a story that happened to me in one of the advanced Arab countries. And it's not about only one specific type of user. Some international hackers have been, and are still, able to impersonate the digital identity system of that country, which is usually used for the government to talk to citizens or to people with residence. They contacted me, impersonating the police. They said, we are from this police department and we would like to double check your KYC and identity and so on.
And then at some point, they asked, what's your account number, your bank account number? And I stopped there. A week later I called the police; they asked, did they take your money? And then said, it's okay, just forget it.
One week after, a friend of mine in another city of the same country lost around $150,000 from her account, because she did not stop at that point. She thought the government was asking her for some information to update her records. And she's at a top manager level, but she's not well versed in technology.
I think, in addition to all that has been discussed, maybe at some point in time we need some kind of insurance or reinsurance on data loss, because relying just on centralized decisions, the police, or international collaboration might not be sufficient.
Thank you.
>> AMINE HACHA: Thank you, Dr. Milad. We have a few minutes before we close the session, and I'm very interested to listen to our speakers' conclusions.
I want to say that time has run out, but we'll not let this discussion run away from us. We'll continue and build on it. And this is the main reason we decided to hold this session: we don't want to focus on the problems we have as noncommercial users; rather, the session focuses on the role we play as noncommercial users.
And we look forward to this discussion continuing and being built upon, for the continued stability and security of the Internet.
I will start with Dr. Milad to conclude, and I want to thank everyone who is present and online; your valuable input and support means a lot to us. Thank you again.
>> Thank you, Amine, also for inviting me. I want to emphasize that as we deliberate on the future of Internet Governance, we must prioritize inclusivity, particularly for vulnerable populations, which also differ from country to country. This means bridging the digital divide, enhancing cybersecurity, and promoting digital literacy. This is very important; literacy is also subjective. And it means ensuring that technology empowers rather than marginalizes. I conclude by emphasizing that a safe and empowering Internet requires the active contribution and participation of everyone, including the noncommercial users. Their voices and their actions are critical in championing an Internet that is secure, inclusive, and works for good.
Thank you.
>> AMINE HACHA: Please, Bruna.
>> BRUNA MARTINS dos SANTOS: I think maybe I'll start with Vint's point in the chat about the balance between safety and privacy not being static. I do agree with that in general, because it's one of the things I was referring to in the beginning, right? We do have different approaches to many of the issues we have discussed, whether it's encryption or, you know, any other need for anonymity tools for anyone who feels threatened online, to simply exist and perform their jobs.
And we also have business interests and governmental interests, and each and every single one of those is initially legitimate, right? What's not legitimate in my mind is the abuse, whether it's abuse coming from governmental requests for access to data or abuse from law enforcement pushing for further identification of users and so on.
And I think the main challenge within this and any other Internet Governance discussion is to find the common point, right? What's the point we can compromise with and that won't result in any violations to users' rights or violations to the way the Internet is supposed to work or to the core aspects of the Internet in that sense.
So maybe I'll close with that and just add in a plus one to your point, Vint.
>> AMINE HACHA: Thank you, Bruna. Okay. Turn to Vint, please.
>> VINT CERF: I've had more than my fair share of air time. I will say, though, this discussion has been very helpful to me because it sheds light on a variety of different situations that put this whole question of privacy and anonymity and accountability and agency into a kind of dynamic discourse.
I have the feeling that we will never end up with an absolute solution to this problem, that we will find ourselves oscillating around some point that mostly seems to protect people from harm, but sometimes ends up in abuses that we fail to prevent.
I hate saying that, but I'm afraid that it's probably realistic to recognise that we will not achieve perfection. What we want to do, of course, is to minimize harm in all of its potential guises in this environment, and there are a number of different tools that we can use to achieve that objective.
>> AMINE HACHA: Thank you again, Vint. Dr. Milad.
>> DR. MILAD: I would like to close by highlighting again the importance of having a balance between what's local and what's global in terms of policy and solutions. And here I would like to highlight three levels.
One is the infrastructure level in any country. All this discussion is highly affected by how advanced the infrastructure is, and therefore by what the plans are to develop it in order to provide equity and access for all kinds of people.
Second, the data. When we talk about privacy and rights, digital rights and so on, what about access to the data, where the data is hosted? For example, many countries have policies not to host data outside the country, and this creates additional [?].
And the third level is the content, which we sometimes disregard, especially nowadays with AI tools available that can create any kind of content. And in this sense, it goes both ways. It's not only about preventing people from voicing certain opinions, but also about what people, especially the underprivileged or the younger generations, can access from internationally available content. And here there is a need for balance between some kind of surveillance and local cultural sensitivities.
So we need to take this into consideration, and therefore, as Vint just said, there is no one size fits all, and there is a need for some local flavor in the different types of policies that have a lot in common at the global level.
Thank you.
>> AMINE HACHA: Thank you. Please, Farzaneh.
>> FARZANEH BADII: So I'm very chatty in the chat. I just believe that in order to bring safety, as well as uphold those longstanding Internet values such as an open, free Internet, and keep people safe, we need to discuss more about what sort of accountability we are talking about.
What sort of safety issues are we tackling that need actual identification, and how do we actually go about that?
Are we talking about asking people for their driver's licenses? Some people in some countries don't have driver's licenses. These are issues that we have to consider.
And also, as I said in the chat, I don't think this kind of balancing or these tradeoffs get us closer to the solution. We have to say: okay, this is the policy that we have decided to come up with, in a multistakeholder environment or otherwise in a democratic way, and these are the implications of this policy.
And then we will evaluate whether that policy is actually good for the public, for the global public, not just for communities in certain countries.
So with that, I'll just end with this: we need safety for everyone, and we need the Internet for everyone.
Thank you.
>> AMINE HACHA: Thank you, all. Pedro, you can conclude from your side, please.
>> PEDRO de PERDIGAO LANA: No, I mean, considering the time we have, I think you can conclude on site. Thanks. Thanks, everyone, for your presentations and valuable contributions to our discussion.
>> AMINE HACHA: Thank you. The session is over and we'll continue our effort together.
Thank you. Bye.
(Applause)