IGF 2024-Day 2-Workshop Room 5- WS137 Combating Illegal Content With a Multistakeholder Approach-- RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR:    Good morning from Riyadh, Saudi Arabia, and also welcome to our online participants.

My name is Auke Pals, and I will moderate this session, Combating Illegal Content with a Multistakeholder Approach.

I'm not doing that alone. We have great speakers here in the room today and online, and that's for the greater good, because this is a quite sensitive topic and a complex challenge in the Internet governance arena. Today we're going to discuss who decides what's allowed online, how to prevent censorship, and how to ensure the openness and freedom of the Internet while regulating content online.

So first of all, I would like to welcome our speakers. In the room today we have Arda Gerkens from the Dutch authority for online terrorist content and child pornographic material. Welcome. Mozart Teragoni from the Brazilian telecom regulator. Tim Scott from Roblox. Deepali Liberhan from Meta. And online is Brian Cimbolic, a chief officer at Public Interest Registry.

First of all, Arda, can I ask you to take the floor and talk about your role?

>> ARDA GERKENS: Yeah, sorry about that. We're the newest regulator, and we execute the Terrorist Content Online Regulation from the European Commission as well as national regulation to combat child pornographic material. We do everything from detecting this kind of material to sending out removal orders, issuing fines, and handling any appeal a hosting party or platform might make. Basically, everything.

And because we're a regulator, I think we also look at the landscape as a whole, because we can send out as many removal orders as we want, but what we actually want is for the material not to be there at all. So we seek ways to diminish that material online.

>> AUKE PALS:    I think you do that by having good collaboration with platforms?

>> ARDA GERKENS: I hope so. (Laughs.) No, indeed, we have regulatory talks with them to see how we can cooperate, to make sure we do the right thing technically, and to debate whether things are possible or not. I think we'll talk about this later: can we block at the DNS level, can we geoblock, and so on. I have regular talks with the platforms to discuss what they are doing, but I don't speak with the platforms as a council, in a group, because they are competitors and will not speak freely when they are next to each other. I talk to them one on one.

>> AUKE PALS: But now they're in the room together. Deepali, what are you doing?

>> DEEPALI LIBERHAN: I work on the safety team, and my job is primarily to make sure our users feel safe online. Our approach to safety is a multipronged one. One prong is having the right policies in place, including policies for tackling illegal content across all its categories. Another is having the features users need to exercise control on the platform, whether it's reporting or tools like Limit and Restrict to customise their experience on our platforms. And lastly, our team works a lot on partnerships. When I started out at Meta a decade ago we had maybe one or two safety partners we worked with, in India, for example, but today we have a global network of over 500 safety experts, and we use that expertise to help inform a lot of the work we do on the safety side.

>> AUKE PALS: Thanks very much. Mozart, from the Brazilian point of view, how are you regulating content online?

>> MOZART TERAGONI: Sure. First of all, thank you for the invitation. We are the telecom regulator, so we don't deal directly with content, but as there's no content regulator in Brazil, we essentially do what the courts order us to do on this issue. When we receive a court order, which could come from the Supreme Court, the Superior Electoral Court, or sometimes a lower court, we tell the telecom operators to take down those websites. We cannot do that at the DNS level, obviously, so they do it at the IP level on their networks. That's more or less how we are trying to deal with it. We also have NIC.br in Brazil, which is responsible for DNS and IP registration, but it is not government or state owned or anything like that.

I am also on the council there, so we comply with court orders at the DNS level as well if they go to NIC.br. But speaking mostly from the regulator's perspective, we tell the operators to comply with the court order and take those websites off the grid.
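
[A minimal illustration of the mechanism Mozart describes: turning a court-ordered list of blocked domains into IP-level null routes that telecom operators could apply on their networks. The input list, the resolver use, and the router command syntax are illustrative assumptions, not Anatel's or any operator's actual tooling.]

# Sketch: translate a court-ordered blocklist of domains into null-route
# commands an ISP could apply, since the blocking happens at the IP level,
# not the DNS level. Hypothetical formats throughout.
import socket

def resolve_ipv4(domain):
    """Collect the IPv4 addresses a blocked domain currently resolves to."""
    try:
        return {info[4][0] for info in socket.getaddrinfo(domain, 80, socket.AF_INET)}
    except socket.gaierror:
        return set()  # the domain no longer resolves; nothing to block

def null_route_commands(domains):
    commands = []
    for domain in domains:
        for ip in sorted(resolve_ipv4(domain)):
            # Drop all traffic to this address (syntax varies by router vendor).
            commands.append(f"ip route {ip} 255.255.255.255 Null0")
    return commands

if __name__ == "__main__":
    for cmd in null_route_commands(["blocked-site.example"]):  # hypothetical order
        print(cmd)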

>> AUKE PALS: Very interesting. I'm curious to hear others' opinions in the room about that.

Let's move to (microphone going in and out)... Roblox's involvement with content online.

>> TIM SCOTT: I'll use this. Okay. In a very similar manner, we take a fairly multistakeholder approach to this. I think it goes to the heart of the point of this session. Safety, the safety of our consumers and users on the platform, is right at the heart of what we do and always has been since our inception 20 years ago. We approach this from the idea of partnerships: we provide the tools and features for people to use, we adapt those policies as situations change, and we have these conversations with governments, et cetera. Rather than acting top down and drawing a line under it, we are constantly in that dialogue, so we understand where the risks are coming from and can continue to make the platform as safe as possible.

>> AUKE PALS: Thank you very much. Now let's move to Brian. Brian, welcome online. Brian, from the point of view of the dot org registry, how are you involved in this?

>> BRIAN CIMBOLIC: Yeah, hi, thank you very much. Thank you for having me, particularly remotely. I work for Public Interest Registry, which operates the dot org top-level domain as well as charity-related TLDs. Dealing with content at the DNS level is difficult, and typically it's not the right place for content to be dealt with. We have a pretty robust anti-abuse programme that focuses primarily on technical harms, what in the ICANN world is called DNS abuse: phishing, malware, botnets, things like that.

However, we recognise that sometimes the scale of online harms can be so great that, if other actors aren't stepping in, we are prepared to step in. So we have partnerships with, for example, the Internet Watch Foundation to deal with child sexual abuse material online. We've been working with them for five years, and we have a process in place where we work downstream with registrars, registrants, and hosting providers to try to get that content removed so we don't have to act at the DNS level. If there's one image of child sexual abuse material on a file-sharing site, we can't remove that particular piece of content. The only thing we can do is suspend the entire domain name. While that makes the abusive material unreachable, it also renders all the other benign content on that domain inaccessible through the DNS. So it's not that it's impossible, but you have to be careful and deliberate when dealing with online abuse through the DNS.

>> AUKE PALS: Thank you very much. So we've heard some techniques for regulating and removing online content. I would also encourage the room to participate in this discussion, so it's not only this panel talking. If you have an intervention, stand up, move to one of the microphones, and we'll get you a mic; we encourage all that interaction. If there's any intervention right now from the audience, please come forward; otherwise, I'll give the floor back to our panel.

And to give a little overview on online regulation, because there are quite a few regulations dealing with online content, like the DSA and the Terrorist Content Online Regulation: who can tell me something about that regulation? To whom can I give the floor?

>> ARDA GERKENS: I can tell you the Terrorist Content Online Regulation is basically the regulation giving every competent authority in Europe, and every country should have one, the power to send a removal order to any platform or service that is aiming at customers in Europe. That's read very broadly: it could be content in any European Union language, or maybe if you let people pay for your services in euros, that's already a European connection. Once we give you the removal order, the material needs to be taken down within one hour. One hour takedown time, no questions asked; after which you can contest it. If you don't take it down within one hour, we can fine you for that. If you don't take it down at all, we can fine you even more.

If you are, for instance, based in the Netherlands, and we have some really interesting companies in the Netherlands, for instance Reddit and Snapchat, to name two, and Discord, which are based there or have their legal representatives there: if they host terrorist content twice in a year, then we will tell them they are of special interest to us and we'll go into talks with them about how to keep this material off their platform. So that's the regulation. I think it's quite obvious that child sexual abuse material is illegal in itself; basically, we shouldn't even have to debate that. It's just taken down. Other regulations coming up include, of course, the Digital Services Act. We also have e-Evidence coming up. We have the video one; I don't know the abbreviation.
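
[To make the one-hour rule concrete, here is a minimal sketch of how a platform might track compliance deadlines for incoming TCO removal orders. The data model and status labels are illustrative assumptions, not any authority's or platform's actual system.]

# Sketch: track the one-hour takedown window the TCO Regulation attaches
# to a removal order ("one hour takedown time, no questions asked").
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=1)

def utcnow():
    return datetime.now(timezone.utc)

@dataclass
class RemovalOrder:
    url: str                                # hypothetical order contents
    issued_at: datetime = field(default_factory=utcnow)
    removed_at: datetime | None = None      # set once the content comes down

    @property
    def deadline(self):
        return self.issued_at + TAKEDOWN_WINDOW

    def status(self, now=None):
        now = now or utcnow()
        if self.removed_at is not None:
            return "compliant" if self.removed_at <= self.deadline else "late: fine possible"
        return "pending" if now <= self.deadline else "overdue: fine possible"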

>> AUKE PALS: Talking about the DSA, who will give a little introduction to the DSA?

>> MOZART TERAGONI: Maybe it would be better for a European to talk about the DSA, but I can say it has been very inspiring for Brazil. We have some pieces of legislation under discussion, though we still don't have any bill approved by both Houses, and what has been discussed in Congress is pretty much inspired by the European experience: the DSA, the DMA, how we can deal with that. But nothing has yet proved good enough for our Congressmen to approve at a final stage. So what we end up with will probably be inspired by European legislation. What I can say from the bills proposed in Brazil is that we are trying to aim only at undoubtedly harmful or illegal content. We're not giving a lot of room for the regulator, whoever that will be, to exercise much judgement about it. Only what is really clearly illegal or harmful is going to be removed a priori, I would say.

>> TIM SCOTT: This is working now. In the UK we have the Online Safety Act making its way through implementation at the moment. The illegal harms codes have just been published; I think there are 2,500 pages of information and guidance, and that's not an exaggeration. So they're taking quite a forensic approach to this, which I liken to a DDoS attack in terms of trying to keep up with it sometimes. There are clear illegal harms, but also content harmful to children, which is a far more grey area.

And a lot more open to interpretation in terms of how you react to it.

I guess the focus of the regulatory requirements in the UK is on what the evidence base is, but also on what mitigating measures you're taking on your platforms to keep users safe from harm. And that, in contrast to what you're describing in Brazil, gives quite a lot of scope to work with the regulators and say: look, these are the risks we see on our platforms, these are the measures we're taking, and this is the dialogue we're entering into to demonstrate we're coming up to scratch on what you're expecting. I think it's been a really collaborative experience thus far, which hopefully should lead towards meeting a shared goal. Colleagues and I have just come from a meeting with the Digital Cooperation Organisation. Again, to that point: what is it you want to achieve as a regulator or government? What do we want to achieve for our users, and what's the shared ground? How do we reach that? Rather than top-down legislation that says let's ban X, and I don't mean the company, let's have a dialogue about how we might achieve that shared aim.

>> AUKE PALS: Deepali?

>> DEEPALI LIBERHAN: I thought I could give a little bit of a top-down overview of how companies like us deal with content, because we have to do it at a global level and make sure the Internet is not splintered in any way. A lot of our takedown policies actually predate much of this legislation, covering content that is illegal as well as content that might be harmful but not illegal; our child protection policies, for example, go beyond child sexual abuse. We have an approach of removing this content, and we have technology that proactively removes it, a lot of the time even before somebody reports it to us. What we have to deal with at a very local level is each country's legislative regime for specific content. So we have a geoblocking policy, and those geoblocking policies are designed essentially to comply with the laws of that particular country. As a company we're dealing, you know, with the laws of Brazil, for example, or the laws of India, relating to very specific content.

Now, in each country there is a slight, and sometimes broader, gap between our community standards and our geoblocking policies. So I think the challenge really is for us to handle both the category of content that we don't allow on our platform, that we don't think should be on our platform, and the additional content that countries' regulations cover, and to try to bridge the gap between the two by having dialogues with the regulator. It's not that we necessarily agree all the time, but these are really important discussions to have, because content that is illegal in one country will be free speech in another country. And we've dealt with a lot of those situations. The question for us is: what do we do with that piece of content?
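
[A minimal sketch of the two-layer decision described above: community standards applied globally, plus country-specific geoblocking for content that is illegal only locally. The labels and country rules are invented examples, not Meta's actual policy engine.]

# Sketch: globally violating content is removed everywhere; content that
# only violates one country's law is geoblocked there and left up elsewhere.
GLOBAL_VIOLATIONS = {"csam", "terrorist_content"}                            # removed worldwide
LOCAL_RESTRICTIONS = {"BR": {"electoral_disinfo"}, "DE": {"nazi_symbols"}}   # hypothetical rules

def moderation_action(label, viewer_country):
    if label in GLOBAL_VIOLATIONS:
        return "remove_globally"
    if label in LOCAL_RESTRICTIONS.get(viewer_country, set()):
        return f"geoblock_in_{viewer_country}"
    return "leave_up"

assert moderation_action("csam", "BR") == "remove_globally"
assert moderation_action("electoral_disinfo", "BR") == "geoblock_in_BR"
assert moderation_action("electoral_disinfo", "US") == "leave_up"  # legal speech elsewhere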

>> AUKE PALS:    Arda?

>> ARDA GERKENS: It's really interesting what you're saying, because you're basically saying 'we do a lot', and still we find Facebook challenging removal orders for terrorist content, on the grounds that the order came not from the Netherlands but from another country, and stating that the material glorifies terrorist actions, which is not against Facebook's rules but is illegal under European law. So I find it sometimes difficult. We have these discussions about what is, to me, quite clear content versus borderline content, because it's different in different jurisdictions. And I do understand that can sometimes be challenging: if you look at Europe, whether it's white supremacist content or jihadist content, that's quite clear to us; if it's somebody from Catalonia speaking on something, it might be viewed differently. But as for the claim that 'we do a lot': I'm not just looking at Meta or Roblox, but if the industry did more, we wouldn't be having this conversation and we wouldn't need legislation. We should be able to say: this is where we draw the line. For me it was interesting that, for Meta, unless there is an order from an authority, they feel they need not comply with the legislation.

>> AUKE PALS: There's a gentleman in the room as well. We'll come over with a microphone. Thank you.

>> AUDIENCE:    Hi.    Thank you.

I'm Andrew Campling, with the Internet Watch Foundation, which is interested in the removal of CSAM from the Internet.

Thank you to PIR for partnering with us, entirely voluntary efforts, which are making a big impact.

Platform operators, in my view, don't always comply with legislation, for example by switching on end-to-end encryption, which hides CSAM on their platforms. I'll pick on Meta because I know the numbers: on Facebook, about 30 million instances of CSAM were being found a year, and then they switched on end-to-end encryption on Facebook Messenger, with which you can't see the content. It's implausible that the material has gone away, but it's a way of avoiding having to do the moderation.

So the terms of service the platforms have are useful, but they're not actually enforced most of the time. For example, you don't have meaningful age verification and, you know, surprise, children tell lies about how old they are. Again, I'll pick on Meta, apologies for doing this twice, but the reports show billions of dollars of profit made from children who are too young to use the platform, and that's equally true of the other platforms that don't have appropriate age verification controls, or even age estimation controls.

So to get to the point: regulation is absolutely necessary, but only if it's attached to significant consequences for the platforms, and preferably for their senior execs. The example I'll finish with: Telegram was the outlier in this space until their CEO was arrested. They immediately joined the IWF and are now actually searching for and removing CSAM from their channels. That shows that when the senior execs are in jeopardy, they will comply. Maybe that's where we need to start.

>> DEEPALI LIBERHAN: In the transparency report we published for this quarter, off the top of my head, for CSAM in particular I think we removed more than 7 million pieces of content; for bullying and harassment, also seven-plus million pieces; for suicide and self-injury, more than 5 million pieces; and that's just one quarter. We proactively detect this content, often even before somebody reports it to us; for CSAM we probably report the highest numbers. I don't know what the number is this year, but I don't think that suggests we don't do anything. We do a lot in terms of the content we remove. In terms of end-to-end encryption, I think the suggestion here is that end-to-end encryption means there's no safety. We actually work to ensure there are safety measures in place alongside end-to-end encryption. On the safety side we have a prevent, detect, remove approach for end-to-end encryption, which says: let's think about preventing these interactions in the first place. That's why we put really strict messaging restrictions on Messenger and Instagram, so a young person is not able to be messaged by an adult or a teen they are not connected with, whom they don't follow.

Also, end-to-end encryption doesn't mean your content can't be reported. It can be reported, and we can take action on it.

Also, in terms of public surfaces, we can still run our technology across the public surfaces of end-to-end encrypted services, and we do that as well, and we have reported content that we found there.

If the conversation is: is end-to-end encryption bad, should we not have it? I don't know if there are any privacy advocates in the room, but there are a lot of people who would argue that getting rid of it would be a bad thing.

>> AUKE PALS: I'm curious, do we have them in the room? Any privacy advocates in the room?

>> DEEPALI LIBERHAN: But I think that's a different conversation, and it's a debate we've had in many countries. I don't think any country has decided to pass a regulation banning end-to-end encryption, because creating a back door for one means a back door for everybody. So there are a lot of factors we need to take into consideration, but that's a separate and important discussion.

Just because a service is end-to-end encrypted doesn't mean we can't have safety mitigations in place. We can do a lot.

>> AUKE PALS:    Thank you very much.    I also put a question on the screen.

So I would encourage everyone in the room to participate and log in on the website Menti.com, code 773 6669, to answer the question: in what way can we shape collaboration between regulators and industry without regulators losing their independence?

We have some regulators in the room. While we collect responses, I'm also curious about the perspective of the regulators. To whom can I give the floor?

>> MOZART TERAGONI: I think you can talk a little bit about that as a regulator.

But I would also like to share the experience in Brazil. As I said, we don't have a regulator for DNS or IPs, but we have a multistakeholder approach to that, and the regulator is on the board, so we can interact with them and come to reasonable solutions together. At the Brazilian regulator, as in the UK, we have panels that include civil society, consumers, the private sector, other government branches, et cetera.

So they can constantly give feedback about what we're doing, and ultimately they can even challenge us in court, if they think that's appropriate.

So that's a good kind of checks and balances: involving all of society, all the sectors, while still keeping the regulator independent. That's something I wanted to share.

>> ARDA GERKENS: I do find this very challenging, because as a regulator you should keep your eye on the ball. Right? What we want to do in my organisation is make sure that, in the end, the Internet is cleansed of terrorist content and child sexual abuse material.

But we also realise we cannot do this without collaborating with every party involved, because there will always be bad actors, always people who deliberately put this material out there. So it's a fight we need to fight together. But I do find it difficult. First of all, there's a lot of debate in my country about the possibility of geoblocking or doing anything at the DNS level.

So you have to have a debate on that.

And the other thing is that when I talk to the platforms, I mean, let's be honest, you're big companies. Right? You're in it for profit, of course, perhaps with a nice tool, but you're in it to make profit, so the challenge is also not to be led too much by the information the companies give you. If you look at the debate on end-to-end encryption: if you're not very well informed by every stakeholder on the Internet, you might propose things like breaking end-to-end encryption, which I think is really not good for the safety of children online either. So yeah, it's very hard to find a balance. But I do think it's very much needed for a regulator to do this closely together with the industry. And I would really be interested to hear what the public has to say, what they see as challenges for us as a regulator, but maybe also as opportunities for us.

>> AUKE PALS: Yes, indeed. We've collected some responses. I'm quite curious about the response on clarity of responsibilities within the ecosystem. Can we hear from that person in the room?

No one in the room?    Maybe online?    Any shy people?

>> SPEAKER: Thank you. We're working on a public-private partnership on content moderation. This is what we really need. We need clarity of responsibilities, because it's a complex ecosystem with lots of responsibilities. We need to talk about where responsibilities lie and how to actually check on those responsibilities, so they work out the way they should.

>> AUKE PALS: Anyone want to add to that? Otherwise, I'll take another one.

>> AUDIENCE: Hello, everyone. (?) From the Brazilian association of Internet providers. Is this working?

>> AUKE PALS:    It is working.

>> AUDIENCE: I want to reiterate and add some clarity on the case of Brazil as well. We've been dealing with a lot of issues. The regulator sometimes asks for things that are outside of its control, and sometimes the ISPs have to do DNS blocking or block services like X, and they do not always have the capacity to do that in the speedy manner the regulator would like. So we have this back and forth. It's always a challenge. And clarity of responsibilities, I think, would be the most important thing.

>> AUKE PALS:    Thank you very much.

Go ahead, Deepali.

>> DEEPALI LIBERHAN: I wanted to say that I think it's really important for companies like us to have a continuing dialogue with regulators, especially while they're in the process of legislating or, more importantly, implementing. For example, I know our teams have done a number of deep dives with the Ofcom teams; we've met them a lot and been consulted on a lot of the rules they're passing. Those dialogues are important because companies like us can talk about the work we're already doing and also understand the intention behind those rules, so that we can get to the substance of the matter.

So I think that sort of dialogue is really important, and it should definitely be encouraged.

>> AUKE PALS:    Thank you.

>> MOZART TERAGONI: Just to add a little to what Deepali just said: I agree with her. I think it's really important for the industry. But sometimes other sectors of society see that as some kind of loss of, or threat to, the independence of the regulator.

But I agree with her.

But to add to what Ian said about X, and maybe to bring Brian into this as well: the X issue in Brazil was a very interesting one, because the court order was to take the service down for a certain time, until it complied with the laws in Brazil, and the DNS name in question was outside Brazilian jurisdiction, outside Brazilian sovereignty. It was not a dot BR domain. So we had to go through the ISPs' and telecom operators' networks to comply with this court order. It's a little trickier, harder to do; we could not act on the domain within our own sovereignty. So I believe we should be talking internationally about that.

And maybe we should be establishing agreements so that, at the very least, court orders are shared and known to each other, things like that. I know Brian can talk about dot org; we can talk about dot BR, what we call a ccTLD, a country code top-level domain. But if it's another country's code, we cannot do much from Brazil. So I believe we really have to discuss this internationally, as we're doing here now.

>> AUKE PALS:    Yeah.    Thank you.    Brian, do you think that content moderation should be a joint effort?

>> BRIAN CIMBOLIC: Yeah, I think I come at things from the voluntary practice model more than the regulatory model. Again, we're different from a Meta: we don't have content, we're infrastructure, we point via the DNS. That doesn't mean there's not a role for us. The gentleman from the Internet Watch Foundation mentioned a sponsorship, a partnership really, and I just put it in the chat, that PIR put in place and that any registry, be it a ccTLD like dot BR or a gTLD like dot org, can take advantage of. We're a not-for-profit; PIR is a charity, which makes us slightly different from most gTLD operators. So we're sponsoring this as part of our nonprofit mission, so that any registry operator can work with the Internet Watch Foundation if child sexual abuse material is found. They can also adopt programmes to block or prevent the registration of domain names that are known to be used for CSAM in other TLDs.

So I think there's a lot of room for voluntary practices built around what are known as trusted notifiers, the term we use in the DNS space, where you work with an expert organisation like the Internet Watch Foundation, because we as registries don't have the expertise, the tools, or even the content to go out and look for and find CSAM. In fact, it's illegal for us to do so. So it's pretty essential that we work with organisations like the Internet Watch Foundation, or other trusted notifiers, like the Red Cross for identifying fraudulent fundraising sites.

So, working across industry and with NGOs, there's a lot of room for improvement across platforms and across registries and registrars, and that's something we're actively exploring.

>> AUKE PALS: And that also touches on shared resources. I've seen a response from the audience on shared resources, materials, and databases. Who gave that answer?

>> AUDIENCE: Basically, I was answering this question. I can imagine that technical resources you can share in a safe way are maybe one of the easiest things to set up while keeping the independence of your regulators. That's why I answered that.

>> AUKE PALS:    Yeah, no, no, that's clear.    Arda?

>> ARDA GERKENS: I think we have a problem here. I also see more remarks on sharing information within public/private partnerships. I know, for instance, that it's difficult for the platforms to accept hash databases coming from a governmental organisation, because that would basically mean, indirectly, a government telling them what to host and what not to host on their platforms. Right? On the other hand, I do think we need to solve this problem, because in the coming years we will build up a lot of information and databases on both child sexual abuse material and terrorist content. I would be very interested in having the databases you have, because I'm pretty sure you have some, and I think you would be very interested in our databases, too. So how can we solve this problem?

>> DEEPALI LIBERHAN: I think that, you know, we are required as a U.S. company to report to NCMEC, which is where these considerations come from. And we've actually worked with NCMEC in the past to try to address issues, not this one specifically, so I'm just broaching that as an idea of something we can maybe talk to NCMEC about. One of the things we have done: we report to them, and NCMEC works with law enforcement across the globe. We've worked with countries on a downloader tool that helps download this content in a very safe way, because when you're sending these reports across, it's important that it's done in a secure and private way. We also work with the Tech Coalition, which is an alliance of tech companies, to go beyond just the content. The Tech Coalition runs a programme through which participating companies can share signals about accounts and behaviour.

This goes beyond content, because we know that predators don't stay on one platform. Right? I'm talking specifically about CSAM here. They move from platform to platform. What Project Lantern does is enable us to share those signals, and participating companies can receive them and run investigations. In the pilot phase of this, we received a number of links from Mega that violated their policies. So there are ways available to address CSAM, to go beyond that and address signals and behaviours as well, and to tackle this issue at a holistic level. But on your specific point, I don't know if you've had any discussions with NCMEC, but we're happy to...

>> ARDA GERKENS: I also think this won't solve the problem. If we want to work together, it cannot be that you merely receive information from governmental organisations with the question of whether you want to take that content down. I do acknowledge that Project Lantern is an excellent project that helps identify perpetrators spreading this material on your platforms. It's very good that you talk amongst each other; I'm very happy you're doing that. But this still hampers us in stopping information from spreading rapidly online, which is exactly what we want to stamp out. If we look at what we're doing on terrorist content, where we have an incident response, for instance after a shooting whose footage was disseminated online rapidly, it would be great if we could do that for child sexual abuse material as well. If we look at the figures, and this is something you know very well, a lot of these images are duplicates, not unique, and then they go viral. We would like to stop that. As long as we don't share this information with each other, it's not very helpful for the kids.
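
[A minimal sketch of the hash-database matching both speakers are circling: a notifier or authority distributes digests of known images, and a platform checks uploads against them without the underlying material changing hands. An exact hash such as SHA-256 only catches byte-identical duplicates, which is precisely the recirculation case described above; production systems use perceptual hashes such as PhotoDNA or PDQ to survive re-encoding. The file formats here are illustrative assumptions.]

# Sketch: match uploads against a shared list of known-bad image digests.
import hashlib
from pathlib import Path

def sha256_of(path):
    """Hash a file in chunks so large uploads don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def load_hash_list(path):
    # Hypothetical distribution format: one lowercase hex digest per line.
    return {line.strip().lower() for line in Path(path).read_text().splitlines() if line.strip()}

def is_known_duplicate(upload_path, hash_list):
    return sha256_of(upload_path) in hash_list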

>> AUKE PALS:    A remark from the room.

>> AUDIENCE: Andrew again, two brief comments. On end-to-end encryption, there's an easy solution that doesn't break encryption or privacy: image files could be scanned to see if they contain CSAM before sending. That doesn't break encryption or privacy, and it would immediately impact the scale of the problem. To quantify it, we're looking at roughly three new victims of CSAM a second, over 300 million a year, which is a scary number. Then just a second, very brief comment. It's worth bearing in mind that the tech industry as a whole is actively changing some of the underlying Internet standards, again arguably because of privacy, in ways that bypass many of the existing protections. So it will be quite hard in the future to stop bad actors, for example from these types of sites, because some of the changes will mean that parental controls no longer work, even where you have effective parental controls today.

So I think that's an area where regulators may need to challenge the behaviour of some of the tech companies and really question whether their motivation helps, because it will break even the existing, albeit weak, enforcement mechanisms. Thank you.

>> AUKE PALS:    Also, someone next to you.    You did have an opinion.    I saw that.    No?    No, please.

>> AUDIENCE:    No.

>> AUKE PALS:    Okay.

>> AUDIENCE: Thanks. Golam May from IBM. I am still chewing on the question a little bit. I would have understood 'in what way can we have collaboration in order for it to be successful', but on losing independence: sometimes regulators claim they cannot collaborate because of their independence, and they say, you just do what we tell you to do; and the industry says, okay, we'll do exactly that, at the latest possible moment.

So, seconding the point that we need clarity on responsibilities, I think it's also very important that we have clarity on the ultimate purpose of the collaboration.

I think that's also where it sometimes goes wrong: the regulator wants you to do what they tell you, and the industry just wants to spend as little money and effort as possible, and to make as much profit as possible.

I think there's a very strong interdependency. Regulators can never be successful if they don't collaborate with the industry, because they will have laws but will not prevent what those laws forbid. And in the end, industry is very dependent on regulators, because they need a licence to operate. For quite some time they may think they don't need that, but in the end they will. So, yeah, that's my two cents on this.

>> AUKE PALS: Thank you very much, Golam. Anyone else in the room? Please state your name.

>> AUDIENCE: ...Hernandez from Mexico. I wanted to share that part of our duties as industry regulators is to be aware of industry practices. It's very common, and you can try it right now with one of our domain providers: try to buy a domain containing the words "child sex" and it will be available. There are no limits; just pay. So I think this could be a good beginning, just one example of what we need to do as an industry to create balances and good practices. In some areas, such as cookies, countries like Brazil and Chile are now developing guidelines and good practices in their privacy laws so as not to impose assumptions on customers. This is one of the back doors the lady mentioned: they are open everywhere, creating these areas of opportunity to upload illegal content. That's my comment.

>> AUKE PALS:    Thank you.    Brian, do you want to reflect on that?

>> BRIAN CIMBOLIC: Yeah, actually, I was about to raise my hand to do just that. That's one of the programmes we have in place. It doesn't, for example, take a term like 'child sexual' something and block that term. What it does is this: the IWF has identified domain names, registered across TLDs, that are dedicated to child sexual abuse material. For the sake of an example, let's say it's bad-CSAM-domain dot something. The registry suspends that one, the same name pops up in another TLD, and the domain has hopped. Once a domain has hopped twice, it goes on a list, and any registry can now receive that list from the IWF, under this sponsorship from us, so that it can prevent registration of that name in its TLD. It really helps to protect the TLD from being abused, and it also helps to disrupt these commercial brands. Unfortunately, there is a sort of brand recognition around known peddlers of CSAM, so if one domain is suspended and it hops to a similar one, the consumers recognise the brand. It doesn't solve CSAM online, but it introduces friction and makes it harder for the bad guys to continue their brand online.

But coming back to the earlier point, why don't you just block anything that says CSAM or whatever? We've actually explored this in other discussions with regulators, around opioids and narcotics online. An interesting thing we ran into is that they wanted to block known terms for opioids and narcotics, and we were interested in that, and then they provided us the list, and it included things like lemonade: they wanted anything with the word lemonade banned online, not recognising that there might be some really legitimate uses of the word lemonade.

So, to the core of the question: there has to be a good-faith feedback loop between industry and regulators. We want to give regulators the tools they need to produce good, educated regulation, and having that feedback mechanism is key, because you don't want to inadvertently block, for example, every registration with the word lemonade in it. There are lots of generic terms that are street names for drugs, and that's bad, but those street names also have legitimate uses online, and you don't want to inadvertently hamper speech online.
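
[A minimal sketch of the registry-side check Brian describes: refusing registration of second-level labels already on an IWF-style "domain hopping" list, while leaving generic terms such as "lemonade" registrable. The list contents and helper names are illustrative assumptions, not the IWF's actual feed.]

# Sketch: block re-registration of labels that have already hopped TLDs.
def second_level_label(fqdn):
    # "bad-example.org" -> "bad-example" (naive; real code would use a
    # public-suffix-aware parser).
    return fqdn.lower().rstrip(".").split(".")[0]

def registration_allowed(requested_domain, hopped_labels):
    return second_level_label(requested_domain) not in hopped_labels

hopped = {"bad-csam-domain"}  # placeholder label from the example above
assert not registration_allowed("bad-csam-domain.example", hopped)
assert registration_allowed("lemonade.example", hopped)  # generic terms stay registrable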

>> MOZART TERAGONI: I would just like to add two little pieces of information about those issues.

In Brazil we do have a regulation to prevent these kinds of names in domains, but it doesn't always work; it has its flaws. Recently, during a judgement in the Supreme Court in Brazil, the justice asked whether someone could go online and register the domain name 'death to democracy dot com dot BR'. He shouldn't have been able to, but some guy saw it on TV, tried it, and registered that domain. Under our regulation we could suspend his domain within a few hours, without a court order in this case, but not everything always works. And just to add on this question of the regulator's independence: one very recent piece of legislation proposed in Brazil talks about self-regulation of the industry. The industry could create an entity to regulate itself, and that entity would be overseen by the state regulator. So it's possibly a kind of midway measure to deal with the regulator's inability to predict everything, so the industry could be a little more comfortable.

>> AUKE PALS: Thank you very much. I've put a new question on the screen, so you can log on again at Menti.com and answer it. And while you do that, I will give the floor to Brian to answer this question.

>> BRIAN CIMBOLIC: Yeah. Thank you very much. So, the question: what role can technical and infrastructure actors play in combating illegal material online? I've already covered the drawbacks, the technical impediments to dealing with illegal material through the DNS. Think of Craigslist: in the U.S. it's based on a dot org; in a lot of other countries it's not. If somebody uploads CSAM to Craigslist, is the right move to suspend Craigslist altogether, and cut off the millions of others using it? I think the answer is no. But then what's the process to notify downstream? I do want to draw a distinction: if you have a site dedicated to something like CSAM, or to threats to human safety, which we have come across, to me there's no issue; the registry should step in right away and suspend it. These principles are codified in a document that came out in 2019 called the Framework to Address Abuse.

This document originally had 11 registries and registrars signed on; now it's more than 50 different registries and registrars. Basically, it has two principles. The first is that registries and registrars must step in when faced with technical abuse: phishing, malware, botnets, et cetera.

Second, it accepts the premise that, generally speaking, content is best dealt with at the level at which it can be directly removed, the host, the provider, the platform, et cetera, because of issues of collateral damage.

But with CSAM, human abuse and trafficking, and opioids online, if the actors at that level don't step in and respond, then the burden shifts, and I think it does become appropriate for the registry or registrar to step in and do something to disrupt that content. Again, that doesn't always mean suspending the domain name, because of those issues of collateral damage, but they should try to do something.

>> AUKE PALS: Thank you. Any reflections?

>> ARDA GERKENS: I think it's very interesting. If you look at the Terrorist Content Online Regulation, we can only send removal orders to the party that has published the information on behalf of another actor. So 99% of the time that's platforms or social media, and 1% of the time it might be a hosting company for a website. Under the child sexual abuse material legislation, by contrast, we can actually go further up the chain. The problem we encounter in the Netherlands is that we have some hosters who apparently host a lot of this material, and they basically always say: yeah, there's nothing we can do about it, because it's managed hosting, we don't know who our customer is, we cannot reach them, so there's no way we can act.

So hopefully, with the new legislation, and it is quite new, so I don't know how effective it will be, we will be able somewhere down the line to say: well, whether you know your customer or not is not our problem. That's your problem. You need to do something. If you cannot find your customer, I'm going to come to you, because you are the one giving the customer the ability to host this kind of material.

But it is something we are still struggling with, and we are seeking collaboration with the infrastructure parties to see how we can be effective without taking down the whole of the Internet or, I don't know, 30 other websites, when you just want to take down that one image. For us, it's difficult. And I'm really seeking cooperation here.

>> AUKE PALS: I'm also curious about the answers that have been coming in from the audience. Who made the comment about the Clinton administration approach? Is that someone online? No? No one? Or shy people?

Then let's move on

Then I'm curious about what you've been writing now.

>> AUDIENCE:    If you must know I was texting somebody.    I wasn't writing down anything on that.

(Laughter)

But I'm from the university of... in the UK.

Much of what I wanted to say has been covered here, but I wanted to go back to an earlier point, with a little to add on what regulators can do. We did a pan-European study a couple of years ago for the European Commission, where we looked at the state of play for age assurance technologies in terms of how they help children, and we found that the reality on the ground was that there was hardly anything out there. One of the reasons was that industry really didn't know which standards to use, and regulators provided very little guidance; that came up quite a lot.

One of the recommendations we made, and this is going back two years, in the UK, was that regulators need to help industry by issuing guidance and standards so that companies can comply with the laws. It's not enough for legislation to stipulate that something needs to be done; companies actually need a bit of help.

I'm not talking only about big companies like Meta, who have the resources, but also about smaller companies who have obligations to comply not just with media services directives but also with rules on, say, online goods that are inappropriate for children. Thank you.

>> ARDA GERKENS: That's really a very good point. There are several initiatives out there. You were talking about the Tech Coalition, which is a coalition of platforms to combat child sexual abuse online. There's also Tech Against Terrorism and (?), which are both organisations that help platforms on terrorist content. And certainly for the smaller companies this legislation is really hard; there's so much legislation coming at them. I really agree with you. What we do in our organisation is see what other initiatives are out there and help give companies guidance where we can find it. A lot of work has already been done, but it needs to be implemented. Very good point here.

>> AUKE PALS:    Thank you.    Also from the back of the room?

>> AUDIENCE: Can you hear me? I'm David Macaulay. Like Brian, I work for an Internet registry, named Verisign. I want to answer your question, although I'm not on Menti right now. A couple of things technical companies can do is reach out to two target groups. One is government regulators: talk to them, try to explain what we do, how esoteric it is, all the implications of taking action and what that means, and, vice versa, hear the concerns from governments, on terrorism especially, things like that. The other is to reach out to groups like this one. I think participating in a session like this answers your first question: how can we share information and collaborate without regulators losing their independence? So I would encourage organisations that hold sessions like this to do more of them. I took part in one on DNS abuse about a month ago, and it was an eye-opening session where Internet registrars spoke about the implications of taking sweeping actions and the need to be circumspect about what orders might come. Very informative stuff. I want to thank you for the session, and that's my way of doing it, trying to answer two questions at once. Thanks.

>> AUKE PALS:    Thank you very much

Any other comments from the room?

If not, what does industry think of the remark that was just made?

>> TIM SCOTT: I haven't said anything for a minute. On that last comment, exactly: in the UK, I worked on the games industry for about 20 years, in the UK government and then in the trade association, and now for Roblox.

Some of the initiatives we've been involved with have taken that multistakeholder approach: working with the trade bodies and then with the individual companies, in a forum, to share best practice, to tackle problems, and to look at whether there's a white-label solution, particularly in the area of CSAM.

Or CSEA

And it's been incredibly effective. The problem is maintaining that emphasis as people come and go in roles. You will have someone incredibly proactive within, say, a government organisation, who is promoted or retires or goes somewhere else, and they take that knowledge, that enthusiasm, and that willingness to collaborate out of the building with them. It's quite difficult for industry to establish those government relations in a really meaningful way.

So I echo the sentiment of the gentleman in the corner: absolutely, a forum like this, and the IGF, is a great way of doing that. We have people from around the world here, talking about shared issues.

The other thing I would say is that they're truly shared issues. I think we've almost got the wrong people in the room if you want to tackle some of these problems, because we do care. Right? And if our platform becomes synonymous with problems, people won't be on our platform; it won't be economically viable for us to run it. There are other people who are far less concerned with those sorts of things, and getting at those people is the real challenge.

>> AUKE PALS:    Arda?

>> ARDA GERKENS: I think you sell yourself short here. The fact that you're here and acting means others need to comply more. If nobody in this room were complying, the job would be much harder. For me as a regulator, being able to point out, if I ever have to go before a judge, that there are policies out there which are common within the industry, that really helps me fight that battle.

So I'm really happy that we're here in this room together

Also, the discussion on DNS abuse, to me that's really a new sound. For many, many years we didn't want to interfere at the infrastructure level. We always said: no, no, that's content; we cannot touch content, and we can't mess with the infrastructure.

So I'm really happy we're having this discussion now and looking at the possibilities here, because we need to do more than just look at platforms.

>> MOZART TERAGONI: I would add one more piece of information. On what Tim just said, maybe part of the answer to your question is that we would need more stability in public service staff, something like that.

>> TIM SCOTT: I mean, when somebody leaves, they shouldn't take the proactive approach with them. That should be part of the role.

>> MOZART TERAGONI: Part of the answer to that is the first question, independence. Then we don't change when governments change. If a regulator is truly independent, its staff will carry on even though the president or prime minister has changed, so we can keep doing our job in a consistent way over time. So the independence of the regulator is part of this answer, and it's really important for us.

>> AUKE PALS: You all mentioned that you do care. In France, the owner of Telegram was arrested. Is that a way to deal with non-cooperative parties?

>> DEEPALI LIBERHAN: (Microphone going in and out)... working on the safety team and with safety organisations, there's been a lot of work combating harmful content online, like Project Lantern, which is run by an alliance of tech companies. There are also programmes run by a helpline and by NCMEC that let companies receive information so they can remove that content if it is uploaded. The safety organisations we work with also help us design interventions and communications that may be helpful for users as resources. So I think they're definitely a very important part of the ecosystem. And on the Telegram enforcement, should I hand it to you?

>> MOZART TERAGONI: Just to say something really quickly: that question also shows that cooperation clearly isn't even. That's why we need legislation. We need to hold everybody to a bar that is the lowest acceptable level of cooperation. We have some companies that are very cooperative and others that are not. That's one of the reasons we really need regulation.

>> TIM SCOTT: Just to finish on that point: the risk of doing that is that you lower the bar.

If you codify it as a minimum standard, people will say, that's all I need to do, tick that box, and move on. We don't get the innovation. I would not happily change places with you, although Brazil probably has nicer weather.

>> MOZART TERAGONI: Exactly. That's why there's no straight answer. It's a very tricky thing to do, so we have to do it the best way possible. You're right.

>> ARDA GERKENS: The interesting thing is that you said Telegram was not complying. They were already complying with the Terrorist Content Online Regulation; they were very compliant for our organisation, and for others in Europe too. But they really just don't do enough, because I can still see channels with names like 'terrorists' or 'shooting', or other names which clearly point to the type of material probably found in those chat rooms. Even with the arrest, they may take a small step; I don't know if that, in the end, will solve the problem.

Again, I do think it would be nice if the industry itself could also put more pressure on Telegram to be compliant, because I sometimes feel that we have the good actors here at the table and the bad actors not at the table. I don't accept that split. I think you're one group, and you should talk amongst each other, and if you have bad apples in the room you could help get them out.

>> AUDIENCE: Golam again. My answer is the bottom-right one. I think technical and infrastructure actors can do a lot, but all those things cost money. We are a not-for-profit, so for us it's relatively easy, although it still costs. But for a commercial company to spend a really significant amount of money on combating crime, online or offline, by the way, it has to be part of the strategy, the culture, the company values. And I think that is very often the problem.

It is seen as a cost

And as soon as you arrest the CEO, suddenly these are justifiable costs, because by spending them you prevent the second arrest of the CEO.

So our biggest challenge is making the fight against abuse part of the culture and the values of these very large companies as well.

>> AUDIENCE: I completely agree with all of that. The point of arresting the CEO or other senior execs is that there have to be meaningful consequences for noncompliance, and increasingly, for some of the big players, the level of fines imposed is just not meaningful. Even a £12 billion fine for Apple, a 3 trillion dollar company, is not material. Whereas arresting the CEO focuses minds and does drive action.

And then briefly, the other point: yes, the risk of legislative action to enforce compliance is that it might lower the bar for some. But the regulation doesn't have to be static. There's no reason why you can't raise the bar every year, so you keep pushing companies to do more, to try harder. I'm sure regulators would be entirely up for doing that.

And having a general duty of care, for example, absolutely raises the bar, because you're then wide open to challenge if it can be shown that there are widespread abuses, irrespective of whatever measures you've taken.

>> AUKE PALS:    Thank you.

>> TIM SCOTT: We could keep trading this back and forth all day, Andrew, because that in turn creates barriers to the market and then stifles innovation, which causes problems. But I take your point.

>> DEEPALI LIBERHAN: The only caveat I have: in a lot of countries, a lot of political speech is also illegal, and I think we have to be aware of that. A lot of companies, including us, face pressure from governments to remove content which we would not consider illegal but would consider fair political discourse.

>> AUKE PALS:    Comments from the room?

>> AUDIENCE: Hello again. Not speaking personally or for my organisation, but trying to play devil's advocate here. Going back to the point about clear responsibilities: aren't we putting an oversized responsibility on the platforms to solve an issue which, as a society, we haven't been able to solve for such a long time? Are we doing enough as a society to solve the problem at its roots, and not just at the dissemination stage where it spreads? Maybe we're putting too much on the platforms and not doing enough ourselves. Because the discourse has been an us-versus-them back and forth, and perhaps there's not enough collaboration, not enough of a multistakeholder approach with everybody sitting at the table and trying to find the root causes.

>> AUKE PALS:    Thank you.    I did see in the room, as well, comments on public/private partnerships?    Anybody have a comment about that?

>> ARDA GERKENS:    I totally agree with you.    I do believe that many times, with the problems we have, people are only looking at the end of the line. How can we look into encrypted environments to stop the spreading of child sexual abuse material? Well, maybe we should go back and address why this material is even out there, because much of it doesn't come from hands-on abuse but is voluntarily shared images, or AI-generated, or whatever, and still a form of abuse. So I think there should be much more attention in policy for the beginning of the chain, to see how we can solve this as a society. But I also think that what we really need is more cooperation. I'm looking at the regulators, for instance. My organisation is part of the Global Online Safety Regulators Network, which now has eight regulators. It's very interesting to see that some of the platforms, which are not at the table today, basically think that we don't talk to each other. So they tell one regulator one thing, and then we find out they do something different elsewhere. Comparing notes helps, but I think we need to do that more. And on the height of the fines you mentioned: we can alter that, you know? We can make it ten, 20, 30 per cent, whatever. But those are the bigger companies you're talking about, big tech.

But we see a lot of problems with the smaller platforms, like Gab, you name them, smaller platforms that really don't care that much about regulation. For us it's going to be difficult to collect those fines, even if we can impose them. We're going to need to cooperate between countries to make sure that if I cannot collect the fine, I can go to your country and you can help me collect it.

So the Internet is global, so the solution needs to be global.

And so the cooperation needs to be global for us to even be able to tackle the problems that we have.

>> AUKE PALS:    Thank you.    Meanwhile, while you think about your answers, I've put a new question on the screen. Please all grab your phones, join at Menti.com, and use the code.    Deepali?

>> DEEPALI LIBERHAN:    The point you made is, I think, really, really important, and when we make it, I don't think it's always taken seriously. It's really important to have that conversation. One example I will give: as a platform, issuing threats is something we remove as a violation of our community standards, and I think multiple platforms do that. But we still see that people think it's normal discourse to issue threats online. What can we do as a society, and not just as a platform, to address that? Is that the role of schools, educators, parents, as well as platforms? I think that's a conversation that's not always had.

The second point is that we report millions of pieces of content to NCMEC, which go to law enforcement authorities. But we have no visibility into what prosecutions take place based on the reports we're giving. I think that's also really important, because it's the last part of the chain: are the reports we're giving useful, or are we just going in cycles where we remove the content but, at the end of the day, no criminal action is taken against actual predators?

>> AUKE PALS:    Thank you.    Let's move on.

For the last 15 minutes, I want to address the question: how do we prevent legislation that threatens the open and free Internet whilst addressing illegal and harmful content?    I see some people actively typing in the room as well, so I'll give the floor to someone who is not actively typing.    Sir, can I give you the floor?

>> AUDIENCE:    That's my comment, the third one.    I think we need to preserve privacy and so on, but at the same time we need clear procedures to be able to find out who put harmful content online. So we need to be able to reach the channels, or whatever the source is. I think this is needed.

>> AUKE PALS:    Thank you.    The gentleman next to you, did you also type something?    No?    Okay.

Did you?

And any reflections on the comment just made?    Or on the question?

>> SPEAKER:    Just to mention the latest piece of legislation proposed in Brazil on privacy and anonymity online: it proposes that people can keep their real identity hidden from the public, but the platforms should know who that person is.

If necessary, for a court order or something similar, the platform will know who is really behind each profile on its service.

So it places that obligation on the digital platforms. I don't know if that's the best way to do it, but it's one way that's been proposed in Brazil, and we'll have to discuss it.
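To make the model just described concrete: the public would see only a pseudonym, while the platform holds the verified identity and releases it only against a court order. A minimal sketch of that separation in Python, in which every class, field, and check is purely hypothetical rather than the Brazilian bill's actual text or any platform's real implementation, might look like this:

    # Hypothetical sketch: public pseudonym, platform-held verified identity.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class VerifiedIdentity:
        legal_name: str    # known to the platform, never shown publicly
        national_id: str

    @dataclass
    class Profile:
        handle: str        # the pseudonym the public sees
        identity: VerifiedIdentity

        def public_view(self) -> str:
            # Ordinary users only ever see the handle.
            return self.handle

        def disclose(self, court_order_id: str) -> VerifiedIdentity:
            # The real identity is released only against a court order.
            if not court_order_id:
                raise PermissionError("disclosure requires a valid court order")
            return self.identity

    profile = Profile("@observador", VerifiedIdentity("Maria Silva", "123.456.789-00"))
    print(profile.public_view())                 # @observador
    print(profile.disclose("TJSP-2024-0042"))    # released under a court order

The point of the design is that knowing who a user is and showing who a user is are kept separate, so pseudonymity towards the public can coexist with accountability towards a court.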

>> AUKE PALS:    Thank you.    We'll go to someone else I saw actively typing.    Roelof.

>> AUDIENCE:    Roelof Meijer again.    I feel strongly about this, and I have two answers there. One is: by realising that we share responsibility, and acting on it. I feel that's the summary of it all. Speaking just for my industry, for far too long the domain industry could say: no, no, we can't do anything, because we'll break the Internet, or we'll face financial claims, or it will be a slippery slope, because now it's illegal content, then it will be unwanted, then unpleasant, then political, and so on and so forth. There are all kinds of excuses to do nothing, out of fear or something like it. Slowly we're overcoming that position. What Arda is describing is, I think, a first step, because it still says: we don't do content, but we can do something if the DNS is being abused. I think that's a somewhat artificial distinction, but anyway, the most important thing is that we feel we share a responsibility. And of course regulators should realise that we are not the responsible party, but we do have a responsibility. And again, as in my previous reaction, that responsibility, that feeling, should stem from company values.

>> AUKE PALS:    Thank you.    Brian, I saw you actively nodding.

>> BRIAN CIMBOLIC:    Yeah, I agree 100% with Roelof.    I think there has to be the recognition that, while registries and registrars shouldn't be the natural first place to approach issues relating to website content, there is a role for us to play.

Particularly when you're talking not about file-sharing sites, but about sites that are dedicated to a specific purpose, whether that's child sexual abuse material, stolen credit cards, you name it.

There are instances of clear, patently illegal activity where it's appropriate for registries and registrars to step in and do something. So I agree with Roelof entirely.
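As a rough illustration of the triage being described here, where the infrastructure layer acts only on defined categories of patently illegal content, and only when a whole site is dedicated to that purpose, a hypothetical decision helper might look like the sketch below. The categories echo the examples given in the discussion; the names and logic are illustrative, not the actual text of the DNS Abuse Framework.

    # Hypothetical registry-level triage rule, sketched for illustration only.
    from enum import Enum, auto

    class AbuseCategory(Enum):
        CSAM = auto()                # child sexual abuse material
        STOLEN_CREDENTIALS = auto()  # e.g. stolen credit cards
        MALWARE = auto()
        CONTENT_DISPUTE = auto()     # ordinary content complaints

    # Categories where DNS-level action is even considered.
    ACTIONABLE = {AbuseCategory.CSAM,
                  AbuseCategory.STOLEN_CREDENTIALS,
                  AbuseCategory.MALWARE}

    def registry_should_act(category: AbuseCategory,
                            dedicated_to_abuse: bool) -> bool:
        # Act only on listed categories AND only when the whole site serves
        # that purpose, limiting collateral damage to lawful content that
        # happens to share the same domain.
        return category in ACTIONABLE and dedicated_to_abuse

    # A file-sharing site with one bad upload stays a hosting-level matter:
    assert not registry_should_act(AbuseCategory.CSAM, dedicated_to_abuse=False)
    # A site dedicated to selling stolen cards is a candidate for DNS action:
    assert registry_should_act(AbuseCategory.STOLEN_CREDENTIALS, dedicated_to_abuse=True)

The design choice mirrors the distinction drawn above: suspending a domain takes down everything behind it, so DNS-level action is reserved for sites whose whole purpose is the abuse.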

>> ARDA GERKENS:    I think the way the DNS industry took this task on is very interesting, because they established a framework which says: if it falls under these and these categories, that is something we should address at that moment. For us this is a very important question as we assess terrorist content online. We have the regulation, but how do you interpret that regulation? For us it's a day-to-day job to evaluate. I think we have universal human rights, and for me, if content tramples one of those universal human rights, that is something that can be illegal and hurtful content. Anybody can say what they want; freedom of speech is fine by me as long as you're not interfering with those human rights. But once you do, that's something we should act upon.

That's something we tend to forget. We think we cannot interfere with anything online unless it's clearly illegal content, because otherwise we would hamper the open and free Internet and free speech. But we're now in an online world where especially women, though I think many of us, don't even dare to speak or be themselves anymore, because if they do, it can have serious consequences, not only online but also offline, as we've seen. Women, LGBTQ+ people; we know women don't go into politics for that reason. Rape threats, we all see them. That is hampering freedom of speech, and it should end. We should have an online world where we can all discuss whatever we want without those threats.

>> AUKE PALS:    Thank you.    I saw hands raised.

>> AUDIENCE:    Just a small comment, which I put on the screen: to turn this question around.    Oftentimes there's pushback against doing certain things because it would break the free and open Internet. I observe, though, that some companies, I'm not sure if this is true of any in the room, do precisely those things in order to get market access to certain countries. They absolutely change their products and their approach to comply with exactly the rules they're complaining about, to get access to certain autocratic countries, but refuse to do the same in democracies. It's almost as though democracies are being punished, while companies happily comply with exactly the same requirements in autocratic places.

That's all, just to call out hypocrisy like that: if you can do it in that country, please do it here as well.

>> AUKE PALS:    Thank you very much.    Anyone who wants to comment on that in the last five minutes of the session?    I saw you.    No?    Okay.    My sound is off.

Let's wrap up.    Oh, yes, go ahead.

>> AUDIENCE:    Thank you, I'll be very quick.    The question made me think and took me back to the 1990s, when we started talking about whether to regulate the Internet or not regulate it at all. There was a school of thought that argued the Internet should be left alone, without laws. But we've moved far from that, and in fact there are laws that address illegal and harmful content. The solution is not more legislation. Essentially, what we need is more effective enforcement of the laws we already have, by regulators and others, rather than thinking about more legislation.

>> AUKE PALS:    Thank you very much.

>> AUDIENCE:    I don't want to delay things further.    It's a very important session, and I'm sorry I couldn't be with you from the beginning. But since DNS abuse was mentioned: the next workshop, at 3:15 in Workshop Room 2, is about DNS abuse as defined and experienced. If anyone wants to join us, we would be happy to have you.

>> AUKE PALS:    Thank you very much.    As we've reached the closing part of the session, I would like to give the floor to anyone who would like to reflect on the session.

>> DEEPALI LIBERHAN:    It's important to continue to have these conversations, and we are glad to have been invited. This is exactly the kind of information sharing we need, and sharing even where we disagree. I think it's really important.    Thank you.

>> ARDA GERKENS:    I just wanted to add to what you said: indeed, we have had legislation for a very long time, but it's very hard for law enforcement to enforce it. And I do believe that we regulators can play a very good role, because for us it's also easy to talk to the companies, the platforms, the infrastructure companies, and we need to do so to get our regulations right.

>> SPEAKER:    As a final remark, I would like to say I'm very pleased with how this session went, because we started out very technical, about DNS, ccTLDs, et cetera, and we ended with what I believe is the straightforward answer: that this question is about democracy, democratic values, democratic process, and protest if you think something has not been done accordingly, with society agreeing to participate in all of that. Thank God we're in democratic countries, free to talk like this and come to good and meaningful results. That's what I believe, and that's what I think must happen in Brazil. I'm very glad to have participated.

>> SPEAKER:    To echo sentiments on the panel: from a private-sector perspective, talking to regulators, talking to policymakers, and having them talk to us, means we can understand each other better.

It doesn't have to be adversarial, and we don't have to have conflicting, competing aims. In my experience, across a career on both sides of the fence, that is rarely the case. It's about understanding each other, and talking to each other is how we reach that better level of regulation. This is to be continued.

>> BRIAN CIMBOLIC:    Thank you very much for accommodating me joining remotely. I think there's room for responsible and thoughtful action at the infrastructure level, but again, it has to be carefully tailored to avoid collateral damage.

>> AUKE PALS:    Thank you very much.    And thank you to the rapporteur.

>> RAPPORTEUR:    There is a simple conclusion: we need dialogue, and we need to share the responsibility.    Thank you for organizing this session, and thank you all.

>> AUKE PALS:    With this, I would also like to thank all the panellists, the audience for your active participation, the online participants, and the remote moderator, Dorijn. We hope to continue this discussion locally and also at the next IGF in Norway.

So if we do get our session submitted and accepted, hopefully we'll see you there.    Thank you.