IGF 2017 - Day 2 - Room XXV - WS107 Out of my Hands?

 

The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> CATHERINE GARCIA VAN HOOGSTRATEN:  Good afternoon.  We would like to start this workshop by thanking the Netherlands Internet Governance Forum for its great support in organizing this multi-stakeholder workshop.  As moderator of the workshop, allow me to introduce myself, briefly present the subject matter of this workshop, and describe the approach that we are going to take here to foster the discussions with you all, on site and online.

I'm Catherine Garcia van Hoogstraten, at The Hague University of Applied Sciences.  This is a follow-up of last year's workshop on what we call sextortion.

And the three main takeaways from our IGF 2016 workshop were:  first of all, that if we want to minimize and control the scale and impact of online harassment based on the nonconsensual distribution of sexual images, it is of crucial importance that we think of it as a socio-technical problem.

That is, as something that spans human behavior, technical interfaces and social context.  Secondly, the current architecture of the Internet and social media enables increased forms of exposure.  This leads to scalability, replicability and searchability of material.

Lastly, we concluded that there is an ambiguity within the online harassment taxonomy.  It has given rise to sextortion.  Sextortion is the blackmailing of an adult or a child with the help of self-generated images of that person, in order to extort sexual favors, money or other benefits from him or her, under the threat of sharing the material.

The expression sextortion, commonly used in public discourse, may lead to an ambiguous and sometimes paradoxical understanding of the crime affecting children.

The use of this term implies equivalence with the crime affecting adults, and may lead to a failure to grasp the more complex nature of the crime affecting children, which, of course, has grave consequences for them.

It does not clearly show that this is a matter of sexual exploitation, and it risks trivializing a practice that can produce extremely serious consequences.  In this IGF 2017 workshop, we aim to explore the key emerging exponential technologies and user-centered actions being deployed by multi-stakeholders, and to identify the challenges, opportunities and implications for Internet Governance.

Bearing in mind that user-centered design lends a good practice model for examining technology through a user perspective, we want to draw attention to the need to develop a technology ecosystem that addresses sextortion, taking into account user knowledge and human factors, and, of course, in a way that challenges technological determinism.  The methodology for this workshop is an interactive discussion.  We will start with an overview video prepared by the European Cybercrime Centre at Europol.  Following this video, some remarks explaining the key importance of this media campaign launched by Europol will follow, and then we have three identified blocks of questions to spark the conversation: participants and invited discussants, online and on site, will react to these questions and give their expert input.

So after each question, we will give the floor to the discussants to introduce themselves and react to the questions.  We will ask participants to signal their convergence, their agreement, by raising their right hands, and to signal divergence by raising their left hands, for each of the questions after the discussion and participation.

We will ask the discussants to justify these viewpoints, both online and on site, and our online moderator will be taking up questions from the online audience to follow the discussion; they are tweeting, as you have probably seen, with the hashtags #ws107 and #NLIGF.

We will wrap up, and Arda will take up on that and on future scenarios.  As a reminder for you all, the goal of this session is not necessarily to reach consensus on the nature of the problem or potential mitigations, but rather to elucidate a variety of frank points of view and perceptions on what the Internet multi-stakeholders ought to be doing in response.

So it's very much hands-on.  And now we will continue, please, with the video.

(Video).

>> GREGORY MOUNIER:  My name is Greg Mounier, I'm the head of outreach at Europol's European Cybercrime Centre, and we do a lot of cybercrime prevention; this is one of the examples of what we have done in 2017.  The story behind this short movie is that our child sexual exploitation operational team received many more reports from victims between 2014 and 2015; they increased by 150%.  These were reports of children being abused as victims of sextortion and coercion through social media, as in the video's two scenarios.

So investigators from the 28 Member States of the European Union decided to look in more detail into this new cyber-enabled crime, and we launched a study.  We surveyed about 30 different child sexual exploitation experts, reviewed a number of cases, and relied on data sent to us by the US NCMEC, the National Center for Missing and Exploited Children, which handles reports on sexual abuse online.  And then we came up with a number of findings.

And then, in order to push those findings and to raise awareness with the public, we also decided to launch a prevention and awareness campaign, and we thought the most efficient way was to make a nice movie really targeting youngsters.  So this is just a 40-second trailer, but if you go on our website, you have the full version, which is ten minutes.

And basically we have two scenarios, which correspond to the two types of cases we see most of the time.  There are two motivations for perpetrators trying to coerce children into giving sexual images.  The first one is indeed to gain access to more sexually explicit content, so they trap and lure kids into providing more.  But the other, and that's a new phenomenon for us, is that a number of organized groups have jumped into the market to generate revenue as well, and that's something fairly new; we hadn't seen that in the three years before.

And so you have these two really different types of scenarios, and these generate different types of perpetrators, different types of victims, et cetera.

If I go into a little bit more detail, an interesting fact is that female children, at 85%, are more likely to be the victims of offenders who are sexually interested in children; whereas males are more often victims of organized crime groups that are more interested in making revenue.  So that's an interesting fact for us, and it gives us some indications in terms of investigative methods.

Another problem with this cyber-enabled crime, of course, is revictimization.  It generates even more child sexual abuse images online, which are then reposted.  We think it's vastly underreported.  Why?  Simply because of the embarrassment for the kids and teenagers about the types of images they are providing, and there's also the victims' lack of awareness that they are victims of a crime.

The crime is not posting sexually explicit images online.  The crime is using them to extort money or to try to lure the victims into a relationship, which is often the case.  So we thought that really engaging in prevention, doing this movie and reaching out to youngsters was one of the main tactics to decrease the crime.  We also established a website with a number of simple tips for kids to protect themselves online, such as how to set privacy settings on social media, because on all social media you can change your privacy settings to reduce the likelihood of being approached by someone who shouldn't approach you.  We also put up information on how to make requests to platforms and content providers to remove sexually explicit images, because knowing it's possible to remove that type of content is something that would help the victims.  So we have, per platform, Instagram for instance, the precise form and how to do it.

And one of the findings of the report is that the private sector has a huge role to play in the prevention of these types of crimes, by making security and reporting procedures simpler and more user-friendly as well.

We had a number of cases where victims were willing to talk or to do something, but it was just difficult for them to work out how to reach the platforms.  Interestingly enough, in the US, 33% of these crimes are reported directly by the Internet service providers.  This is probably because in the US there is a law obliging Internet service providers to report this type of content.  That's not the case everywhere, but it's an interesting fact as well.

I don't know if you want me to go on, I can explain a bit more.

>> CATHERINE GARCIA VAN HOOGSTRATEN: We can save it for another discussion.

Can you go next?

>> So we will start with the three blocks of questions that we have prepared for today.  The first one concerns artificial intelligence deployment.  I remind you that we will first have the discussants' reactions to the question, followed by the on-site participants' questions or remarks.

And then, obviously, the points of convergence and divergence expressed by online participants will be taken up by our online moderator.

So first, just to give some background to this theme of AI deployment: as far as we have researched, there have been some attempts in the ecosystem to tackle sextortion.  Some of them are new; others build upon Content ID-style mechanisms, interesting mechanisms such as the ones used to filter intellectual property infringements.

Some years ago, industry created a hash-sharing platform; yesterday I heard comments about these hash-sharing platforms.  It is cloud-based and was the first collaborative industry initiative to improve and accelerate the identification and reporting of child abuse material across different networks, and it resulted in more or less 90,000 shared hashes.

Lately, we have seen that, due to issues of scale, Internet services have turned to automation.  In 2017, in fact, we observed the emergence of Facebook photo-recognition software to prevent the spread of images that have already been reported and taken down.

So new artificial intelligence photo-matching technologies are used to prevent that same image from being posted on platforms like Messenger or Instagram, and these will, of course, be elaborated on by the Facebook global representative we have here today at the table.  So now the question that arises is: can AI, and I would like to broaden this, because we have discussants from the ISOC blockchain team, blockchain or even encryption-based technology, enable an effective response to sextortion?  So now we proceed in order.

Do you want to start?

>> KARUNA NAIN:  Am I audible?  Thank you so much, Catherine, for having us as part of this.  My name is Karuna Nain; I manage global safety programs at Facebook.  I started out in-country, working in India.  One of my first projects was going into schools; I spoke to 30,000 students in my first few months, just to understand how they use technology and what their experiences are.  You know, we build a product thinking about one thing, but how are people actually using and consuming it?

I'm really glad to be here to talk about how we are thinking about sextortion and how we are playing a role in giving people the tools and the controls that they need online.

I want to take a step back and talk about how we think about safety.  I will be very quick, I promise.  We take a five‑point approach to safety on our platform.

I'm Indian and I speak really fast.  We have policies in place that very clearly define what people can or cannot share on our platform.  To give you an example, sextortion violates our community standards.  It is against our policies to coerce someone into sending illicit images, money or favors under the threat of sharing intimate images.  There are a lot of challenges, which I can go into.

The second pillar is making sure people have the controls to manage their experience on Facebook.  What does this mean?

For instance, for minors, we have some extra protections in place on the platform.  Search for minors is restricted based on their location, their date of birth and, you know, other private information.  Our goal is to disrupt grooming: an adult cannot message a minor unless they are friends with them.

So our goal is to try to stop it at that stage.

But the new technology we have been trying out concerns what we internally call nonconsensual sharing of intimate images, which some people call revenge porn; we don't like to use that terminology at Facebook, for many reasons.  What we announced was that if someone shares an image on Facebook, Instagram or Messenger and you report it to us, telling us it's an intimate image shared without your consent, we will not only take down that image but also add it to a databank and use photo-matching technology to thwart further sharing, so no one can share that image across our platforms again.  We have also started some efforts to see if we could take this a step further: even before someone uploads your image to our platforms, you could come to us and let us know, so we can work with you to use the photo-matching technology and stop the sharing of those images across our platforms.  This is much, much bigger, and it's going to be much harder to get done because there are a lot of different elements at play, but it is something we are committed to trying to figure out, because no one should be able to do this on our platforms, and it causes so much distress among the people going through this experience.  If we can figure this out, it will make a lot of difference.
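
To make the mechanism concrete, here is a minimal sketch of the hash-databank flow described above.  It is an editorial illustration, not Facebook's actual system: it assumes the open-source Pillow and imagehash Python libraries and a simple in-memory databank, and it uses a perceptual hash so that near-duplicates (resized or recompressed copies) still match.

```python
# Minimal sketch of hash-databank blocking for reported images.
# Illustrative only; assumes Pillow and imagehash (pip install Pillow imagehash).
from PIL import Image
import imagehash

# The databank holds perceptual hashes of reported images, never the images.
reported_hashes: set[imagehash.ImageHash] = set()

HAMMING_THRESHOLD = 5  # max bit difference still treated as the same image

def report_image(path: str) -> None:
    """A reported image contributes only its perceptual hash to the databank."""
    reported_hashes.add(imagehash.phash(Image.open(path)))

def is_blocked(path: str) -> bool:
    """Called in the upload pipeline before an image is published."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash values gives their Hamming distance;
    # small distances mean visually near-identical images.
    return any(candidate - known <= HAMMING_THRESHOLD for known in reported_hashes)
```

The design point the panel keeps returning to is that the databank stores fingerprints, not pictures: matching can run at upload time without the platform redistributing, or even retaining, the reported image itself.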

>> I talked about pillars, and I have covered policy and tools.  The third pillar is help and support.  It's super hard: most people don't report these images to us because they don't know that option exists.  One of the things we have been trying to do is figure out how we can raise awareness about reporting on our platform, and how we can make reporting more intuitive.  We have been doing some research, so we partnered with Thorn, and you may have seen the research they published, which was done with David Finkelhor.

It talks about what happens behind the scenes, and it gave us a lot of good material to work with.  They published this fantastic PSA; if you haven't seen it, I totally recommend seeing it.  What it tries to do is destigmatize the issue and tell teens: if someone is troubling you and blackmailing you, you should reach out, speak to an adult who can help you, or report it to the platform.  If you haven't seen this PSA, do see it.

It's a really cute cat video; they have used a cat, you know, to deliver the message.  So I have talked about policies, tools and help.

And the last two pillars that we use at Facebook are partnerships and feedback.  Partnerships are super key for us to get this right; they underlie all the other work I talked about.  We are not experts on many of these issues, and we need to work with the true experts to find out what's going on, which is why we worked on that research and why we are here to talk to everyone and ask: what are you hearing, and what could we be doing more of?  This is very key to all the work we do at Facebook.

I will stop here.  I talked about a lot and I talked really fast.

>> SEMANUR KARAMAN: I'm Semanur.  I work on gender and technology at Tactical Tech Collective.  My immediate response is: no, not on its own, because sextortion exists due to a number of societal factors, including patriarchy, and you can never address such a tremendous problem by only using techno-fixes.  Just to provide an example, I was very happy to listen to the very detailed explanation, but I spent the last couple of months working with politically active women who are survivors of online violence.

And they have very little trust in online platforms.  A lot of them view online platforms as profit-oriented databases that benefit from their data.  I always welcome attempts to collectively think about this and invest resources to come up with solutions, and I think the recent attempt by Facebook to address revenge porn, I believe with the government of Australia, is a good attempt, but we need to be more vigilant.  I'm from Turkey, and I work with a lot of women from Turkey who are survivors of sextortion.  You can't convince many of them to upload their nudes to prevent this problem.  So it's very broad; unfortunately, I don't have enough time to get into this, but all I'm trying to say is that we need an intersectional approach that takes into account a lot of different factors to holistically provide solutions to survivors who come from a broad range of geographies and backgrounds, without further violating their data privacy.

>> CATHERINE GARCIA VAN HOOGSTRATEN: Walid.

>> WALID AL SAGAF: I'm here on behalf of the Internet Society's blockchain special interest group.  Before I move on: do you know what a blockchain is?  This is typically the first thing that some people assume.  So, okay.

>> Does everybody know?

>> WALID AL SAGAF: Not many hands up.  So let me just briefly explain, in maybe a minute.  Blockchain is the technology that was introduced by Bitcoin.  Bitcoin was the first cryptocurrency that is tradable peer to peer, so it has no central bank and no central entity controlling it.  Anything of value, including money, can flow from person to person through a very strict cryptographic mechanism that ensures reliability and protection against counterfeiting of any form or sort; it is a fully vetted system that ensures nothing can be altered or defrauded in the Bitcoin architecture.  So just like money can be securely stored and encrypted in such a mechanism, you can do the same for anything of value.  Here I come back to the question of whether blockchain technology can enable an effective response to sextortion.  The question is a bit tricky, because it cannot be done directly.  But if you look into the characteristics of blockchain, they can be helpful in tracking, for example, the victims as well as the perpetrators of these crimes; on a blockchain-enabled system, you can always track the identity.

And this is why it's used very often in supply chains: you can track every single thing from the very beginning to the very end.  This is why it's extremely secure, in the sense that nothing gets hidden and nothing gets deleted.  In that sense, if you are talking about tracking the trail of a particular sequence of actions on a blockchain-enabled social network, then blockchain can fit perfectly, because no one can forge the data.  It's all protected in the system.
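
As a rough illustration of the tamper-evidence Walid is describing, here is a minimal hash-chain sketch.  It is an editorial example, not a production blockchain (there is no consensus, signing or networking), and the record fields are invented for illustration.

```python
# Minimal hash-chain sketch of blockchain-style tamper-evidence.
import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    index: int
    data: dict       # e.g. an audit record of a report or takedown
    prev_hash: str

    def digest(self) -> str:
        payload = json.dumps(
            {"index": self.index, "data": self.data, "prev": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list[Block], data: dict) -> None:
    prev = chain[-1].digest() if chain else "0" * 64
    chain.append(Block(index=len(chain), data=data, prev_hash=prev))

def verify(chain: list[Block]) -> bool:
    """Editing any earlier block breaks every later prev_hash link."""
    return all(chain[i].prev_hash == chain[i - 1].digest()
               for i in range(1, len(chain)))

chain: list[Block] = []
append(chain, {"action": "image hash registered", "case": 1})
append(chain, {"action": "takedown logged", "case": 1})
assert verify(chain)
chain[0].data["action"] = "tampered"  # a retroactive edit...
assert not verify(chain)              # ...is immediately detectable
```

This is the property that makes an audit trail of actions hard to falsify; as Walid notes next, the same openness cuts both ways when users are anonymous.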

However, on the flip side, blockchains, and particularly what we call public blockchains such as Bitcoin, can allow anonymity, meaning you do not necessarily have to give your identity in order to use the blockchain, and that leads to the possibility of getting away with crime.  So you see, there's a flip side to it, depending on what you are dealing with.

If you have heard of the ransomware attacks, many of them led to people sending bitcoins to the perpetrators to have their data re-enabled.  So it is a double-edged sword; like anything, you can use it for good or bad.  Depending on the characteristics you are looking for in the blockchain, you can probably customize it towards making something possible in this regard.  So I hope that makes sense, and I will be happy to discuss further.

>> CLAUDIO LUCENA: Thank you; Claudio Lucena.  I would like to thank you again, Catherine, and your organizers for having me on the panel, and I congratulate you on your efforts of cooperation with Europol; that is important in every case, but for this particular tragedy, to use the term I used in Guadalajara, it's particularly important.  I'm not a criminal lawyer; I do not work with crime.  I do work, though, with law enforcement and AI issues, so that's maybe the reason why I'm here today.

I'm glad to see the organization of the panel.  I remember that in Guadalajara last year we saw very promising ongoing research by Nicholas Bereta, and I think it would be good to go back to the results, if Nico has moved ahead with the research, so we can add the final report here.  Because I do think, as I told you yesterday, that this should be, must be, an ongoing debate, particularly at the IGF, due to its most open multi-stakeholder participation.

I'm glad to see how the problem was framed in Guadalajara, and I remember that the idea of automated content recognition was already there at that time; we mentioned it specifically, Content ID.  To me, it's particularly interesting how fast, easy and streamlined content-recognition processes become when economic interests are at stake, and this is particularly true for intellectual property.

I understand that the patterns and standards of intellectual property are far more objective than what we are talking about when we refer to intimate images.  Now, Content ID was a private initiative, and we have a policy move through Article 13 of the copyright reform in the European Union.  As it stands now, and many people here could give you more information, Article 13 basically enables and stimulates, or indeed demands, content-recognition automation from platforms.  So this is continuously being deployed for the same economic reasons, and once again I do recognize that a song or a work of art is far less sensitive than what we are talking about here.  But the deployment of this technology is far easier for those objectives than for the much more serious problems we are talking about.

One of the criticisms of the adoption of Article 13's automated content recognition is the fear that it does not achieve a proper balance, which is a legitimate fear and a legitimate concern.  But once again, at this workshop, at this table, we are talking about something that is far more sensitive, and the evidence I bring you is that, from our perspective in Brazil, the Marco Civil law has framed an interesting way to remove content: one of the only exceptions in the Brazilian framework, which is a reasonable standpoint, is exactly for images of this intimate nature.

For these images, you do not need a court order: the platforms themselves, when notified, must take the content down.  So we have a framework that really could work here.  I do believe content recognition is a tool to be used here.  And since both our fellow from Facebook and our other fellow mentioned trust as a problem, I think there is a perfect fit when they talk about blockchain, because I can't think of anything clearer in a blockchain than the revolution in trust.  Whether it is people, institutions or platforms in any business, it's not a matter of personal trust; trust is built into the system itself.  If there is a problem of trust, blockchain is the technology that gets us closest to a nice trust solution.

So this is really food for thought; there's no more elaboration to it.  But when Facebook suggests uploading pictures, we have to recognize the sensitivity and the difficulty of this, because culturally and personally it's very hard to think of this as a solution.  I would like to underline, though, that there is a technical argument behind it.  I don't know if it's exactly the solution, but we could hash, we could extract something from the material that did not represent the material itself.  There's an underlying argument there.  It's the same principle applied to a far more controversial matter.  I will stop here so we can continue the debate.  Thank you very much, Catherine.
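
Claudio's technical argument, that what gets stored is a derived fingerprint rather than the material itself, can be shown in a couple of lines.  This editorial sketch uses a plain cryptographic hash; note that, unlike the perceptual hashes mentioned earlier, it matches exact copies only.

```python
# A hash is a fixed-size, one-way fingerprint of a file: it identifies
# the file without containing or revealing it.  Illustrative sketch only.
import hashlib

def fingerprint(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# The 64-character digest has the same length whether the input is one
# byte or a gigabyte, and it cannot be inverted back into the image, so
# a body could store and compare digests without ever holding the image.
# Caveat: SHA-256 matches exact bytes only; re-encoding a single pixel
# changes the digest, which is why matching systems prefer perceptual hashes.
```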

>> CATHERINE GARCIA VAN HOOGSTRATEN: Yes, thank you.

>> ARDA GERKENS: My name is Arda Gerkens; I'm the manager of the Dutch hotline and helpline.  There's a huge problem here; I don't think we have to elaborate on that.  And yes, I think we have technical solutions to contain the pictures.  We have technical solutions, and I must say that we're very happy with the step Facebook took to at least start blocking the re-upload of images flagged by users.  This is a system that already works for child sexual abuse material.  They had been hesitating to apply it here, because this is not child sexual abuse material, but in the end, when something is flagged and taken down, it's not very useful for it to be re-uploaded again.  So that's a good step.

And I can also understand the next step, saying: you can give me your pictures and I will just prevent them from being uploaded, which is a good step.  But I can also understand, even more, that the first thing I would hear people say is: oh no, I'm not going to give my pictures to Facebook!  No way!  Especially with what is happening now in the Netherlands, something we call exposure, which is happening to a lot of mainly girls from the Islamic community: pictures of girls being spread across the Internet with explicit text underneath.  They will never want to share that.  I absolutely agree with you; I think we should move this to the next phase.

We all know we have the techniques.  We need a neutral body, you know, one you can trust, which, maybe it needs some money from you, but does not work from a commercial perspective.  Maybe university-based, maybe police-based, or maybe all of us together, multi-stakeholder, at work here.  But we need to do something, and this is what I really want to cry out for.

We spoke about this last year.  The technique is there, and it's exactly as you are saying: for intellectual property, of course, it's all out there; everybody does it, whatever happens, because the money is there.

What we are struggling with is money, and how to unite and do something.  My question to you is: how can we get this established?

>> CATHERINE GARCIA VAN HOOGSTRATEN: Thank you to you all.  Now we have the second block of questions, which concerns the definition we addressed in the introduction to this panel: sexual coercion and extortion, and how we can frame these recent dynamics.  There is obviously a rising trend that consists of offenders targeting young people online, as we also saw in the Europol video, through social media platforms, dating sites and so on.

Once the material for coercion or extortion, the video or photograph of the young person, has been obtained, the extortion or coercion begins.

Grooming can take several months, and the influence turns into rapid escalation once the young person has been persuaded to send the first sexual images of him or herself.

So here it is still not clear what the matter is, and it has not yet been addressed by several technical communities, or even by the academics studying online harassment.

Lately, the Council of Europe working group on the convention reported in 2016 that the phrase sexual coercion and extortion includes the taking of money and property, but also extends to other procurements.

So the question that arises is: if we are using and discussing blockchain, automation or any other sort of technology, for instance the machine learning tools currently used by Facebook, do those tools have to understand what sextortion is about?  That's the second question to discuss.  Maybe you can start.

>> KARUNA NAIN: Before I dive in, I want to talk about how we take this to the next step and why we did the pilot with Australia.  I know one approach won't cut across cultures; that's why we need strong partners on the ground.  We work with the Cyber Civil Rights Initiative, and we worked with an organization in Canada called YWCA Canada.  A solution does not always translate locally, so we have to take baby steps to figure out how to get this done.  We need to address how we can stop it even before it's shared on our platform, because that causes too much distress to people.

In terms of why we don't just have a process in place where people can upload an image, a hash gets generated and we use the hash to stop further sharing: people could misuse that to censor some speech.  How do we know that the image really does violate our NCII policies?  It's super, super complex.  In many of the discussions we have been having, we have been trying to work through this to see how we get to stopping that first sharing, as I will call it.

In terms of definitions, I will talk practically about one of the big challenges for us as platforms when content is shared on our platform.  Most often these perpetrators don't commit this act on just one platform: they start a conversation here and take it there, and the people who review certain content lack those threads, the additional context of what's going on.  That's where one of the big challenges lies for us.  Even if you have the definition that sextortion is coercion, asking for financial remuneration, asking for illicit photos, asking for whatever it may be, how do we connect the threads at the back end when we don't have the full picture?  So we do have to rely on people to come out and report, to tell us that this is happening.  And how do we again build trust in people to report, to go out and reach out, if not to us then to law enforcement, to their friends and family?  How we destigmatize this and get that conversation going in communities is a huge, huge point of focus for us right now.  I will stop there.

>> CATHERINE GARCIA VAN HOOGSTRATEN: We have not yet given the floor to the on-site participants.  If anyone has any reaction to the first or the second question, please raise your hands, whether in agreement or disagreement, and if you have any extra point to make, please do.  We have one.

Two.  Three.  So we will go in that order.

>> PARTICIPANT: I'm with the Digital Opportunities Foundation in Germany.  Answering directly the question of whether we need a clear definition of what sextortion content is: I think it's always necessary to consider the context of the content; contextualization of the content is very important.  You can have a perfectly innocent image that is put into a sexualizing context, and then it might be used for sextortion.  I think that's very important to bear in mind.  Machine learning tools can provide a lot of support there, but then you need to have both the context and the image.

>> CATHERINE GARCIA VAN HOOGSTRATEN: The second question, please.

Yeah.

>> PARTICIPANT: I'm also from Brazil and, like you, I work with women who have gone through exposure online.  I have some different experience; we can talk about it later.  One of the things I wanted to say is that I do believe machine learning tools, blockchain and certain other technologies can help prevent these sorts of situations, but one of the things that always gets me thinking is: what about the possibility of screenshotting something?  This is always a possibility: you can flag an image on Facebook and take it down, but somebody has already captured it and it's on Twitter, and you have no control over Twitter.  Twitter is used a lot for that, especially with the quickness of the hashtags one can use.

So is there a possibility of creating shared mechanisms?  A cloud hash-sharing mechanism was mentioned somewhere here; I wanted to hear more about it, and whether it's possible to create mechanisms that would, I don't know, make the information disappear at some point, or make it impossible to capture.  Of course, you could always tape the screen with a video camera; there are always things people can do, but we can make it harder for them to do it.  Thank you.

>> CATHERINE GARCIA VAN HOOGSTRATEN: And the last question or reaction.

>> PARTICIPANT: Yes, I'm Natalie from Hong Kong, and I'm a young person, a student, using Facebook and social media, like the users or even the victims.  Regarding the first question, this has been really insightful, but I have two concerns about making good use of those technologies.  Firstly, your policy assumes the reported posts are crimes, but there's a loophole: what if they are not crimes and are just, like, misconceptions or normal criticism between people?  I don't see how we can make sure that it really is a sexual issue, so that we can address the problem and its threats and barriers for users.  How can we strike a balance between freedom of speech and expression and the protection of different victims, as everything here is, like, crucial?

Another privacy issue that stirred in my mind is that staff from Facebook can view those sensitive pictures or information, and how much data is kept and who can access it will be controversial issues.  How can we make sure that everyone is comfortable with that, so that victims are able and willing to seek help?

Thank you.

>> CATHERINE GARCIA VAN HOOGSTRATEN: No more reactions or questions?  We have the last one.

>> PARTICIPANT: Hi, I want to respond to some things that were said about automated blocking; Article 13 of the copyright reform was mentioned, also for copyright issues.  I think there are problems with automated blocking, and I want to warn about them: can a machine really get the context of content?  With copyright it could be parody or other things that a machine can't really get.

Here the context was mentioned: maybe it's an innocent picture in some context, and in other contexts it's sextortion.  Yeah.  So there needs to be a human element in there.  Also, even if the big platforms are able to do the Content ID things, smaller platforms would have a hard time also having these Content ID systems, especially if they become mandatory.  And I think the censorship argument around all this blocking is also something to think about.  Thank you.

>> ARDA GERKENS: I would like to go into that.  The technique of hashing and preventing upload exists for child sexual abuse material.  The discussion there is: how can I make sure this picture is illegal?  For child sexual abuse, in the INHOPE system, images are analyzed against rules that everyone agrees on; this is what we call baseline material.  There is different legislation in different countries, and it also matters that in some countries what is called child erotica is illegal, and in other countries it is not.  But this is the fine-tuning; on the majority of material we agree.  We can't just get it off the system ourselves, but we could use techniques, and in some countries they can: the hashes go out onto the Internet, they scrape whatever matching pictures are out there, and they get them offline.  I think you should realize, when you make such a system, that in this case it's not about child sexual abuse.  Oh, we still have half an hour.

With these images, it's a person who says: listen, this is my picture and my video, and I don't want it to be online.  So it's just like intellectual property: please take it down, it's my property.  Then you also have the discussion, you know: what if it's a parody, what if it's this or that?  I'm also a politician; for me that might pass, because I'm a public person, but for most of you here, it's not supposed to be used if you don't want it.  I think what you would need is an independent body that looks at those questions, like: can you please make sure this picture is not uploaded, and do you have the legal rights to ask for that?

So yes, I think it should be really independent, something everybody trusts.  But we are all here with multi-stakeholder questions, and you need to accept that something will go wrong with the pictures at some point in time; it always will.  Striking the balance is really quite important, though.  This is really a big issue; people are harming themselves over pictures.  The last thing I wanted to say concerns a really difficult situation: what if the picture is not illegal, but within its context it is harming or harmful?  This is what exposure is about.  Some girls have a normal picture posted with a phone number: call her, she wants to do I don't know how many guys.  And within Islamic culture, you know, your whole credibility is gone.

I don't think we can tackle that with any system, but, yeah, we need to realize that this is another aspect we need to tackle.

>> CATHERINE GARCIA VAN HOOGSTRATEN: Do you want to react?

>> RENATA AQUINO: I'm in blockchain as well, but I also work at the frontier of blockchain and gender.  I'm in the Women and Blockchain community, for social impact and social good.  We had the marathon, the hackathon for social impact, in New York, with 1,500 participants or so and several solutions using blockchain as a technology to prevent crime in several ways, and one of the aspects was sextortion.  It always seems incredible to me how we attack the problem instead of tackling the root of all evil, I would say, which is our very own online behavior.

When Facebook makes this invention, this innovation, trying to find ways to remove content from the platform, I'm quite happy about it, because I do believe the human element is important.  But the life of a content moderator in a social media company is horrible!  There are cases upon cases of people falling apart from the psychological trauma of removing images online.  So it is very important that we tackle this with technology too.  However, the reason we are having this multi-stakeholder dialogue is that the private sector needs help to create such technologies.

It is controversial to have a database of images for Facebook to sort out by itself: what is the local context of someone losing their credibility over their image?  This is not even something that governments can do; how would a company do it?  And on the other hand, we blockchain developers, authors of content, civil society organizations, we're not perfect ourselves.  We still have a lot to study about how to tackle this, but there is no quick fix.  I will come back to what Semanur just said: techno-fixes do not fit; Band-Aids will not make this go away.  It is about education, and it is about long-term dialogue and programs, to make this work.

>> CATHERINE GARCIA VAN HOOGSTRATEN: Okay.  Do you have a reaction?

>> KARUNA NAIN: I just wanted to address a couple of the themes that came out, and then I know you want to add something as well.  You brought up some really, really important points, and I think it does take a village to try to solve this.

It won't be easy.  I mentioned this before: partnerships are super, super critical.  Partners know the experiences of the people who are going through this.  We need to do more research; a lot of anecdotal data is out there, but we need to know what the experiences are and what would really help, and find a solution for it.

We talked about the mental health of moderators, and I'm so glad you brought that up.  This is probably one of the hardest jobs today.  I have a lot of admiration for my colleagues who do this kind of work.  There are industry best practices; we have been focused on this for some time.  It's something we have to focus on, and we have some great best practices out there.

I'm not sure how much we want to dive into that, but they cover things like the number of hours a reviewer should spend, and coping mechanisms like the clothes you wear on the job: before you go home you change, so you have a designated set of clothes that you associate with the work, and then you switch off and go home.  Or making sure people have the support, the counseling services, they need to process these experiences.  It's super traumatic.  Thank you for shedding light on that.

>> SEMANUR KARAMAN: I'm just going to take us back to basics and try to broaden the conversation beyond one social media platform, because this is not just about Facebook.  For the research I'm trying to finalize at the moment, I'm reviewing input from 50 politically active women, a lot of whom are high-profile cases that we all know.  I asked them a simple question: when you reported X, Y, Z, did the social media platform come back and ask you, what are your needs, what are the policy solutions?  And they all said no.  So we cannot solve this with a group of coders and engineers, the majority of whom are men who never experienced this kind of violence, sitting in a room without consulting the people who are at the receiving end of the violence.

I know the problem is very complicated, but it's not as complicated as it seems; it's that we haven't done the proper consultations.  The partnerships should not only be with organizations that work on cybersecurity or privacy, but with the very women, transgender and gender non-conforming individuals, and men, yes, the people whose voices have been invisibilized in this process.

>> So that's why we have question number three, actually.

>> CATHERINE GARCIA VAN HOOGSTRATEN: Building on the comments and remarks so far: indeed, there are some blind spots, mainly involving the urgent social and ethical issues raised by the use of artificial intelligence or the Internet of Things, as we have all stressed.  That obviously has an impact on important and crucial topics such as equality, autonomy, human rights, human dignity and responsibility, and therefore on the balance of power between consumers and businesses, or between members of the public and government bodies.  We consider users to be co-creators.  So that's why we posed the following questions: can users participate in content moderation, and do we need an accountability-by-design approach?  I will be more firm with the on-site participants, and I haven't heard anything from online, so on-site participants have priority to react, to agree or disagree with the two questions posed here.  Any point of convergence?

>> PARTICIPANT: I'm from the Oxford Internet Institute.  The use of users themselves is an interesting kind of addition to content moderation design, and there are platforms around the world that already do this to an extent.  One example is in China, where they have a system that allows users to flag content and participate in adjudicating whether it is offensive or not, and this is the kind of content we are talking about now: stuff that sits on the border between what is illegal or a violation of user terms and what is just reprehensible, the kind of thing you don't want going on on social media.  Essentially, users are brought into the fold and vote on whether a piece of content should be removed or not.

But I would be interested to hear what exactly you think this would look like in the case of the images we are talking about here, because are you going to put an image in front of other users and ask them to judge whether it should be seen or not?  That seems counterintuitive in many ways.  I just wanted to throw that out as one way of bringing in users in an age of increasing automation when it comes to content moderation.

Thanks.

>> RENATA AQUINO: So I have a reply for you; Renata, from the Women and Blockchain community.  This is something we discuss a lot: how can we use not only blockchain but technology in general to create social impact and social good, and how do we deal with counterintuitive paradigms and the recognition of images?  One of the simplest issues, the same as with any other technology, is that users are never consulted.

As much as we have human-computer interaction expertise, it's cheaper and faster to build a quick fix, or a blockchain app that, I won't even go into automation, takes actions in bulk, than to do case studies.  One of the things about Article 13 is: where are the case studies?  Where is the measure of its impact?  A piece of legislation is moving on without studies showing that what it advocates is useful.  Where are the studies that make this come to life?  So this is amazing: we have to redesign the whole process, and this comes straight from production engineering: you have to start with the users.

>> ARDA GERKENS: I guess with the spreading or the leaking of images, the problem only exists because other users spread them.  So now in the Netherlands we have started a campaign saying: listen, keep it to yourself, literally.  If you get this picture from somebody else, stop spreading it.  The power to stop this process is within the community itself.

And we need to educate people to just stop spreading these images and stop feeding this type of sextortion.

>> Can I play the devil's advocate a bit?  Because I'm an anthropologist.  You said: we design things and we don't know what people are going to do with them.

That's very true, and I find interesting uses of technology during fieldwork.  One of the things that really bothers me in settings like this, where activists are together, is when they say it's about education.  As a social anthropologist I would say we can't change culture in that sense.  We do shape social context, but we don't do it exactly the way we want to.  So of course it's important to have activism and educational purposes, but it's really important to understand that this is the hardest thing to change.  Another point about sensitive content that I found during my fieldwork: we assume most of the time that we are talking about images, pictures, but sometimes we are talking about dialogues that are taken out of context.  So it goes with what she said, and what another person said, about context.

Sometimes, as you said about women who are very vocal online, people take the text or screenshot the dialogue, and that becomes a trigger for violence.  That poses another problem: how do we use cryptography and automated tools for that?

>> CLAUDIO LUCENA: I would like a minute to demystify solutionism, which Semanur brought up.  This is an offline problem; it has always been an offline problem.  What we have here is that technology scaled it, and now we are trying to find a way to de-scale it.  To the person who raised the issue about balance and freedom of information: in the actual state of the art, there is no solution without a human in the middle.  Automated tools are not going to work for everything.  As my fellow Brazilian noted, a screenshot, because of how a picture itself is defined, may cause the original picture to be blocked and the screenshot not to be.  And if we are talking about speech recognition, where the field has evolved incredibly, how do you filter that without a human in the middle?  By the sentences themselves?  What about Chinese, where a slight intonation changes the meaning?  We are always talking about a solution with a human in the middle.  We can make that role residual, so that the most evident violations are dealt with in an automated way.  And I want to lay the ground for the proposal of a blockchain structure, for example.

Social media platforms could comply with that, or search for the hashes we are talking about here, so they wouldn't have possession of the material.  It could be worked out, or proven wrong, but that's something we could do in this sense.
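
One way to read this proposal, a shared registry of fingerprints that platforms query without ever holding the underlying material, is sketched below.  This is an editorial illustration under stated assumptions: the registry class and its interface are invented, and a deployed system would use perceptual rather than exact hashes and could anchor the registry's audit log in a blockchain, as discussed earlier.

```python
# Hypothetical sketch of a shared hash registry queried by platforms.
# Platforms handle only digests; the registry never stores images.
import hashlib

class HashRegistry:
    """Stand-in for an independent, trusted (possibly blockchain-backed) body."""

    def __init__(self) -> None:
        self._digests: set[str] = set()

    def register(self, digest: str) -> None:
        # The independent body verifies the victim's claim, then records
        # only the digest of the material.
        self._digests.add(digest)

    def contains(self, digest: str) -> bool:
        return digest in self._digests

def digest_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

registry = HashRegistry()
registry.register(digest_of(b"<bytes of a verified, reported image>"))

def allow_upload(image_bytes: bytes) -> bool:
    # A platform's upload pipeline consults the registry by digest only.
    return not registry.contains(digest_of(image_bytes))
```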

>> CATHERINE GARCIA VAN HOOGSTRATEN: So thank you.  To wrap up, the fourth and last question, and here once again we invite all the on-site participants to react, and of course our discussants: to avoid deepening the divide between those who could benefit from the new technologies, all the ones we are mentioning, blockchain, algorithms and so on, and those who are unable to access them, do the policies on new technologies need to include provisions on access?  Any reaction, with convergence or divergence, from the on-site participants or our discussants?

>> PARTICIPANT: Hi.  I would like to go back to the law enforcement scenario and the increasing use of data in law enforcement; we are reaching a crossroads.  I think it's completely legitimate to have campaigns and to try to reduce the staggering numbers on the rise in the use of these images, but a lot was said here about the challenges around freedom of expression, privacy and dialogue.  I think one of the things we need to discuss, with the private sector as an ally, is dealing with data.  Yes, there was a case, I think, in Brazil where a mobile screen was filmed to prove a dialogue.

>> Yes.  Yes.

>> PANELIST:  People have no idea how far their data goes, and platforms are amassing a massive amount of data.  What we really should do to tackle this, again, would be case-controlled studies.  So I'm really posing the problem here: should we take a step back, instead of going out and prosecuting all these people, and try to ascertain the best procedure before going back to this?  Just reacting.

>> PANELIST:  I'm not sure how we got to that point about law enforcement.

My position is that, yes, we live in a new digital world.  Some things that are sensitive should not be put online.  You can mitigate the issue, but you are entering new dimensions where you don't know the rules, and then people make mistakes.  You have organized criminal groups taking advantage of this lack of understanding of the technology and of its impact on your life.  I think you should accept the fact that once you share images online, you lose control.  So I think you should mitigate the production of this type of images.

>> I think what you say is interesting, but coming from another perspective: the more you tell people what they can or shouldn't do, the more appealing it becomes.  Perhaps it's me being an anthropologist.  Whenever I saw pictures online, and I'm over 30, there were things that were incredibly embarrassing or that played with what is dangerous.  In human behavior there is a fine line around danger.  One, they may want to post it more, because they do; it's true, believe me.  Two, we post things in different contexts, and I think it's important to talk about this.  But I found that the more you tell people, the more we sound like moralists when we talk about the dangers of the Internet and platforms, because sometimes that is exactly what they are looking for: to be avant-garde and very adventurous.

>> ARDA GERKENS:  Yes, I want to add to that.  First of all, I believe it's good to educate, and I have children myself.  I know that if I tell them not to do something, it's not that they won't go out and do it.  It's good to educate and to tell them, but please don't blame the victim.  We should stop doing that, because that is exactly what this whole sextortion is driven upon.

Also, we should realize that there are pictures that are intentionally put online, there are pictures stolen from someone's phone, and, this is the era we are heading for, there are pictures that will be forged, made with the new technologies that we have.  Photoshopping is now so good that someone who wants to extort somebody no longer needs to seduce them into producing images.  We are heading for a big problem, even bigger than it is now.  I think it would be good if we could pool our strengths, because we can argue along these lines or those, but we all agree we need to do something.  Who will take the initiative?  I think we need to move this forward.  We do not want to sit here next year saying: yeah, that was a really good idea we had last year.  So this is my question to you: how can we scale this idea up, to make sure we have an independent body that will look into this, maybe get funding, and look into the techniques that are there, and who is... yeah.

>> PANELIST:  I was going to say an independent body or structure, in case we find a way to move on with the blockchain, right?

>> WALID AL SAGAF: Yes, this will be my last intervention; we have no time.  Walid again.  This question is particularly relevant if you add the word usability to security.  As we heard earlier, mobile devices get hacked into; there are weak passwords and unencrypted traffic data flows happening.  If you tell the user they need a password that's 12 letters long plus digits, that is what we often call unusable security.  When platforms introduce such good guidelines but don't take into account the user's perspective, how easy and usable it is, they are doing virtually nothing; it's just handing over a manual to follow.  So that needs to be enhanced, and I would like to say there need to be provisions on accountability.

At the end of the day, even if you have these measures, if you don't have accountability in the provisions themselves, consequences for leaking data, think of the NSA developments that have happened, then this can be repeated over and over again.

>> KARUNA NAIN: You brought up some pretty important points.  One of them is making sure we are speaking to the affected groups and to the organizations, and I assure you we are not thinking in isolation; that consultation is an important process that happens.  The second piece of it is technology and security.  You know, we talked before about PhotoDNA: I don't know of a single case where anyone has managed to reverse the hash to get back to the image.  It's impossible.  So there are really strong technology solutions out there.

You know, beyond technology, I will call them people solutions, because it is people who have the answers.  They are going through these experiences, and the people immediately around them know who is dealing with these women who are coming to them and telling them about these challenges.  I think we can get to a solution.  We need to put this together and come to that solution.  So...

>> CLAUDIO LUCENA: Just a few seconds to build upon a very useful insight about users not being consulted and being driven out of this process.

And upon your insight about young people, because different generations use these platforms differently.  This is interesting, and the problem is very serious; but worse, it's changing.  It changes from generation to generation; they are using the platforms in another way.  So the problem is not only difficult, it's hard to tackle from an evolutionary perspective as well.  Thank you very much.

>> CATHERINE GARCIA VAN HOOGSTRATEN: Does anybody else have a question?  We have only two minutes left.

>> PANELIST:  Thinking more pragmatically, I think we need more psychological, sociological and anthropological work on how people use online social platforms and why they use them, taking into consideration cultural differences, age, gender, sexuality, class, race and religion.  We have to do more of that.  Very recently, Professor Daniel Miller at University College London coordinated worldwide research called Why We Post.  It's all free online, and it's very interesting; I think there are 11 books on how, and why, people in different communities and places use social media platforms.  So it's called "Why We Post."  It's just an idea, and I think we should do more of that, especially companies and the state.

>> CATHERINE GARCIA VAN HOOGSTRATEN: Thank you very much.  With all of that said, we will close down the discussion.  There are no more reactions.  Oh, I'm sorry.

>> RENATA AQUINO: So, yeah, I would like to give my response on the law enforcement perspective.  I think we had a pretty good dialogue, because it's exactly what we have been discussing: we need, yes, people solutions and technological solutions to come together.  And I think we need to tackle other issues too, and I encourage looking at this at the IGF.  For example, in the same way that we have people building businesses out of others' images, we have content farms, and the threats these content farms bring to democracy in other respects as well.

And most importantly, just to sum up and clarify this: the Women in Blockchain group, for example, originated from the simple observation that most of the blockchain market is male.  At every conference we would go to there were huge manels, and in every company you would see very few women developers.  This group originated almost as an impromptu action.  But there are actions that need longer-term reflection, and those would make a huge difference in this scenario.  So it has been a great discussion, and I hope it continues.  Thank you.

>> CATHERINE GARCIA VAN HOOGSTRATEN: Thank you, Renata, and thank you all for your attention, your collaboration and your participation here.  We will process the outcomes soon after the IGF, so please stay posted on that.  Thank you very much.

(Applause).

(End of session)