IGF 2017 - Day 2 - Room XXII - WS184 Surveillance from the Margins


The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 



>> ALEXANDRINE PIRLOT DE CORBION: Okay.  Everyone.  We will get started.  There is a slight change from the program online to the program printed.  People thought it would start at 10:40, but we're starting a little ahead of 11:00.  My name is Alex.  We advocate and litigate to defend the right to privacy, working with an international network of partners.  I am happy to be moderating this session today with the distinguished experts sitting next to me, who will be exploring, through their different contexts and the work they're doing at the national level, what it means to be on the margins: who are the individuals who are subject to surveillance on the margins ‑‑ individuals who are in vulnerable positions or experiencing exclusion or stigma at an international or local level ‑‑ and how are they experiencing the different facets of surveillance that we're seeing across the world. 

So I'm going to hand over to David Kaye, the United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, and he'll help us frame the discussion in the next hour before we hear from our national experts.  Thank you, David. 

>> DAVID KAYE: Thank you, Nighat and Alex, and everyone, for being here.  I will only speak for a couple of minutes because there are great people on the panel who will talk about the impact of surveillance on national and local communities.  I think we tend very often to think about surveillance in sort of the large‑scale sense of how surveillance operates, what its purpose is, how it is connected to maybe large‑scale counterterrorism operations, and so forth.  One thing we have seen over the last several years is essentially a decentralization of the surveillance industry. 

Go back to 2013 and the time of the Snowden revelations.  Also, totally coincidentally, it is the time my predecessor, Frank LaRue, offered his report on freedom of expression.

If we go back to that time, much of the conversation about surveillance was essentially about the broad scale and broad scope of surveillance, particularly by the United States and its partners in intelligence operations, and the broad scope of surveillance operated to sweep in vast, vast amounts of data without respect to individual privacy, freedom of expression, freedom of opinion, and other fundamental rights.  I think one of the things that has changed over the last couple of years ‑‑ while that is still an ongoing conversation and an ongoing debate ‑‑ is that we have seen, particularly through the work of organizations like the Citizen Lab at the University of Toronto, exactly how surveillance has moved from the mass surveillance that Snowden disclosed, which again still exists, to highlight how individual companies around the world have made spyware and surveillance technology available to anyone who can afford it, and the price is dropping.

We have seen this in particular in places that are repressive, where the tools of surveillance are purchased from companies around the world, but in particular from Western Europe, and where the tools are being used to target the work of journalists, activists, or regular individuals who might be, for one reason or another, in a position of dissent from government.  I just spent about 10 days in Mexico, where the hacking and surveillance of mobile phones owned by journalists, by public officials, even by international actors was a real scandal.  And yet the possibility of this surveillance, and the lack of a regulatory environment for this type of surveillance, is more and more obvious, I think.

So I think it is useful ‑‑ at least for us ‑‑ and I think this panel will really focus in on how surveillance moved from this mass form of surveillance to very targeted surveillance that has a very direct impact on individual expression and on individual privacy. 

And I think that one of the questions we should be addressing is not just ‑‑ which I think we will get to here ‑‑ the question of who is impacted, how people are impacted, and what rights are influenced and undermined by this kind of very personal, direct, targeted surveillance, but also: what kind of regulatory mechanisms should we be talking about to deal with the spread of this kind of technology? 

And sort of a corollary to that question is whether it is even possible to restrict its spread.  Because obviously, it is the kind of technology that may not be subject easily to the kind of restriction that other forms of surveillance might be subjected to.  So I'll just close there.  I'm looking forward to hearing everybody's input, and also really encourage people to be thinking about these questions, not just of who's impacted, which I think this panel will get to, but what are the solutions we might be adopting to undermine it?  How can we do it at the local and national levels and what are the international tools we might have to deal with this kind of surveillance?  Thank you.

>> ALEXANDRINE PIRLOT DE CORBION: Thank you, David.  You have been working with others on connecting the work on different rights within the United Nations, which is something we need to do increasingly.  We can't talk about these rights in isolation; they're all interconnected.  It is great to see those initiatives as well.  Thank you. 

I will hand over to Joana Varon, who is a researcher, advocate, and the founder of Coding Rights in Brazil.  She will talk about the work they've been doing on mega events and the legacies of some of the surveillance technologies and systems that have been put in place in Brazil.  Thank you, Joana. 

>> JOANA VARON: Thanks, Alex.  Thanks, everyone, for the presentation.  I will show you quickly the work we have done at my organization.  This is the timeline in which we map the changes in the legal environment that allow us to be tracked, and how it changed after a series of mega events that were announced in the country. 

So here, you can see that before, we had the law on the interception of telecommunications, some regulations, and the Brazilian intelligence agency.

Then we had the announcement of the Olympics, the World Cup, and the Pan American Games.  And this is what happened right after.  In green, you can see the new institutions and the new legislation that were created.  The legislation allows other practices or other agencies to engage in surveillance activities, while the institutions that were created, broadly speaking, were mostly centers that would bring those agencies together, like the center of operations in Rio, but also the cybersecurity center and the integrated center for command and control.

So what happened is that before, we had different surveillance powers according to the agency.  ABIN, the Brazilian intelligence agency, had some powers, while the police and the judiciary were the ones with more surveillance powers, concerning access to user data, covert operations with undercover agents, and the interception of communications.  It was all separated.  But then, for the mega events, they were brought together in the centers for command and control.

And we have no clarity about how they're sharing the data, and they may have access to data that they shouldn't have access to. 

Then during those events and after, we could see the new legislation.  We had the legislation on antiterrorism that was pushed through for the World Cup.  But we also approved the legislation on criminal organizations.  This particular piece of law is being used to target social movements.  So we have cases in which people from the landless workers' movement have been targeted, and also organizations that defend human rights.  And to end my presentation and connect with what David was saying: through this research we could see a change in the legal and institutional scenario on one hand.

On the other hand, the country became a huge market for the surveillance industry.  Many of the fairs for that industry started to happen there; companies like Hacking Team went and sold their solutions, and agencies actually bought a lot of things.  We tried to use FOIA requests, but it is hard to get a sense of the materials, particularly those pertaining to surveillance equipment, because for those same events we also approved a law that exempts government officers from declaring that they have bought surveillance equipment.  So it is very hard to get a sense of what they have spent on.  But then, if we go to those cases and analyze the investigation practices, you see the patrolling that is being done through Facebook. 

So you go to the files of the process accusing protestors during the World Cup, and then here is another one on human rights defenders, and they include that they are monitoring their Facebook pages and requesting WhatsApp data.  There was a case in which the group monitoring the interception of communications got people's password for another social network and managed to enter.  So these are practices that are not properly regulated, and we never approved rules on why we have all this, which opens the door to new abuses.

We never approved a law on data protection.  So this legislation is happening on one hand.  On the other hand, we may have bought a lot of equipment that we don't know how it is being used.  And the third thing is the extensive monitoring of social networks and the requests for access to WhatsApp.  WhatsApp has been blocked in Brazil once because of that.  And to finish, now we have this new trend, the latest about fake news: our superior electoral court has assembled a committee that brings the army intelligence and the Brazilian intelligence agency together with that court to monitor fake news in the social networks. 

I think, in a country that has had a coup, adding this layer of monitoring of the elections, involving the army and the intelligence agencies, is another very harmful thing to think about in relation to surveillance. 

>> ALEXANDRINE PIRLOT DE CORBION: I think Joana's presentation on Brazil highlights the surveillance and the lack of transparency of the regulations, of what is in place in terms of a legal framework, and also the capabilities of government and who can then access them.  It is said to be to ensure national security and social order, but what happens afterward?  How do we work to dismantle such systems once they are in place?  There is little information available, including to rights organizations.  There are huge amounts of money being spent on this, so it is very unlikely that they would dismantle these systems.  What remains, and how is it used?  I think that is the example in terms of how this is now used to undermine other fundamental rights, including freedom of assembly and civic engagement generally, and when it comes to electoral processes as well, in terms of democracy.  As I was saying, we cannot speak about these rights in isolation.  Once one is infringed, it is likely it will infringe other fundamental rights.  Thank you, Joana, for that.

I will hand over to Nighat.  For four or five years, her organization has been working on these issues.  Nighat is a lawyer and human rights activist in Pakistan, and the Digital Rights Foundation has been pioneering this work in Pakistan, working at the intersection of Internet freedom, women, gender, and technology.  Nighat will share her experience.  Thank you.

>> NIGHAT DAD: Thank you, Alex.  I will talk about gender and surveillance, especially through technology, in Pakistan.  When we talk about surveillance, we talk about the larger issue, and we tend to forget the voices of the marginalized or vulnerable communities who are not even heard in places like these. 

After working for years in the area of digital rights, we realize that surveillance isn't an isolated issue anywhere in the world.  When we talk about data and big‑tech companies, we forget the communities that do not have a voice in this.  Then we add an extra layer of gender, and we see the minorities and marginalized groups and those who fear for their safety in the public domain.  Pakistan has been particularly insensitive toward gender and sexual minorities, along with other persecution that we frequently witness back home.

While we talk about surveillance, I'll talk about how culture and tradition play out in conservative societies like ours.  The surveillance starts in our homes: from when we're not allowed to lock our doors, to when a male family member takes our mobile phone or answers our calls because he, or the male elder of the family, claims that he is our guardian and should protect us at all costs.

The surveillance at home translates into larger social surveillance.  Our neighbors are keeping an eye on us.  Our relatives are dissecting our clothes and the people hanging out with us, calculating every minute we are not home. 

If I talk about the woman perspective of Pakistan, we normalized this behavior in our own head.  I have ‑‑ I'm sorry.  Yeah.  So ...

I'm sorry. 

In our society, a woman has always been given the rationale that they trust her, but don't trust the world.  So their own trust issues manifest on the woman and confine her within a glass ceiling.  The main idea in lobbying for the cybercrime law was the same: that we need to protect our daughters and sisters who are committing suicide because of online violence against them.  Now this law is enacted and mainly used to surveil and crack down on political dissent.  Surveillance at home translates into the larger social discourse and then onto the individual.  The abusive partner who controls your life, including your mobility and who you talk to, is as bad as a hacker who is filming you while you navigate your private space.  Social surveillance and state and nonstate surveillance are interconnected.  We talk about how there shouldn't be mass surveillance at the state and nonstate level and how it is illegal and whatnot, but it still happens.  And when it does happen, you don't know who is watching you or tapping your phone at any given moment, or who has access to your data at all times.

In Pakistan, it is a common occurrence that someone breaches the domain of your authority, contacts a woman, and moral‑polices her.  The idea that only a sister or mother should be protected is problematic, because it puts others at risk.  Power dynamics play a part.  This person has your personal information at their disposal.  You don't want to be rude to them, because social surveillance and your conditioned habit of being the bearer of the family's honor are at play here.  The lack of good governance and legislation makes things worse, because there is no data protection law, and the available laws legalize "reasonable" surveillance without defining "reasonable."

We as women activists and journalists are more vulnerable to the watchdogs moral‑policing women when we're at the forefront of a movement with no protection at all.  We like to defy all the power dynamics, but it becomes tricky when the power we're attempting to challenge is not just one person but a whole society that likes to keep a tight grip on how women are supposed to behave.  When we talk about the right to privacy, it is so important to look at the culture and the tradition we experience in our societies.

This culture, religion, and tradition basically play out in the making of legislation and laws in countries like Pakistan. 

>> (Not on microphone)


>> ALEXANDRINE PIRLOT DE CORBION: I don't know if anyone heard me.  I think the context in Pakistan shows that, when looking at social surveillance, we know the policymakers who are supposed to be drafting the laws that are meant to protect us are themselves members of that society.  So if we want better laws, we need to work at the individual level and the societal level to ensure that this is translated into legal and technical safeguards and enforced.

I think as David was saying earlier, there is a lot of focus on state surveillance and then we tend to focus less on social surveillance and things that are probably nearer to us for a lot of not just women, but different groups in society that are being marginalized and discriminated against.  Thank you, Nighat for that. 

I will hand over to Amalia Toledo, who works in Colombia.  She's a researcher at Karisma.  Previously, she was working on issues of freedom of expression and protection of journalists, which plays in well with her presentation today, where she's going to present some of the work that Karisma has been doing on women journalists and privacy.  Thank you, Amalia.

>> AMALIA TOLEDO: Thank you for the invitation, Alex.  Yes, I want to share an experience we had at Karisma.  In something we implemented three years ago, we wanted to understand how female journalists experience digital violence and digital surveillance, and to identify manifestations of misogyny on the Internet and its consequences.  In a country like Colombia, where journalism is still a high‑risk profession, threats and violence against journalists have been normalized, and it is not recognized when journalists are suffering gender‑based violence or gender‑based surveillance.  The threats faced by women journalists are often different from those faced by their male colleagues. 

In Colombia, surveillance and other activity against journalists are understood as part of the job.  The types of surveillance experienced by the women and men who practice the profession are not recognized or distinguished.  The reality is that there is a difference in the cost, the target, and the form that is used. 

Online surveillance against women journalists looks into the woman's personal and family relationships: who is her partner, whether she has kids, and so on.  In addition, it pays attention to her physical appearance and intellectual abilities, to be used against her.  But above all, it is very much sexualized.  The body is the weapon and the battlefield.  This surveillance, which most of the time takes the form of intimidation, is not based on an idea or argument, but rather on the fact that a woman speaks out and expresses her opinion in public spaces. 

We conducted several focus groups with female journalists from around the country.  The discussions we had with these groups showed that being a victim of these attacks, or just knowing a woman who has been, has an effect on their journalistic expression and on the woman's behavior on the Internet, particularly when it turns into self‑censorship.  Some of the consequences identified were that some women journalists decided to close accounts on social media or other public media, are more careful about what to say and what to publish, or have asked to change the areas they work on.  They prevent the public debate from becoming personal or violent, or they abandon journalism temporarily or permanently.  The fear that a threat will materialize is the engine of this behavior.  It causes high levels of stress and the loss of income or of their job.  This has the effect of reducing the representation of women in digital journalism.  Even more, it encourages self‑censorship: we will hear fewer female voices in journalism, and the participation of women is reduced to not‑so‑public spaces, in an era where technology is crucial for journalism. 

When we talk about how female journalists' experiences differ from their male counterparts', we can highlight that the aim is to generate anguish, damage the victim's self‑esteem, and generate fear about what can happen to her or her loved ones.  When surveillance is carried out on a female journalist, there is no doubt that gender stereotypes come into play. 

These are the stereotypes that perpetuate male domination and female subordination.  This also comes up when we look at what triggers most of the surveillance against female journalists.  There is a misconception that women are better at covering soft content, while male journalists are better at dealing with hard news: politics, the judiciary, armed conflict.  The moment a woman covers any of these hard topics, she becomes a target of a mostly male audience.  This is where their experiences are so different, from what triggers the surveillance or the attack to the forms it takes and its consequences.  What I have been talking about is just a small sample that barely touches the surface.  How can surveillance be used to compromise privacy?  How does surveillance from different actors affect women?  How can governments mainstream gender into surveillance practices and other concerns?  These are questions to be considered and confronted.  There is a need to understand them in order to face them and to provide better solutions to empower women and set essential standards. 

>> ALEXANDRINE PIRLOT DE CORBION: Thank you for sharing that perspective.  I think what the case in Colombia reflects is the importance of looking at the problem analysis.  There are different layers to what you are talking about.

It is not only being a journalist, but being a female journalist covering issues that may be seen as more controversial, and the implications vary.  The infringement on the right to privacy leads to other implications in terms of sexual harassment and other types of violence as well.  So all of the different contexts that were shared are very different, and it raises the question as to, you know, whether a one‑size‑fits‑all solution is possible, given how different every context is, the actors involved, and the targets of those violations or interferences with rights.  So what does that mean in practice, if we are going to be advocating for better safeguards at the different levels?

I know David has to go in a few minutes.  Before he goes, I wanted to see if he had final reflections on the three presentations? 

>> DAVID KAYE: I think I would say a couple of things.  One is that each of the presentations really demonstrated a kind of change in surveillance, and also a change in the way we should think about surveillance.  I think it is important ‑‑ certainly in a forum like the IGF, we're extremely focused on digital surveillance.  But digital surveillance, as Nighat's point made maybe most clearly, is a part of a broader set of tools.  There is personal surveillance, and there is surveillance by family, culture, or local government.  I think it is important for us to step back and put the digital surveillance many of us have been focusing on into that broader context.  Because actually dealing with and addressing the problem that I think everybody up here addressed requires not just digital solutions, but broader solutions that get at surveillance as a general matter, not just as a digital one. 

>> ALEXANDRINE PIRLOT DE CORBION: Thank you for those reflections.  We're going to actually open up to the floor now to give us an opportunity to have a discussion as well.  So if there are any questions for our four panelists, please switch on your microphone and maybe introduce which area you work in.  You don't have to give your name. 

People are being shy on the second.  Go ahead. 

>> QUESTION: Hello, Fabian from the copy fighters.  Have you been researching some about the new EU copyright reform?  The article 13 in that reform would require all platforms to have mandatory content filtering and I'm very concerned about that regarding the surveillance factor.  So ... your thoughts on that? 

>> ALEXANDRINE PIRLOT DE CORBION: Any of our panelists familiar?  You haven't been?  Has anyone from the audience have been looking so we can start maybe more of a conversation? 

>> DAVID KAYE: Something briefly.  I think you are right.  What I understand article 13 of the copyright directive to do is flip around our normal presumptions, in terms of content being uploaded first and then maybe being subject to some notice‑and‑take‑down process.  The way article 13 is drafted now is to flip that around and essentially require the platforms to prevent the availability of unlawful or violative content, in one way or another. 

So there are very serious problems with that, for the reasons you mention.  Although it is interesting to spell out how that has implications for surveillance.  It certainly has implications in intellectual property for how we think about ‑‑ you know, what has developed over the last 20 years or so of a vibrant notice‑and‑takedown process.  And what it means if platforms have a requirement to prevent the uploading, essentially, of content, and what it means to filter content in that way. 

I mean, I would be interested to hear more about how you think that has implications for surveillance.  But it certainly has implications for freedom of expression.  And maybe particularly artistic expression.  And if we go beyond that to thinking about how the use of filters and their development in one particular case might have a kind of spillover effect into other areas of online life. 

>> PANELIST: So yeah, regarding the filters, it is a form of surveilling.  You have to surveil everything uploaded to a platform.  Concerning minorities in this case: for me, coming from an L.G.B.T. community, this would actually not be something that we could implement, because we don't have the budget for that.  So as an L.G.B.T. online community, we would have to shut down.  This is a great problem for all minorities.  But, I mean, regarding the surveillance factor, filtering is a way of surveilling, so yeah. 

>> ALEXANDRINE PIRLOT DE CORBION: Joana, you wanted to respond? 

>> JOANA VARON: I am not familiar with the situation on copyright, but what I want to add is that I see now people are talking about AI, and there is a trend to connect this with filtering and surveilling.  The technology hasn't reached that stage, but even if it does, it is going to fail, because it will be programmed with values for blocking content that we already have problems with human beings moderating.  You know?  So be it for copyright purposes, be it for fake news purposes, be it for moral purposes, I see this trend as very dangerous.  We have had this debate on the responsibilities of intermediaries on the Internet for long years now.  And this trend of okay, now AI is going to solve everything, is like the most (?) thing we could think about. 

>> ALEXANDRINE PIRLOT DE CORBION: Yeah?  Question here in the front.

>> QUESTION: Yeah, powerful stories from different parts of the world.  My name is Johan, I am from Pakistan.  I represent Asha and (?).  There is no simple solution.  As David said, it is not a one‑size‑fits‑all sort of solution.  But we all know the problems.  The ladies have pointed out how severe the problems are.  Are we looking at any short‑term, midterm, or long‑term solutions to the problems?  Because, yes, talking about it is important, but looking at how we can address them seems to be something we should collectively be looking at.

>> ALEXANDRINE PIRLOT DE CORBION: Thank you for that question.  I will ask each of the panelists to talk about the work they're doing and dealing with the problems and integrating the findings into more concrete actions. 

>> AMALIA TOLEDO: In the case of Karisma, we are learning how to mainstream gender into our work, to make better assessments when we are looking into a new regulation or draft law.  You know, that is the first thing: to understand.  One part is just knowing how to do gender assessments.  But at the same time ‑‑ I forgot what I wanted to say. 

No, but at the same time we are trying to gather more evidence.  That is what I wanted to say.  We want to gather more evidence, because we don't have it.  As I say, this is just the surface.  We're working with 25 female journalists, so it is just a very small sample, and we need to gather more evidence, and somehow highlight it to the private sector, but also the public sector.  You know, this is what we have, and we have to base our decisions on the evidence.  Maybe we don't have the capacity to gather more evidence, but we can insist that others with more resources than us do something like that.  You know?  It is very difficult to have one solution, but we are looking for ways of helping to find new and more innovative solutions.

>> NIGHAT DAD: I think there are short‑term and long‑term solutions.  When we think about countries like ours, the long‑term solution is to break the stereotypes around social surveillance.  It will take years and years; even starting the discourse and discussion around that is something problematic for people to digest.  And we ‑‑ and you, as a very active stakeholder, know we have started this discussion in terms of what the legislation looks like, but also in the different communities, whether they are civil society or other organizations: how women experience the social spaces, and how to look into those experiences.  We have produced, like Karisma, research on women journalists' experiences of surveillance.  So I think more groundbreaking research is needed in that direction.  We have policy work too ‑‑ but even in homes and our circles.  Among families and human rights activists in Pakistan, when I talk about surveillance, human rights defenders have internalized it so much in their heads that they actually say that no matter what you do, the agencies or security agencies or the state will surveil them.  So they have already, you know, normalized the notion.  We need to challenge the discourse and the stereotypes, and then there are some solutions.  I also think that the demand for respecting one's privacy rights should come from the public, and that's something that we are lacking in Pakistan, because of the culture and tradition that we have. 

>> ALEXANDRINE PIRLOT DE CORBION: Thank you, Nighat.  Joana, if you want to mention a little bit about the work being done in Brazil.

>> JOANA VARON: So, sorry.  The work I mentioned was more about mapping the playing field, tracking the changes from the perspective of governmental surveillance, how it connects with the platforms, and how the surveillance tools go beyond just government tools.  But there is another work that we are doing on storytelling about surveillance capacities, not only from the state but from surveillance capitalism.  It is on another platform, and the outcome is a collection of stories about surveillance that is not mostly about state surveillance but about the platforms and the massive collection of data that their business models operate to surveil us. 

Naturally, those lead to a lot of stories on online gender violence and a lot of experiences like the ones mentioned by Nighat, or even the targeting of journalists.  So those two projects are projects in which we map the field.  We try to tell stories about how this is pervasive, how surveillance is not safe.  It is very obvious, I think, for the audience here, but beyond the Internet community there is a lot of work to do in terms of convincing people that privacy matters, and that the discourse of "you have nothing to hide" doesn't apply. 

So there is this part of the work that is about mapping and creating awareness, but then we do direct advocacy in the policymaking field, and also digital security training, particularly focused on women as well as L.G.B.T.Q.I. people.  Brazil has the highest rate of killings of trans people, so we focus a lot on that community.

>> ALEXANDRINE PIRLOT DE CORBION: Before taking the next question, I want to reflect on what has already been said and share responses from the international work that we do.  One thing that keeps coming up is narratives and discourses.  As we proceed with the work we are already doing, we need to challenge those dominant narratives, use our own terms, and stop using the terms imposed on us by companies or governments; we should define things in our own terms.  And not only at the national level, because there are local realities to be reflected.

We need to rewrite the narrative and use it for our own benefit to advance our missions and values.  The other point that was mentioned here as well ‑‑ I was on a panel yesterday with ‑‑ around research on perceptions of privacy in eight African countries, which found low levels of awareness and of people being concerned about privacy.  For me, it was not surprising to see these low percentages, because they reflect a lack of transparency as to what surveillance practices and policies are actually in place.  If you don't know how your right to privacy can be interfered with, maybe you are not so concerned that it is being interfered with, because you don't know it is happening. 

It is all operating a little bit in the shadows, with no accountability for how these practices are being undertaken.  Unless we can break this cycle and provide more information to the public as to what is being done, supposedly in the name of national security or security more broadly, it is going to be very hard for people to be aware of how it impacts them.  So I think that is very important.  Any other questions? 

>> QUESTION: Just to add on to what I was asking in the conversation: for people who are activists or involved in this, it becomes worse because there are not enough voices talking about it.  So perhaps a way needs to be created, at university or school level, where kids are made aware of the fact that their privacy can be compromised and that surveillance is going on constantly.  They need to speak up about it.  The more voices involved in the conversation, the more chance that policymakers will pay attention, especially with the voting age being 18 in places like Pakistan, where parties want that vote, and with elections happening a few months from now.  This is the right time for us all to get involved and see if we can get young people into the conversation. 

>> ALEXANDRINE PIRLOT DE CORBION: I completely agree.  One thing our partners are doing, and we are trying to do, is not just work in our own chamber: on some of the issues mentioned ‑‑ social justice, discrimination, the rights of ethnic minorities, civic engagement, democracy ‑‑ we need to be liaising with those civil society organizations and the stakeholders engaged in those different fields, making the connections between the rights we are fighting for, the rights they are fighting for, and the communities they are representing.  Thank you. 

Any other comments?  A question in the back?  Thank you. 

>> QUESTION: Hello.  We mentioned the importance of hacking technologies and intrusion software and the hacking of mobile phones.  I understood that there was one legal avenue to deal with the export of this kind of technology from Western Europe.  I would like to know your comments on the advancement of the Wassenaar Arrangement; I know your organization is quite active in this avenue.  I would like to have your comments on this. 

>> PANELIST: It is an area we have been looking at for many years.  The problem we have identified is that the whole surveillance industry is operating in the shadows.  Unlike some other sectors covered by business and human rights standards, the surveillance industry has not been part of those discussions.  That is one area we are trying to look at.  The Wassenaar Arrangement is one of the avenues.  Having said that, it is a non‑legally binding agreement, so there is more that needs to be done around it.  In terms of the work that we are doing, we need to be identifying who is purchasing and who is selling.  Even though there is much more information than ever before, that is still not being done openly, particularly in terms of who is buying.  There are a few cases where we think it is the government that is buying, but it could also be a local authority; it depends on who the adversary is in the context.  That is an avenue we are exploring beyond all the work we are doing on human rights mechanisms.  We are also trying to look at the surveillance industry itself, to understand who the actors are and where there is leverage to regulate the technologies being sold, particularly when they are sold to governments where there is no rule of law and no legal framework to ensure how they will be used.  Some of these technologies have dual uses: they can be used for lawful surveillance within a very restricted legal framework, but we know they are being used to target various groups in society that should not be subject to surveillance.  That is where it becomes problematic. 

>> ALEXANDRINE PIRLOT DE CORBION: Okay.  I don't see any more questions from the audience.  Maybe to wrap up, I would ask the three speakers to conclude with a few comments on the change they want to see next in support of the efforts of each of their organizations. 

>> NIGHAT DAD: I want to see the voices that are not being heard ‑‑ the (?) of digital rights, those who are not here ‑‑ and I hope this gives them a space to voice their experiences.  And also, I feel that even organizations working on digital rights are still learning.  We are learning how to respond to the challenges and issues.  So we at the Digital Rights Foundation are trying to create and establish more evidence‑based research, but also, like Joana said, reaching out to schools and universities.  Because when you are talking about changing society, you really need to start from the beginning.  Starting from schools and universities, our educational institutions could change the mind‑set that has been there for so long.  It is not something that happens overnight or in a couple of years; it will take a decade to challenge, because it is so integrated in our society.


>> JOANA VARON: Yeah, I totally agree with Nighat.  That is why I am investing a lot in those stories, the storytelling: taking the dating apps, analyzing the privacy settings, telling stories about how women have been harassed by someone they did not match with, because the databases are integrated with Facebook, and Facebook then connects them by suggesting they talk.  That is a way to jump outside the consent that was embedded in the platform and harass the person on another platform, due to this integration of databases. 

Or cases where it was used to attack the L.G.B.T. community, also.  So we bring up these issues because "surveillance" is a scary word, a word that doesn't relate to everyday activities.  And the fact is that, given the interactions we are having with our digital devices, and how the industry was built on data, most of the time we are all carrying our surveillance devices everywhere, no?  People don't relate to that, because in their understanding, surveillance relates more to activists.  You know? 

So changing the narratives is one part of the mandate (?).  And I think that could help us evolve in our strategies.


>> AMALIA TOLEDO: I don't know what else I can say after Joana and Nighat.  But we need more voices, more people working on this.  The private sector, definitely, could gain a lot by including way more diversity in its teams, from decision‑making to those who are developing the technologies.  Governments, too, could be more inclusive and more diverse in thinking about the problems and the solutions, and not try to restrict our rights, you know?  And from our side, as mentioned before, we need to keep learning how to communicate all these issues more effectively and simply, so we have more allies from grassroots organizations and collectives. 

>> ALEXANDRINE PIRLOT DE CORBION: Okay.  Unless there are final comments from the audience, we're going to conclude this session.

In terms of what has been discussed, and maybe some take-aways for all of us as we each go back to working on our different mandates and with our own principles and values, I think the intersectionality between all of these issues can't be ignored.  Over the years, it has been good to see, within this forum but also others, those working on digital and online rights interacting with other organizations working on other emerging issues, be it gender issues, the rights of minorities, social justice, discrimination, civic engagement, and anything related to engaging in democratic processes.  So moving forward, we need to have more of these exchanges and look for solutions together.  Then, if a new bill comes out ‑‑ on data protection, interception of communications, whatever it is ‑‑ we don't just have privacy advocates there advocating, but also the other groups that will be impacted by the expansive provisions included in such laws.

I think an important take-away as well is telling our story.  We tend to all be here at IGF speaking within our echo chamber; we need to reach out to others to share the work we're doing and our concerns, and see if there are commonalities to work on together. 

So thank you very much to our three panelists and David Kaye who joined us earlier on the panel.  And thank you, all of you, for your presence and attention.  Yeah, thank you.  Have a good day. 


(Session concluded 12:00 p.m. CET)