The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
>> I think that is actually true. There is an issue with that, and I think one of the things, too, when Christine is talking about the FIRST community: imagine you find pockets. You find pockets of people who work together well, who trust each other, and that's not necessarily the whole community, but you have that opportunity to create those alliances, and that may be exactly the benefit of having more Civil Society. Not necessarily Civil Society as a whole, but, going back to my example, you need aware CERTs and teams that are trusted and understand the issues. If they are collaborative and there are projects between different groups, I think that would be very, very valuable. It takes a lot to ramp up a big project, and Access Now is just now becoming a FIRST member. Can it really take on that role? Maybe it is more about partnering, more about working together, finding projects and opportunities to become close to one another, so that they understand what the needs of Civil Society are and what the potential issues are when a new issue comes up, and there are open channels of communication to solve the problem.
>> I would like to add to that. In the past few years, we have had more and more moves toward having national CERTs as government CERTs. And what we are seeing in the mesh of the community, because we also have what we call pockets: we have groups for the industries, on standards. They are very worried about product security, about how to communicate, and even about liability. So we also have special interest groups, because usually we are not talking about the security of a single network; we're talking about different challenges there. What you see is that every time CERTs move more into some parts of some governments, they just don't show up anymore. They don't talk anymore, and the (?) is rolling. I think one of the points that is very interesting to see is that as they become more government-centric, especially very focused on national infrastructures, they are not cooperating anymore. I would like to quote Marina, who just entered the room. I think a phrase she said was very important: she opened the session saying that this is the first time in history that governments cannot provide security without cooperation. A lot of governments are failing to realize that. They cannot provide security for their countries alone. I can provide security for my (?). (inaudible) provide security for himself. Everybody needs to do security; you cannot have someone else providing security for you. And the only way you can help with security is through cooperation. This is something we see happening. So one of the things is that governments are worried and terrified: what is security, what is intelligence, what is internet security? And they are not the same thing. And although they look alike, because some information that CERTs have can be seen as intelligence information or information from (?), if you start treating everything as classified, you are not going to be able to provide security.
I think from all this movement and shift comes a problem in trusting the CERT, because it is confusing. If I have a national team that is under an intelligence agency, who am I talking to? If I have a national team that is part of the federal police of the country, who am I talking to? The police or the CERT? I think what needs to be clear is that there is no right or wrong, but all governments need to understand the consequences of the choices they are making: where you put your national team makes a difference in how well it cooperates and how it can talk to different stakeholders. We, as a CERT, if you know who to talk to in Brazil, you can talk to us. We are focused more on cyber hygiene than on national security, so really on how we can improve internet security. But we also have a team for the federal government, and they have a big role there, and there are teams in other areas that can be more focused. What needs to be discussed is that there is a place for different models, but depending on what you do, it has an effect on how you cooperate and how you share information. So I think that is something everyone needs to think about, and to go back home, align, and understand the consequences of some of the policies for the Internet.
>> To the government representative: would you like to respond to that?
>> Yes. It is something that has to be very well thought through. In the specific case of Portugal, the national security strategy for cyberspace tried to create different dimensions. The national centre runs the CERT but also deals with cybersecurity as a whole, and cyber defense is a separate area. We cooperate, but we are, or we try to be, extremely careful not to mix up the roles of those dimensions. Sorry, my throat is also not prepared to speak. So it's something that is very crucial. We must rely on the democratic rule of law as our CERTs build significant roles in these areas.
>> Thank you for that. So, on this whole question: since CERTs do a lot of technical work, and coming from Civil Society and the human rights field, I have a question about how human rights are integrated into the processes and the work that your (?) do. I would like to start with you, Audrey: the way human rights come in.
>> AUDREY PLONK: Thank you. It starts at the top of the company, where we recognize the declaration as part of our corporate social responsibility and have policies, in line with the declaration, on what we think about human rights. So that's the foundational principle that we look at everything through. When it comes down to nitty-gritty product issues, what we're generally doing is trying to work with our technical stakeholders to find solutions to problems, and we factor in a lot of different things when we look at that. Most of this is governed by corporate policies around how we establish product review teams, how we manage them through the process, and how we engage externally with academia or CERTs or governments. Some of those roles are mixed; in some cases they are dual-hatted. That can be complicated for us, as Christine mentioned: sometimes we have a CERT that is a government but not a government, or we don't know which part of the government. Sometimes it works really great and sometimes it is hard to figure out. On the disclosure process, we look at what the issue is, who is most likely impacted, what the timeframe is, and when we need to get information out. For us, a lot of the time we're fairly removed from the end user. You probably don't go (?) or update firmware; most people don't do that. You go to your original equipment manufacturer. Everyone in the Private Sector has a bit of a different role to play in bringing this forward, but we do factor this in when we're going out to our customers, who are often the people who make the devices. When I say customer, I don't necessarily mean end user. We look at who is using the particular product. We look at whether it's a human rights group. If we know that a group is particularly relying on a feature, on a set of technologies, then we'll factor that in. And so that's all.
But there's a lot of discretion in those policies too, because the issues tend to be different and the impact can be more severe or less severe. We do see more severe cases over time: if you look statistically, more issues are found every year, and the severity of the issues gets a little bit greater. It is very hard to normalize the numbers relative to adoption of the technology. Even if you could normalize for rate of adoption, it's hard to know exactly whether you'd be seeing a net aggregate increase or whether things would be kind of stable. And then there are other factors that come in with open source, and with working with the FIRST community when there are issues and fixes get integrated. But we do factor in human rights, and it comes again from our foundational belief in the declaration of human rights and the idea, which is in our social responsibility, that trust in the use of technology is part of being able to use technology.
>> Thank you.
>> CHAIRMAN: Thank you. So there's another topic I would like to address before we go to some questions. Recently, certainly in Access Now's research, we are seeing a trend towards criminalization of technical expertise. We saw this recently with the incarceration of security trainers who went to Turkey. We see efforts to make the use of encryption illegal. We're seeing VPNs banned in China, and people starting to be prosecuted for that. So I would just like to first throw it to you, to talk about particular challenges for Civil Society in relation to this.
>> GRACE GITHAIGA: I recall (?) working towards a particular election. It showed that the systems, the electoral systems, had been hacked. And as a result of that, there is now a response that we need a practitioners' bill to determine who is an ICT practitioner. So that's a response; it's a reaction to a situation that has not been thought through. So I think one of the challenges is having responses to situations before people can think them through and before people can look at the (?) of the systems or the people working on them. Because sometimes people who hack systems will tell you they're not hackers, they're just experimenting with technology, and I think banning that is interference with innovation. So we're looking at those as some of the challenges, but it is something that's ongoing.
>> CHAIRMAN: Thank you. So what role do CERTs have in pushing back against this criminalization of technical expertise? Anybody have a view on that?
>> This topic is very hard everywhere. In Brazil, there's also some talk, especially because of crypto, about (?) systems that it is safe to have access to the contents of communication. We have been working a lot to try to educate policy makers on the consequences of policies that might have worked in the traditional world but will actually have some very bad consequences on the internet. Especially with crypto, one of the things we say is: okay, so you say that this company or that company needs to weaken the encryption on their service and provide some exceptional access, or however you want to rephrase it so as not to say you're putting in a backdoor. Okay. So the criminals will just create their own (?), and that's it. You're not even going to have the metadata that you have today; you're not going to know anything, and that hurts the investigation. There is only so much you can push in this area, because right now criminals are using the apps that everyone is using. We could get to the point where no one is safe but the criminals themselves. This is something that needs to be put forward, and we need technical people in the discussion to really explain how the technology works, and to explain that it's really not possible to have exceptional access, because you always have the human factor: the keys can be compromised, somebody gets bribed, someone leaks information. We need to factor in all the human factors. There is no way to build a perfect technology that gives us both. From the technical perspective and from what we are discussing, it's much worse to try to create exceptional access than to work with the information that's already there for the investigation. I think we could get to a point where things are much, much worse, and the government and the legal system will have no information at all. So I think these discussions have not really been put forward, and I don't see a lot of the Technical Community being brought into the discussion to explain things technically.
You get more a discussion of passion than a discussion of: okay, let's evaluate where we go from now, what could happen, and really try to see the things that could get worse. People are trying to do the right thing and catch the criminals, but you can really push the criminals to build a place of their own. So I think we really need to get all the stakeholders together, and these are the moments that are really important: to get everyone in the room and try to imagine what can happen if we do something. For example, I have worked for over 20 years in security, and this is the first event of this kind I have attended; I had never thought about some of this. But I think it's a good thing to have the different groups in the room and to really think about it. And I think it's really bad that you have the other things we mentioned, for example all the unintended consequences of some government policies. Everyone has to realize that there is no way you can stop everything and protect only what you want; I think we have a good example of that not working. So I think it's really important for us to improve cooperation, and it is very difficult. I would like to see more CERTs providing and sharing their (?). This is something we can bring back to the CERTs: try to get them to share more about how they engage, and encourage them to speak up. I think it would be wonderful to have more people from Civil Society submitting presentations and case studies, and it would be just wonderful if someone could go and present something they were trying to do well that actually did not go well. So, how can we improve?
>> I would like to add a couple of additional comments on this idea of exceptional access. It's not a new discussion; obviously we've had it for quite some time, going on 20 years now. But it is cyclical, and we're back in the middle of it. At least one of the conversations we often have internally is that, if you look at the preponderance of evidence out there on how long anything stays secret, you have to be deluding yourself to think that any kind of intentional weakening of the product or exceptional access is going to somehow contain itself; it doesn't make sense in today's world. From a product perspective, our main issue is protecting our customers, whatever that chain of customers is. So we're going to be most focused on how we protect the end user, and that is going to involve cryptography. Most companies, I can't say every single one, but big multinationals, recognize that cryptography needs to be trustworthy, and we have policies in place for what cryptography we'll put in our products, and we're transparent about that. We look to the academic community in particular to help us make sure that algorithms, which are always vulnerable at some point, get the right due diligence before we put new things in products. That's why there's a lot of debate about national standards versus international standards and what the Private Sector puts in its products. The other thing I wanted to say is that some of the concerns, particularly from law enforcement, are very legitimate and real. I think for a long time the Private Sector has had the position that we want to be cooperative, but we want to do it within a transparent legal regime that provides us with confidence that whatever we're being asked to do is for the right purpose and does not undermine human rights and civil liberties or weaken somebody else's security. The legal regimes around the world are in very, very different states.
It's a source of tension between governments, and there have been good discussions over the years about how to reform legal assistance between countries and how to ramp up our multilateral engagements to include the digital world that we live in, but it is a very slow process. The technology moves much, much more quickly. Companies are left with principles, saying this is what we want, and you can look at the track record of the many companies who have taken issue with different governments over different requests they have gotten. But I do think that you have to factor in a lot of things, among them the legal regime on offer, how transparent the government is or is not, and what the company says it is doing. You won't find anyone that I can think of in the Private Sector who thinks that weakening the product is an acceptable way of giving access. If we're all relying on the same stuff, it makes everything weaker. That's why we have strong policies.
>> MAARTEN VANHORENBEECK: Thank you so much. Do we have remote questions? No? Okay. So we'll take questions from the floor then.
>> Thank you. Thank you for putting this session together; it's really been informative and engaging. I'm not from the Technical Community; formerly Civil Society, now private sector. One of the questions in the private sector, especially in emerging and developing countries, is the cost versus the benefit. It always comes up with small businesses, and the question is: what would be a good practice, a best practice, and how do you keep investing to make sure you're performing optimally? We have not really seen CERTs start putting out standards. They have done a good job, but when it comes to creating standards, that should be part of the conversation. My second question: speed, velocity, and relevance of information is something we seek when it comes to incident reporting, and that's the larger umbrella of the conversation. We didn't see anything, anything at all, come out in the mainstream public or in public spaces about the (?), the conversations, the critical issues they're addressing. So how do you put out this information? Otherwise it means more cost for small businesses and takes business away from these countries. What is the sharing of information you put in public domains, and what is the network of networks, starting at the national or private level? Thank you.
>> MAARTEN VANHORENBEECK: Get back to your step.
>> I think the challenge, from where you're sitting, is that there are lots and lots of companies putting out lots and lots of notices. There is information overload, and that information overload is there all the time, without the capacity to really have human eyes on everything. So the Private Sector is pushing out a lot of information. There are various organizations that try to work across a bunch of those big, big vendors to coordinate that information and proliferate it through a trust system. At least if you're connected to a CSIRT in your region, it would come that way. Or you can go directly to the vendor site, but then you have to know what you're looking for. And I recognize it's a big challenge, and I think it's a big challenge on our side too, to ask: are we getting enough information out to enough of the right people? Because we can't notify every country individually. It is not doable.
>> I would just like to add that the two questions you put are among the most (?). There is a (?) talking about metrics of efficiency; security metrics in general are not good, because it is very difficult to estimate the cost, and it's not just the cost of people or hardware. So there is no good cost model, and there's a lot of (?) on that. The speed of sharing information is really complicated, because every organization is the only one that can filter what matters for them, and small and medium businesses especially don't have a dedicated person for IT, let alone for security. This is one big open area that needs to be discussed more, including how we would act on it. So I don't think anyone has the answer, but those are questions that are being discussed.
>> We saw a trend of policies that was sort of like governments saying: just disclose all this information to us and we will handle it. I think, you know, very quickly it becomes obvious that it is really too much information, and not all of it is actionable. What you really want is something you can do something about. If you're not in a position to do anything, having it really doesn't help you.
>> Thank you. I've been a board member, and I've tried to make sense, for companies, of how to deal with cybersecurity issues. One thing I find interesting is that you say there's too much information out there on the CERT side, but actually small and medium-sized companies crucially lack information. So I was wondering: is there some concept of rolling out CSIRT information on a more local, regional scale, or some initiatives going on? Because I think it's exactly the opposite for the end user, who is scared. We have such a big divide. Thanks.
>> The practice changes from place to place. If you look at the website of our national CERT, we publish vulnerabilities of systems. An end user or a company may apply to receive alerts for specific operating systems or applications, machines, and so on. So we publish vulnerabilities. The main problem is SMEs, specifically SMEs. Big companies are more or less aware, some fully aware. Sometimes the CEO prefers to invest in advertising the company and the product rather than in security, and they don't understand that after two years they may have a major breach or a major failure of some security platform, or an internal data breach, which is very significant. But with SMEs, it's a problem. The economy of our country, as in many other places, depends on them, and it is crucial to make them aware that this is a transformation being asked of them, looking at the different dimensions of the challenges. For example, even two very big ISPs have had major problems, and some SMEs with services on the Cloud have faced major disruptions for almost a month. So it's challenging. In our national CERT we try, and we have agreements with, for instance, Microsoft, and we redistribute some of the vulnerabilities in an automated (?) to our user base. It's a difficult problem, but it's something that has to be done, because we don't have a vision of all the needs of our society. We do a lot of training activities, and those are a very big challenge. Just (?). In January this year, we held an event on ransomware, because we were seeing ransomware rising very quickly, and when WannaCry came in May, some big companies were prepared. The others were not, even though we had tried to pass the message. The media are usually not very favorable to these things: if there is a major crash, then the media come and they like to interview everybody, but if you are just sending warnings, it's very difficult. But we are here to try to do our best.
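The subscription model described above, where users register the systems they run and receive only the alerts that affect them, can be sketched in code. This is a minimal illustration, not the actual system of any national CERT; the advisory format, product names, and e-mail addresses are all assumptions for the example.

```python
def relevant_advisories(advisories, subscriptions):
    """Return, for each subscriber, only the advisory IDs that affect
    a product they registered for. This is the filtering step that keeps
    a small company from drowning in the full vulnerability feed."""
    out = {}
    for email, products in subscriptions.items():
        wanted = {p.lower() for p in products}
        out[email] = [a["id"] for a in advisories
                      if a["product"].lower() in wanted]
    return out

# Hypothetical feed and subscriber list for illustration.
advisories = [
    {"id": "ADV-2017-001", "product": "Windows Server", "severity": "critical"},
    {"id": "ADV-2017-002", "product": "Apache httpd", "severity": "high"},
    {"id": "ADV-2017-003", "product": "OpenSSL", "severity": "critical"},
]
subscriptions = {
    "ops@sme.example": ["Apache httpd", "OpenSSL"],
    "it@shop.example": ["Windows Server"],
}
alerts = relevant_advisories(advisories, subscriptions)
```

Each subscriber then gets a short, actionable alert list instead of the whole feed, which is the point made later in the session: information only helps if you are in a position to act on it.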
>> One more point. Most of those affected are in fact SMEs. What we need them to do is basic cyber hygiene: keep up with patches and run the latest version of the software. It didn't affect people who were up to date with patches. In the months after, we did some work for SMEs, because it is a big problem, especially because you could see the effect on social media. So education is a big point. We now have a lot of awareness materials for end users, we have portals for that, we prepare material for people to give classes, and we have now started some focus on SMEs. The first one is on ransomware. So I think the point is really how to make them aware that they need to do some basic things: update and back up. I think it is really a communication problem more than anything else.
>> From the Civil Society side, I wanted to add to that. The Civil Society CERT is interested in looking at the vulnerability disclosures issued by all the CERTs, reinterpreting them, and reissuing them specifically for their impact on Civil Society. We will look at that, and at how you distribute that information as well. Another question?
>> (?), Georgia Tech, Internet Governance Project. So you're talking about the development of trust between Civil Society and CERTs, and you're describing the way internet security used to be done, which is that there were small groups of vetted communities who shared information freely with each other. This whole model has been disrupted in the last 10 years because, number one, national security has come into cybersecurity, so you get national CERTs that are not going to share information with their adversaries, and you get more of a rule-based regulatory process. It's not clear that the old model would scale to the level that we need. It's great that Access Now is starting a Civil Society CERT, and one solution is to start new transnational ones, but there are more and more CERTs. How do we scale the process of cooperating? This is what I was wondering when I raised my hand a while back; I thought maybe the question was too complicated and we were running out of time, so I didn't ask it until I was forced to.
>> That is one of the major topics. It was a major topic during the two years of the Best Practice Forum; it's a major topic of best practice in cybersecurity. But with information sharing there is also some kind of myth that you are sharing secret information and secret techniques, rather than technical information. One of the things that I think is a key issue we face, and that maybe someone can help with, is really how to minimize data. What is minimized data? Most of the time teams don't know if they have minimized it enough for it to be safe to share, because we are trying to protect privacy and protect data. So there are some technical issues: it is not necessarily because a CERT is under intelligence or national agencies that this happens; sometimes CERTs are just over-cautious. They don't want to share too much information. I have some data I would like to share, even for research, but we don't know how to do that safely and in automated ways. So in the end, we share results and statistics, and we don't share the actual data. I know a lot of people say CERTs need to be more transparent with the data they have to share, but usually the holding back is because we're trying to protect users. It's hard. It is not an easy question, but it's a question that is key to building trust and sharing.
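The data-minimization step described above can be sketched in code. The following is an illustrative sketch only, not any CERT's actual tooling: before an incident record leaves the team, direct identifiers are dropped, coarse values replace precise ones, and only the actionable indicator survives intact. The field names, the salting scheme, and the record itself are assumptions made for the example.

```python
import hashlib
from datetime import datetime

def minimize_record(record, salt):
    """Reduce an incident record to the minimum a peer team needs.

    - Drop direct identifiers (the victim's e-mail never leaves the team).
    - Truncate the source IP to a /24 so no single host is named.
    - Coarsen the timestamp to the hour.
    - Keep the malware hash verbatim: it is the actionable indicator.
    - Replace the internal case ID with a salted hash so later reports
      can be correlated without revealing internal numbering.
    """
    ip_prefix = ".".join(record["src_ip"].split(".")[:3]) + ".0/24"
    hour = record["timestamp"].replace(minute=0, second=0, microsecond=0)
    case_token = hashlib.sha256(
        (salt + record["case_id"]).encode()).hexdigest()[:12]
    return {
        "case_token": case_token,
        "src_net": ip_prefix,
        "malware_sha256": record["malware_sha256"],
        "seen_hour": hour.isoformat(),
    }

# Hypothetical incident record for illustration.
record = {
    "case_id": "BR-2017-0042",
    "src_ip": "203.0.113.77",
    "victim_email": "user@example.org",   # never shared
    "malware_sha256": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    "timestamp": datetime(2017, 12, 18, 14, 35, 2),
}
shared = minimize_record(record, salt="per-exchange-secret")
```

The hard part the speaker points to is exactly what this sketch cannot decide by itself: whether a /24 and an hourly timestamp are coarse enough for a given exchange is a judgment call per dataset, which is why teams tend to fall back to sharing only statistics.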
>> It is a really interesting problem, because I do think the model has to change, particularly if you want, as you suggest, to have independent research with a broad scope across the world. Just look at the funding of those efforts. Funding is a really tough problem. I've been involved in a worldwide Civil Society CERT, and funding is a constant issue for us. Funding comes with assumptions about where the money is coming from and whether the operation is compromised or not, but also just getting enough to adequately start the operation is a real challenge. So, any ideas? Come and see me afterwards.
Okay. I think we're just about at time. We might have time for one more question, if there is one.
>> If there are no other questions in the room.
>> For the record, we have seen governments not really getting internet governance; they're not on the right side of net gov. How do you see CERTs informing policy making? We see approaches to incidents where governments take a punitive approach rather than incentivizing best practices and upscaling vulnerability protection. Do you see CERTs contributing to policy making actively?
>> MAARTEN VANHORENBEECK: You are looking at me. Well, I have been in this position for 18 months. I am a professor at a university, and I was a founder of the Internet Society chapter. So I have been following the IGF since the very beginning, and (?) since 1998. So I have a set of values that I try to apply in my practice. Sometimes I am a bit concerned, especially looking at some places in the world, some countries, where what you used to see as normal democratic processes, with checks and balances in different areas, are changing very quickly. In my specific case, I am trying to practice every day what I believe are the best practices of the democratic role of a CERT: to protect society as a whole and its users. I have a note; it is not directly related to your question, but in May 2015, (?) issued a statement on lawful criminal investigation with respect to data protection, basically against weak encryption, and (inaudible): strong encryption is crucial for e-Commerce, for e-Government and, of course, for the protection of human rights. I suppose after these three days we will take away more ideas to apply to our practices, but the U.N. charter should be considered a very relevant thing in our daily practice. There are challenges out there in the world; let's wait and see.
>> MAARTEN VANHORENBEECK: Thank you. That's all we have time for. I would just like to thank the panel. There was a lot of content in there for me, so I hope you got value out of it as well. Okay. Thank you, everybody.