IGF 2022 Day 4 WS #454 Accountability in Building Digital ID Infrastructures

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR: Good morning, everyone.  Welcome to Workshop 454, Accountability in Building Digital ID Infrastructures. 

     My name is Bridget Andere.  I'm Africa Policy Analyst at Access Now, and I will be moderating the session. 

     I'm going to let our wonderful panelists introduce themselves.  And I would like them to tell us their name, what they do, where they are based, and one interesting thing about themselves professionally and one interesting thing about themselves that is not work related. 

     So I'm going to start with Mercy who is joining us online.

     >> MERCY SUMBI: I thought you were going to start with the people who are present so I could buy myself time to think of answers to your questions. 

     Hello, everyone.  My name is Mercy Sumbi.  I hope you can hear me loud and clear.

     >> MODERATOR: We can hear you, great.  Yeah.

     >> MERCY SUMBI: Okay.  Perfect.  I'm based in Nairobi.  I'm a litigation lawyer and most of my litigation centers around digital rights, human rights and AI.  Basically cases that are intended to define our digital future.

     One interesting thing about me professionally is that I'm a geek so, you know, this field comes naturally to me.  Pursuing any internet governance project is a passion for me. 

     One thing interesting about me nonprofessionally is that I think I can sing but I can't.  But, you know, I still think I can sing, so yeah.  That's about me.

     >> MODERATOR: Thanks, Mercy.  And I believe everyone can sing, so, yeah, if you think you can do it, you can definitely do it.  Laura.

     >> LAURA BINGHAM: Hi, I'm Laura Bingham.  I am lawyer by training and my background is in mostly litigation and advocacy on nationality rights, borders, documentation of identity and statelessness.

     I am currently the Executive Director of the Institute for Law Innovation and Technology which is based at Temple University in Philadelphia in the United States. 

     And something interesting about me professionally.  I guess I mean, you know, I don't know, the only thing that comes to mind is that I always feel like a novice when we are talking about technology and internet governance and digital questions.  That was not something that was ever a part of my training or really professional interests for a long time, but now it has become a passion.  So I always feel like a little bit of an imposter in conversations like this. 

     Personally, I don't know.  I can -- I can touch my arms behind my back even though one arm is shorter than the other from a really bad break when I was little.  Maybe that's, I don't know, is that interesting?  I'm going to stop there.

     >> MODERATOR: That is interesting.  Thanks for that.  Yussuf, please go ahead.

     >> YUSSUF BASHIR: Good morning, everyone. I have a very interesting chair here. 

     My name is Yussuf Bashir.  I work with -- I'm the Executive Director of Haki na Sheria, an organization based in northern Kenya.  We work on human rights, we work on statelessness, we work on nationality issues.  As well as a bit on climate change.  Professionally I am trained as a lawyer, so I practice as an advocate in the High Court in Kenya.

     Something that is -- I think the question about something interesting professionally is very difficult.  I would say that I came through to working on these issues of digital rights and digital IDs because of experiences on the regular IDs in terms of discrimination and things like that. 

     On the personal front, I have recently got into wildlife photography.  And it's going really well.  So if anybody is interested in having a conversation about that, we can talk.

     >> MODERATOR: That's going to be really hard to beat, I think.  Thomas, please try.

     >> THOMAS LOHNINGER: I will fail, Yussuf is the best. My name is Thomas Lohninger.  And I'm based in Vienna, Austria. 

     The interesting thing about me professionally is that without Star Trek and anthropology I would have never made it into the digital rights field.  I've been there for around a decade. 

     I'm Executive Director of Epicenter.Works, an NGO in Vienna.  And I'm also serving as Vice President of EDRI, a network of over 45 digital rights and privacy organizations in the European Union. 

     The interesting thing about me personally.  I don't know.  I have taken up swimming.  Best sport ever.

     >> MODERATOR: Thank you, Thomas.  Great to have you.  Elizabeth.

     >> ELIZABETH ATORI: Well, being the last one is a little bit difficult.  But my name is Atori Elizabeth.  I'm based in Uganda.  I work for an organization called The Initiative for Social and Economic Rights.  So I have had about five years experience advocating for social and economic rights. 

     And so my practice as an advocate is very much pushing for the right to education and health and business and human rights.

     One interesting thing about myself in terms of work is that for the digital rights sphere the thing that really did push me was working with older persons in Uganda.  And it is from there that then I started to question the challenges that people face in terms of access to social protection.  So that's what propelled me to this field in 2019. 

     An interesting thing about myself personally.  Let me just clear my throat.  I think the one very interesting thing is I do not love chocolate.

     >> MODERATOR: That is definitely very interesting.  So we have all come together here today because I think most of us in this room, if not everyone, have experienced digital ID in a very interesting way.  And we are trying to figure out a way, strategy-wise and policy-wise, to ensure that the digital ID systems that are being deployed all around the world remain human rights respecting. 

     And trying to figure out through the session how to hold people, and stakeholders generally, accountable when it comes to digital ID procurement and digital ID processes.  Not just in the region -- and, of course, my interest would be in the region because that's where I'm based and where my work is based -- but around the world.

     We are hoping that the session will be as interactive as possible.  We will start with the panelists, but please feel free to save your questions to the last 30 minutes.  But if you really feel like you need to intervene, just put your hand up and I will make sure to come to you in case you think that it is relevant for just that moment and you can't hold it until the end.  I'm happy to do that.

     So let's get started.  And I hope this is going to be useful for all of you as it will be for us.

     The first question we want to dive into is what are the obligations of corporate actors participating in the digital identity and digital public infrastructure markets including, of course, multinational enterprises to perform human rights due diligence which is something that I'm sure all of us have been talking about not just with digital ID but also in other facets of digital human rights. 

     I'm going to start with Laura who can just give us a little bit of insights into what your experience is with this and how we can make it better around strategy and policy and advocacy.

     >> LAURA BINGHAM: Thanks, Bridget.  Okay, so just a little bit of background.

     My work has mostly -- my work on digital ID has also mostly been in this region.  And in relation to corporate actors that are not based in this region that are mostly based in Europe and North America.

     And I come at this as a lawyer and litigator and from a more enforcement perspective as opposed to thinking about regulatory frameworks in the abstract.  And when we think about obligations, you know, those have to do with specific actors and specific actions and effects, you know.

     And those things need to be defined before we can really start talking about what are the obligations and how do they apply in a specific case.

     And that is not really what we find in the realm of human rights due diligence, not as yet.  So there is really a fundamental problem at the outset, I would say, in defining obligations that has to do with attribution, or understanding who is doing what.  And that is because, at least in part, a lot is obfuscated and hidden from public view: how digital ID systems are conceived, at what moment in time, who the actors are, who the stakeholders involved in that conversation are.  Usually it's vendors from multinational corporations and state actors. 

     Tender processes and procurement more generally can be shrouded in national security protections; contracts dictate specific terms; who is doing what, you know, just the basics are not public.

     So we have a problem of evidence of who is doing what, which tends to obscure what the obligations really are.  But we do have information about actions, the general sphere of actions that are taken by multinational enterprises.

     And the one thing that I would point to here and then let colleagues on the panel elaborate on is that most of the time the client is a state actor.  Is a state, a government.  And I think that is really important for human rights due diligence when it comes to digital ID technology and software because it is just fundamentally different from selling something to a 20-person office in a private enterprise, for instance. 

     I mean you're talking about an entity that has a monopoly on violence.  That has control over police powers, that has a substantial monopoly on all administrative information and can have control over a lot of private data.  We know that.  Especially in a space like this.

     So that ought to mean in terms of the obligations that there is a real distinction.  And there needs to be a much, much higher threshold for the conduct of human rights due diligence for enterprises entering into the identity and security space.

     And the last thing I might say is we do know something about the effects.  While there is an attribution problem, and there is often a real difficulty in establishing causal links between action and effect, we can say a lot about the evidence of that and how we need more. 

     But we know that these are dual use technologies even though they are not defined as such in a lot of export control regimes.  You can just look at the case of Afghanistan, for instance.  One thing that a lot of folks pointed to around that time was a civilian digital ID system that was used for payroll, for paying government employees.  When the digital ID system fell into the hands of the Taliban, it was immediately converted into a targeted killing platform. 

     So I mean it is obvious that these technologies taken as a whole are a dual use technology that they can be used for operations like that.  So I think that is another principle for human rights due diligence that really needs to take hold and be understood as a cross-cutting obligation.

     >> MODERATOR: Thanks, Laura.  And you bring up really interesting points. 

     Yussuf, I know that your experience has been interesting and unique in the work that you do especially around double registration. 

     So I'm wondering exactly what has been your experience when it comes to HRDD around these technologies that are being deployed, especially around people in conflict and people in refugee situations?

     >> YUSSUF BASHIR: Thank you, Bridget. 

     I think Laura does a fantastic job of creating the background for the conversation around who gets to suffer the most whenever these digital IDs are rolled out.

     On the one hand, there is a combination of state power with corporate power.  And on the other side of the equation are these communities that are living on the margins, communities who have in most situations, because of either numbers or historical issues, suffered marginalization and inadequate access to resources.

     Therefore, these technologies that are being rolled out, you know, placing these individuals in digital databases, are not really meant to facilitate, for example, their way of life, the economic situation which they have developed, you know, over millennia.  Rather, these technologies are cutting edge, but they are meant to control them and to figure them out, you know, and place them in a certain box. 

     So the situation of double registration I think illustrates very well the effects of rolling out, you know, cutting edge technology on communities on the borders.  And it also illustrates, to a certain extent, something that is going to happen more and more because of challenges such as climate change.  It has already happened, for example, on the border between Kenya and Somalia. 

     But it will, I suspect, happen more and more in the future because more and more people are on the move, their livelihoods being destroyed by the impact of climate change.  In the 1990s, because of the drought, host communities living on the border, attracted by the facilities being offered to refugees, went to the refugee camp to seek food rations, healthcare and such. 

     And they ended up being put on the refugee database and have spent decades trying to get off that refugee database.  Essentially they find themselves in situations where they can't access government services that they are entitled to as Kenyans, becoming essentially foreigners in their own country. 

     You will find that the digital database was rolled out in that particular camp before it was rolled out in the rest of the country.  So it provides a good illustration of the fact that a lot of these systems are being rolled out in places where there is little accountability, and where the communities may not be as organized in terms of pushing back or resisting.

     So you will find it was done there before the rest of the country.  And it has taken, you know, a long time, through advocacy by the organization I work for and litigation in collaboration with some of the civil society organizations that are here today.

     I think -- now I think the tide is kind of turning, in that the first batch of individuals was deregistered late this year.  But still more than 20,000 individuals remain on the database, and it continues to be a problem because people are still being registered in a lot of ways.  Droughts are still raging, so people are still forced to look for ways to survive.

     And therefore I think it is good that we are having conversations around these systems and the kind of people who are affected by these systems.

     >> MODERATOR: Elizabeth, I know that ISER put out a really interesting report, what was it, about two years ago on the effect of digital ID deployment, especially on People with Disabilities and people in certain contexts.  And if you have some comments on human rights due diligence, we would like to hear them.

     >> ELIZABETH ATORI: Thank you very much. 

     And yes, indeed, we have done extensive research on the exclusionary effects of implementing a digital ID system in a country where the systems that are supposed to be supporting the digital ID system are either nonexistent or weak. 

     And if I might just give one example that we clearly identified in that report, which is "Chased Away and Left to Die."

     We published the report with our partners from NYU and also from Unwanted Witness.  And we clearly highlighted through the research that the digital ID system in my country, which is Uganda, sits within a much wider identity ecosystem which is supported by the national ID system.  So it is not just an ID system meant to support social protection alone; it is much more expansive.  As a country we have decided to make access to social services mandatorily tied to a digital ID.

     So you have a system that then becomes the single source system for social protection, health and education.  What that means is that over 16.8 million Ugandans lack IDs because of systemic challenges, because of issues around access, because of the challenges that my colleagues have spoken about around cross-border populations.  You have about 16.8 million people not having access to a national ID and actually being excluded from accessing the very basics, which are health and education. 

     It might sound very easy to say, but in a country where the majority of the population lives below the poverty line, having access to public health and education is really the difference between life and death. 

     So what the report does highlight is the fact that we are eventually leaving people behind.  But not just leaving them behind; we are actually dooming them to death, because if I can't access public health and I can't access education, what then happens to me?  And that speaks to the duties that corporations have.

     Not only multinationals but even within our system.  Yes, the government does control the national register, but while we are providing services like social protection, we have a system where persons above 80 years are given about 25,000 Ugandan shillings, that is, I would say, less than $30 a month, to be able to survive and to maintain a certain level of livelihood.

     But the way the system works is that a private entity is the one that disburses these funds, while relying very much on the national ID system for authentication for issuance of the funds.  And if you are unable to provide authentication -- let's talk about biometrics in a country where most of the population is peasantry -- what you have is a situation where the fingerprints are not easily read. 

     The way they are designed and implemented affects the population and as such leads to exclusion.  What obligation do the corporations have in this case?  An obligation to ensure that the systems are inclusive, and they cater for everyone. 

     Our national ID systems or digital ID systems are supposed to increase access to social services, ensure that those that do not have them are able to get them, and lead to more accountability from our governments.  But that is not what is happening on the ground.

     And so you find that the policies that have been put in place for national ID or digital ID systems are very removed from the realities that people face.  And as such, there is a real need to provide access to remedial mechanisms, and even to have a discussion around what the alternatives are.

     So we hope to have a society where everyone has a digital ID, but do we see that day coming in a country like my own, Uganda?  It is very unlikely.  What then do we do for the particular persons left out and cast out, who seemingly become settlers, unable to gain access to the much needed benefits that they require?

     So I believe these discussions are very timely, and I would like to hear what other people have to say about it.  Because more and more we do speak to the need to have digital ID systems, but we forget those that we leave behind.  How then do we include the disabled?  How do we include the tribal minorities?  How do we include the cross border communities that even before the system was introduced were already discriminated against?  So if it is a system that is grounded in security purposes, then by its very design it is meant to exclude those that are seen as people who shouldn't be included.

     And a good example is that between the border of Uganda and Rwanda there is a particular tribe that has very similar appearance features.  And you will find that if those particular people now go to register for a digital ID, they will be told they are not Ugandan enough.  But what is Ugandan enough?  Why are we removing nationality from people and basing nationality on a digital ID system when they already have these particular rights that they should be enjoying?

     So for me, I would say the roles of corporate actors in digital ID really speak to accountability.  And accountability goes beyond just ensuring that the systems are up and running and people are being mass enrolled, to ensuring that every single person that has to get a service based off a digital ID should be able to access that service with or without a digital ID.

     Thank you very much.

     >> MODERATOR: Thank you, Elizabeth.  And you bring up something interesting about how these -- systems are designed to exclude. 

     And for me, that brings a feeling of just how colonialist the structures are. 

     And particularly I think the communities that you have worked with, Yussuf, what would you have to say about just the feeling of the structures being colonialist and the tech and how they are being deployed being colonialist seeing as they are designed to exclude and set people apart?

     >> YUSSUF BASHIR: Well, I think the colonial enterprise first began on the African continent through the use of pencils and rulers.  People took the map of Africa, drew straight lines with pencils and rulers, and did not take into account how communities actually live. 

     And immediately what happened was colonialism became an enterprise of domination and extraction, and the people living on these borders came to be considered a threat.  And the control of them was always because of issues of lifestyle and the way they moved around.  Of course, the colonizers were never comfortable with these communities because they didn't know where to find them.

     And you will find that even the manual regular ID was rolled out essentially as some kind of a visa system, before we had this modern visa system, so that these groups could be asked to produce it on demand whenever they were moving. 

     You will find, looking back at the Kenyan one or others, that they used to have these huge things you wore around your neck which detailed your tribe and name, and that is how the ID process started.  In South Africa, the apartheid regime was advanced in this.  These are systems that are very closely linked to control and domination, as opposed to the way Africans, or the way these communities, actually live and practice their lives day to day.

     And there was the whole process of even requiring individuals to submit to having their biometrics and pictures taken without option, because, you know, if you don't submit your biometrics and your documents, then you are shut out from government services.  You don't access education or healthcare.  Essentially you have no option but to comply.

     Now, with the transition to digital IDs, you can draw parallels with that.  For you to access government services these days, to register for a document or passport or anything, you have to have an ID number and an online system that you have to go through.  And this new system is being rolled out in a largely similar way to the other one; the way it's rolled out is not about facilitating the communities that have never been treated as standard, you know, the people on the edges.

     And these individuals are the ones now who are again suffering exclusion in this new enterprise.

     >> MODERATOR: Thank you.  And Elizabeth brought this up, and I'm sure it is on all of our minds, really, that a lot of the time when these systems are being deployed, the biggest justification and the biggest reason we are constantly given is security.  Which, of course, then leads to the aspect of surveillance and a lot of data collection and, you know, generally just having people constantly being watched and monitored. 

     Mercy, I would like to come to you for a second and get your thoughts on how, you know, within policy infrastructures and within the law, we can move away from structures that are inherently used for surveillance in a way that is disproportionate, and ensure that we are still achieving what we need to achieve with regard to deploying social security measures, or even the security measures being constantly spoken about, but without disadvantaging people?

     >> MERCY SUMBI: I think, coming from my perspective, and to put this in context, I operate in a world of paranoia where you imagine the worst case scenario and then you litigate based on what you are trying to avoid.  It is not advisable to start from a resigned point of view that we must have these systems.  I think the challenge begins with the question: should we have these systems?  So you have to question even the intention to have the system.

     When I look at the litigation that has happened in Kenya around the digital ID and other forms of databases generally, I think one of the things I can draw out of it is that a country has to have minimums, and you must be able to define what those are within your context. 

     It is very easy for us to talk about, you know, Aadhaar and try to put it in the European context and see how that wouldn't work, and then miss that India is never going to be, you know, Europe; India's context is its own.  Within India's constitutional background, is there a way you can imagine that the constitution would allow it?  In our case, you start with the constitution: what is the basis?  Is there a necessity for it? 

     And if you can't show the necessity for it, it is struck down at that point, so you don't always have to work backwards.  You don't have to -- I know that is what happened to us: we were chasing a train that had really left the station, so it was that much harder to put the genie back in the bottle.  But it doesn't always have to be that way, and that is where national security comes in.  One of our minimums has been defining what national security looks like.

     Because it is one of the most abused exceptions when it comes to limiting human rights.  So we have had good cases where the courts have put their foot down and said if it doesn't look like this, then it is not national security, right?  So insecurity in the streets, people stealing, robbery, that is not national security.  That doesn't rise to the occasion.  It has to threaten the integrity of a country, you know, threaten human rights on a large scale, like another country invading you, your borders, whether it is a digital threat or a traditional threat.  It just has to rise to the level of what is defined.  I think that is one of the things that is important.

     And maybe, if you are coming from a litigation or public advocacy point of view, it is to pursue cases that push the courts to define what the minimums look like.  So we are agreeing that one way or another the right to privacy is going to be affected, okay.  But what must the system look like?  What can't be derogated from?  That would be the starting point.

     And then you quickly realize it has to be a joint effort.  There is always one arm of the government that is pushing for the ID more than the others.  If you are unlucky, you get the legislature and executive conspiring, so it is like two versus one.  But usually it is the executive with a very specific agenda, and half of the time you are not sure what the agenda is because it isn't transparent. 

     So my lesson learned from the field is that it can't be one department.  You have to approach it from all angles, right?  So it is the parliament that gives you the laws on the minimums, which have been defined over the years, maybe by the judiciary or just by the culture of a country.

     So you push for advocacy through parliament to keep restricting that space more and more, not the human rights space but the space for the government to misbehave, by continuing to legislate on it.  Because one of our minimums now has become that there has to be a legal framework.  If you are saying we are going to have a digital ID system, there has to be a legal framework.  What does it look like?  So, advocacy on that end.  And then with the executive, it is also about trying to understand what they are trying to achieve.

     For a very long time our assumption was that social surveillance was what was important.  And I'm guilty of that.  I always assumed it was first security and then national security.  And judgments came out, and they are no longer relying on national security.  We assumed it was social surveillance.  But now it's looking different, like financial surveillance.  Who they are partnering with is no longer just tech companies; it is now mobile money companies, which suggests that the goalposts have moved and perhaps our advocacy agenda hasn't moved with them.  Do we understand what whoever is trying to implement the digital ID system is trying to achieve, and are we able to meet them at their point of mischief?  Yeah.

     So I think, for me, when you step back you can paint the picture and see the different actors that you need.  But at the top you always need the judiciary, and I'm not saying this because I like litigating and arguing.  I say that because you have to draw the line and say someone has gone too far.  If you want to collect biometric data, you can't do it just because you want to distribute relief food.  You are going too far.  So your advocacy efforts cannot be directed towards one arm of the government.  They have to be across the board. 

     Because everyone has their role to play and sometimes the goals are shifting too quickly.  Before you realize it, before you rush to court something else has happened, yeah.

     >> MODERATOR: Thanks, Mercy.  And you made a really interesting point that the goalposts are always moving.  Constantly moving, and that makes advocacy a little bit difficult every day.  And that is something that I would like the room to think about as we go along: what is working, what isn't working, and how do we need to change, how could we get to be a little more nimble with regard to how we respond to things. 

     Thomas, I would like to come to you on the same strand and just gather from you what your experience has been, bearing in mind that, you know, compared to the other panelists your context is rather different.  How has your experience been moving away from surveillance and military kinds of processes and infrastructures?  How has that been for you?

     >> THOMAS LOHNINGER: Thank you.  First I would like to start by saying that the debate we are having in Europe about this is extremely ignorant of the perspectives from other areas of the world.  It is almost as if Europe has nothing to do with what we have heard before.

     And we are in the middle of a huge reform in Europe.  We are building a quite important system, the backdrop of which is the COVID pandemic and the experiences we had in that time.  So there is a new huge push for more e-government services. 

     In Europe we also had the COVID certificates to verify that you had been vaccinated or recently tested.  And that also created a stronger proliferation of digital ID systems in everyday interactions.  In many countries you couldn't go to a restaurant or theater without showing and using these systems.

     And in June of 2021 the Commission proposed new legislation, the so-called eIDAS reform, and it is one unique harmonized system for identifying natural and legal persons and verifying attributes about them.  And you can also use it to log into services like Google or Facebook.

     And the purpose of the Commission was really to push this into the whole European market, with 80% of people using it by the end of this decade.  It is important to stress that it's an open system.  We don't know what attributes will be in there.  It could be your COVID-19 status, HIV status, information about your family situation, family status, age and driver's license, but also your university diploma.  And who can check these attributes is also open.  We could see government uses, and border situations when you want to enter or leave a country, as well as, of course, industry interests. 

     Basically everybody with a customer relationship system wants to participate here.  So we are talking about e-commerce, but also about advertisements, targeted advertisements.

     The privacy repercussions could be huge.  And we started to work on this quite early on, with a unique experience right at the beginning: when we came out and criticized the proposal, it was the Commission who wanted to talk with us.  Usually it is the other way around.  They wanted to talk with us because they expected us to be happy, because for them -- and that is the one point I agree with -- identification is something that we cannot let Google and Apple do.  I mean, there is a three-fold thing here.  Of course, an identity is a human right and part of our dignity.  Nobody can take this away from us.  Identification is the proof of a fact, and I would much rather have governments issue that than a private company.  But of course, the legal framework and the fundamental rights protections in these systems are hugely important.

     And just to touch on one important point, there is a lot to be said about privacy by design, and how you can use the system without being tracked, so that it is unobservable and no one actor can see everything about our lives.  Every type of data, no matter if it is financial or health or criminal records, we want these things to be separated.  One key could otherwise allow a panoptic view, from every restaurant visit to every government interaction.  But a very important pillar for our work in this field is also non-discrimination protection.

     Early on we stated that no matter how good the system will be, there will be people that choose not to use it or might not be able to use it.  Senior citizens are one big group; undocumented migrants are another huge group.  And if we make this a precondition for access to government services, the labor market, and other goods and services, we will create huge exclusions.

     And on Tuesday next week the Member States will adopt their version of the law, and it will not have the safeguards or discrimination protections.  The European Parliament is still negotiating, and there the chances are better.  In all four committees, across the political spectrum, we have seen the nondiscrimination provision being tabled and supported.  From communist Members of the European Parliament up to conservatives and even right extremists, everybody is in support of a nondiscrimination provision.  We have yet to see if they can get it adopted in the final law.  That tells you something about the double standards. 

     For the elected representatives that seems to be a precondition for even starting a project like this.

     >> MODERATOR: That's very interesting, hearing about a nondiscrimination provision in a law like this.  It would be useful in other contexts to ensure that people are included in public and social services. 

     Elizabeth, I would like to come to you.  How do you think that law and public policy can come in to talk about alternatives and how can we find the alternatives within the systems?

     >> ELIZABETH ATORI: That's a tricky question, especially in my context, where most times law and policy is not even implemented, or, where it is, there are real gaps around the laws.

     But from the start, the digital ID system in Uganda was referred to as a national security project and seen as another avenue for the government to control the population and identify illegal residents.  And even now, if you look at the way they are implementing systems across the board, access to social services runs through the same system, which is majorly for Ugandan citizens, but the systems are set up to exclude other populations that live within the territory, so that is also questionable. 

     So that speaks to how we utilize the information we collect within digital ID systems, and what amount of data is necessary in a particular system.  Next year we shall start the national rollout of the digital ID system and go into another mass enrollment, where essentially my ID, my national card, expires this year. 

     If I'm not re-registered, then essentially I would not be able to access any services.  This particular issue has not changed, and I doubt it will change because of the advantages that it has for the government.  We have seen the system linked to other particular processes and exercises in which they enroll mass data for particular users -- whether it is for elections, or for people to register financial accounts with a bank, and to use that for surveillance in terms of, okay, how much money is coming into your bank, where is it going.  And yes, that speaks to security, which we need, but we need to ask ourselves about the legitimate purpose of these laws and policies they have put in place.  Because in the context of Uganda, we do have a law that makes ID mandatory for all Ugandans and also speaks to the consequences of not having that ID, which do include losing access to social services, as I had earlier mentioned.

     So these laws and policies in my case would need to be revised to carry out a human rights risk assessment of our ID system.  No one is doing that.  The government and the people that are running the systems are not as interested in the human rights impact of these systems as they should be.

     So then the digital ID system does not become an instrument for inclusive development, but an instrument that hinders the development that we need.  If we look to achieve the SDGs and all of the global goals we have set targets for, it is unlikely, given the systems and given the laws and policies that have been put in place, that we shall be able to achieve that.  And because of this, my organization, which is the Initiative for Social and Economic Rights, and other partner organizations brought the matter to the courts to litigate, knowing well that the foundations of the exclusionary effects of digital ID in my country are based in the law itself and in the policies that have been passed.  So we need to shift from a place where the digital ID is majorly for security and move, as you rightly said, to a place where it is for inclusive development. 

     The way we get there will need us to revise our laws.  It will need us to undertake an assessment of the impact of the system so far.  Now that we are adopting a new digital ID next year, what have we done to assess how far we have gone in the past years?  That is something that hasn't yet been done in my country and that is unlikely to be done. 

     So what we have then is a country that is ready to move on to another phase but that hasn't sat down to think through what challenges we experienced in the first phase and how we remedy those challenges.  And that right there speaks to the political will.

     Because as we know, digital ID systems in themselves don't work in a vacuum.  They work within systems of government.  So if the government, the actors responsible and the ones behind the system, are not willing to ensure that the system is as inclusive as it should be, or to provide alternatives for those that are unable to use it or are left out, then we will not be able to move to a place where the policies and the laws serve not majorly security causes but human rights causes. 

     I also believe the other thing that would push us in that direction is involvement.  The direct beneficiaries of a digital ID system as a community are the end users, and we need to find a way to offer them platforms in which they interface with the providers of the system, and to offer a remedial mechanism that works, that is able to provide effective remedy that is timely. 

     Because as we know, for issues that affect social and economic rights -- health, education -- these are ongoing violations.  So the more we sit on it, the worse the violation gets, and the more people are affected.

     So it is really an issue that needs to be inclusive.  It involves the actors that are behind the system, for them to have the political will to change the system and make sure it is inclusive.  It involves the users of the system getting involved, participating, having a voice to state where the changes need to be.  And the platform needs to ensure that whatever is provided is actually able to be implemented.  That goodwill to move towards a more inclusive system is actually what we need in this case.

     >> MODERATOR: I'm glad you mentioned that because that gives me the opportunity to segue into what I wanted to talk about. 

     When thinking about accountability, of course, we have to think of who is involved, who stands to benefit.  Apart from the obvious states pushing for the IDs, who else is involved in pushing for these structures and these technologies that we don't know about?  What do they stand to gain? 

     These are some of the things that we need to ask ourselves.  And I would like to come to Thomas to get your thoughts: who is to be held accountable, and what do they stand to gain?

     >> THOMAS LOHNINGER: There is a multitude of interest groups involved.  Of course, it is the government that usually launched the systems.

     It is the vendors that build them.  Not just vendors but existing infrastructure, existing platforms.  Telecom operators have projects like these, such as registrations, and already hold the identity data of their own customers.

     And then there is also the financial sector, for example the field of credit scoring, where we also have huge networks of identity providers whose main job is not giving a score about your creditworthiness but proving that you exist as a person.  This is called positive data, which is a highly questionable practice under the GDPR.  And ultimately there are also international networks like the World Bank.  But I think Access Now knows far better about that particular actor.

     And one ally that we have in this debate, that we also relied on in our work, was academics.  There is a very stable and emerging field on solutions to get privacy by design into these systems, and some of the academic work makes it into ISO or W3C standards.

     And there is also the so-called self-sovereign identity movement, which does provide a few of the solutions on a technological level, but there is also an unhealthy relationship of the SSI community with blockchain technology and venture capital.

     So that means that those allies probably have to be taken with a grain of salt.  I mean, the EU fancies itself as a global rule maker, so whatever their system might become -- and I am in no way confident that it will be a system that we could build upon and be proud of as Europeans, one that also respects the values that we think we stand for -- we will know by the middle of next year, hopefully, when the legislative process is concluded.

     But there is no one size fits all.  A common technology stack that adheres to many of the principles that we just heard, also in terms of allowing for inclusive and participatory rulemaking, can be very helpful on a global scale.  But ultimately, without the right regulatory framework -- controlling who can be the relying party that verifies something about you, what the rights are for me as a natural person not to use the system and still be a citizen and part of the workforce in the country, as an employed person -- these things can never be solved on a global level.  These are always specific.

     >> MODERATOR: Good points about specific problem solving, not a one size fits all.  Laura, Thomas has put you in a little bit of trouble here by mentioning academia.  I would like to get your thoughts on this as well.

     >> LAURA BINGHAM: I don't particularly yet consider myself an academic.  Maybe I will settle into it, but I will do my best.

     I mean, listening to everyone's remarks up until now, I was thinking a little bit about the longer term trajectory and the involvement of academia and ultimately the private sector, going all the way back to collecting information about the general electorate in the 1940s, 1950s and 1960s in the United States. 

     This was an enterprise incubated by the U.S. government, deployed and tested in the Vietnam War: how much information can you gather about a population, and how can you use that to manipulate and influence social and political structure?

     That enterprise then moved into the private sector.  And now here we are talking about how Google and Facebook shouldn't be the ones to verify identity, that it needs to move back into the government, where the idea was originally instigated.  In some cases, the exporting of some of these state projects to academia, think tanks and the private sector has not turned out well, or has resulted in some of the huge structural problems we are talking about here.

     But at the same time, look at access for researchers who can really unpick how these systems are operating in practice, some of the issues around exclusion by design, and how biometric matching actually occurs.

     These things are protected by business interests.  How the algorithms that encourage virality work, and how to enable research on platforms for academics who really want to help solve the grand social problems of our day, is a huge question.  I do think there is a role to play there, but that is part of a bigger conversation about how you define the power that is moving back and forth between private actors and the state.  What is that power?  Where does it sit? 

     And ultimately it is a question of the rule of law and a sovereign people exercising power over those who are charged with organizing government.  So we need to think about checks and balances across these spheres in a new way.

     And it can't just be a hot potato that keeps bouncing back and forth between the private sector and the state, with one or the other as the bad guy.  I think that is a fundamental question for governance that digital ID systems pose.  And we are kind of touching different parts of the elephant here.

     But I think an actual radical revolutionary conversation about how do you put things in check really needs to be had.

     >> MODERATOR: Thanks, Laura.  At this point I would like to throw to the room.  And we will take three or four interventions, questions. 

     As you think about this, I can see a lot of faces in the room that are, you know, in advocacy and policy and I would like you to think about what has worked for you, what hasn't worked.  What do we need to stop doing?  Where do we need to focus our energies? 

     Please.  Right there. 

     >> AUDIENCE: Hello, everybody.  Interesting discussion.  And great to have the variety of perspectives and also the European perspective which is interesting. 

     My name is Jose and I have been doing a little research on this.  One interesting thing that we could think about is what could be a vision, or a strategic objective, of civil society and human rights oriented researchers, et cetera.

     And I think it is clear that we perhaps need to develop a positively worded or constructive objective that can be a pushback against this push for mass digitalization, which as we know is driven by businesses, by interested governments, et cetera, no? 

     So I think the broader strategic objective could be something like moving from mass digitalization to protection and inclusion, or inclusive development, no, as the speaker has said.

     So I think that's an interesting thing to think about.  Because one problem that I think we have is that when we talk about the World Bank and governments, et cetera, sometimes the human rights critique can seem obstructive, like we are against technology or against progress or whatever.  And it is not true. 

     What we want is to build systems that are inclusive, that protect people, that do not leave anybody behind, no, in UN language.  So that is my point. 

     And another point that is interesting to think about, from an international cooperation perspective: there has been no sensitivity to fragile regimes.  You know, Laura mentioned Afghanistan.

     But we also have Myanmar, where the Austrian government and the World Bank at some point were thinking about how to promote digital ID in such a place.  And we also have other countries -- Sudan is another recent one -- so international development assistance on digital IDs in fragile regimes is, I think, something we also need to think about.  Thank you.

     >> MODERATOR: Thank you very much.  And you said something that reminded me of something Mercy mentioned: a lot of the time we are working from a point of resignation, where we've said to ourselves, okay, the systems are here, what do we do about them.  I would like us to move away from that and start asking why we need these systems again.  Because in the beginning we asked ourselves that question, and we have a whole campaign around it -- the campaign led from Access Now called WhyID -- because we need to know in the first place, before we even start litigating on how perfect or imperfect these systems are, why we need them in the first place.

     >> AUDIENCE: Thank you.  And yes, to build on that, and to follow up also on what Jose was saying, I think that there has been a collapse of concepts, where identity and legal identity have been basically conflated with digital identity, and that has become the goal and the target. 

     Digital ID should never be the goal or the target.  It is a tool.  Because of that conflation, the work that we do as civil society is framed as negative, as if we are always basically complaining about everything and we don't like anything.  We don't like identity, apparently, at all.  We don't even like the right to identity itself. 

     So this needs reframing: identity can be accomplished through a variety of tools, right?  And if we manage to separate that, we have there a positive narrative.  We are pro identity.  We are only asking: is this the way to accomplish that?  Is the digitalization of identity the way to accomplish that?  And if so, is this specific implementation, which is different in every country, the right way to accomplish it?  And I think that reframing this narrative is urgent.

     Because again, I find it extreme that we are being painted as the bad guys in this position, like we are against a very basic human right, which doesn't make any sense.  So that separation of concepts, I think, is an important place to begin.  Thank you.

     >> AUDIENCE: Thank you.  I would like to offer a few comments regarding laws and policy frameworks, and thinking about disability. 

     Women, youth and children.  When we are considering this, we have to encourage female scientists in Africa as well as in the world.  How many scientists in the world are female?  Everywhere in the world, men dominate science; for women, the reverse is true. 

     The second one is entrepreneurship capacity, in terms of how to think about economic activity.  We have to think about our children, including children with disabilities, and build their productive capacity.  Another is management and leadership.  Take Ethiopia into consideration: 35% of all leaders are women, and the president of Ethiopia is a woman.  That should help to encourage young children. 

     And on disability: a person with a disability should also be able to become Prime Minister or President, but today they can't.  So we have to include persons with disabilities and elders. 

     Another thing is how to provide capacity building for persons with disabilities, women, youth and children.  Without children's ethics, how does the world become positive?  We have to think, in terms of laws, codes and regulations, about the ethical consideration of children in this world. 

     And then finally, think about digitalization and indigenous knowledge, regarding innovation and grants -- maybe it is a skill, maybe new knowledge.  So we have to think about all of this in terms of regulations, laws and policy, and think more about disability, more about women, more about children.  We have to think about this.  Thank you.

     >> MODERATOR: Thank you very much.  Yes, inclusivity is definitely something we need to think about building around accountability and digital ID systems.  We will come to you and then to you.

     >> AUDIENCE: Thank you.  I'm Juan from Bogota, Colombia.  I was trying to think about whether we should change strategies for holding governments accountable, and I have a rather specific question.  Not for anybody in particular, but for the whole table. 

     We are currently seeing the justification being either security or efficiency in most cases.

     But across our research we have not been able to see much data on that.  There is a lack of evidence to prove that those systems are more efficient.

     And I understand that it is a double-edged sword to ask for this kind of evidence to hold those powers accountable, because if you ask for this, you are asking them to gather the evidence in terms of security, which is the thing I'm sure we don't want to do.  But I think this angle of attack on the problem may give another sort of solution.

     Which is aimed at starting a conflict between the providers and the demand side.  And thinking of that -- I don't know how to express this -- the difference between the interests of those groups, and expanding on that, might be something that could challenge the systems as a whole, I think.  I don't know.

     But perhaps it was not as specific as I thought.  But if you know of evidence for that, that would be great, you know.

     >> MODERATOR: Okay.  So we'll get one last intervention and then come back to the panelists for closing remarks and answers to some of the questions around the room.

     >> AUDIENCE: Thank you.  My name is (?), I'm from Ethiopia from the judiciary and vice president for federal (?). 

     And it is good that the panelists have given insightful presentations on digital ID infrastructures, and I got a lot of understanding from your presentations.  Thank you for that.  As I come from the judiciary, I want to raise a legal question.

     The first one: is it possible to make digital ID registration mandatory in order to get different services?

     Because I think constitutionally, and also in the international conventions, the right to privacy has to be duly respected, one.  Therefore, if we make it a condition precedent for persons to register in order to get different social services, then there are going to be cases -- I think there are already cases in Kenya and also in Uganda -- where people said: we have been excluded and don't get the services, and therefore it is against our privacy.  And therefore I think digital ID laws and privacy laws should go in line with international human rights standards.

     If we don't, the issue will come to the courts and may finally create its own problems.  Therefore, are we going to make it mandatory or make it voluntary?  If we make it mandatory, then it may go against the right to privacy.  If we leave it voluntary, then the number of people who will register is low, and the benefits that we want to get from digital ID registration might be low. 

     Therefore, how are we going to balance mandatory and voluntary in order to tap the benefit of the digital ID? 

     The second one is on getting the public to register voluntarily.  I think sensitization and capacity building are very critical.  Even if you make it mandatory, the people will not come.  They need to know what the benefit of registering for a digital ID is.  I think that is also a very important thing. 

     The third thing I want to add is that interoperability is also very important.  If we add different data, again the right to privacy may be diminished, and therefore the data that we collect should be minimal -- for the sake of giving services, for the sake of distinguishing one person from another to minimize fraud and other things.

     Therefore, interoperability of data in the registrations is, I think, also very critical.  As for my country, Ethiopia, we are entering the digital age and the government has enacted Digital Ethiopia 2025, a good comprehensive strategy.  And there are also different laws in the pipeline that the parliament should enact, like the digital ID program and privacy protection and data protection laws.  Therefore, if we make these laws and policies work, then I think Ethiopia can also join the digital world.  Thank you.

     >> MODERATOR: Thank you very much.  We might not be able to take the question but feel free to follow up after the panel.  I just want to say a couple of things before I throw back to the panelists. 

     A good number of things have been mentioned in the interventions from the room.  And I know, for instance, we just mentioned interoperability.  But in the same sense, since we are talking about accountability, if we have that, we need some form of transparency and some form of openness: what's being done and who has access. 

     We often don't get that, given the general national security argument, so that is something we need to think about, and I'm sure some of the panelists will mention that in response. 

     I would like to come back to the panelists and have everyone go around for two or three minutes and mention, perhaps in response to the questions as well as in closing, from the conversation that we have had today and from the work that you have done: what does accountability mean to you? 

     What would make the most sense, and what would it mean for you to come to the point where you say: now we have accountability, or now we are heading towards accountability?  Because as Mercy rightly mentioned, it keeps shifting. 

     At what point are we heading towards accountability?  I will start with Yussuf and then Mercy online and then we will go around.

     >> YUSSUF BASHIR: I think that is a beautiful question and I think it's linked with the practical question that was asked about what should civil society do, what is our strategy.

     I think that would immediately take you to analyzing the key actors: who is doing what, and what are their issues.  I think the panel has tried to analyze that very well, in terms of saying there are governments, and then there are corporations, and it is these two actors that are sort of, as Laura put it, playing a game of ping-pong.

     So the question is what should civil society more broadly, the people and all of the actors, what should they do.

     Corporations are private entities that are profit motivated.  There is a lot of conversation about business and human rights in terms of holding them accountable, but it is an area that is still developing and hasn't yet got there in terms of holding corporations accountable. 

     I think there is a lot of work that needs to be done around that, and other conversations to be had.  The question then evolves into issues of power, influence, politics and governments.

     Governments are able, through taxation, through regulation, to hold these corporations accountable, and they themselves are not a monolith; governments are influenced in many ways.  As a lawyer, they say if the only tool you have is a hammer, everything begins to look like a nail.  The panel consists of quite a lot of lawyers, and you find litigation everywhere -- that is why I was talking about the judiciary.  A lot of these things are heading to the courts because that is the tool that is common for holding governments, and by extension the corporations that are pushing the agenda forward, to account.

     But I think we need to -- sorry to say this -- step back and think about advocacy more broadly, in terms of how we influence the parliamentarians, ministers and executives when these considerations are being had, and do a lot of power analysis.  I like the question: a lot of the push is about efficiency or security, but nobody has really unpacked that.  It is just a repetition of the claim.

     My feeling is that the real motivation in the end will be about money, because that's what a lot of things are always about.  And that is what we really need to analyze properly: do a lot of research in academia and campaign around it.  I think a lot of success can be had if we push as civil society to ensure that people are held to account, so that if we have to roll out digital IDs, then they are rolled out in a manner that respects fundamental rights and freedoms.

     >> MODERATOR: Thank you for that.  Mercy, are you still with us?

     >> MERCY SUMBI: Yes, I am.  So I think the question that resonated with me is: what is the goal?  What are we trying to achieve?  I ask myself the same thing, and honestly, all of this conversation centers around putting the person in the middle of the conversation.

     Up to this point, our approach to digital ID advocacy and litigation has been about the data and about the government's services.  But now we need to ask: what is the focus?  The focus is what Yussuf has been talking about: the effect is discrimination.  The effect is that people who have already been excluded are going to continue being excluded.  Put it in the hands of an authoritarian government, and it looks like genocide and torture and abuse of rights when it comes to human rights activists. 

     If we center the conversation so that at all times our eyes are on what is happening to the person, not the data -- because privacy conversations tend to separate the human being from the data, as if you can look at them separately, and you can't -- then the one plus of that kind of approach is that when it comes to accountability, your bottom line is: what is the human rights effect of what is happening?

     It is no longer about standards; it is no longer about what the act provides, because the act could be wrong, the law could be the problem.  Even the minimums we are talking about here could be bad minimums. 

     But if you step back and hold human rights next to the system, then that is the bottom line.  You do a human rights audit on the entire system.  Are there people, for example, who are being disadvantaged, and people who are being targeted -- human rights activists, journalists, for example?  If that is one of the likely results, all of those are human rights issues.  It just hit me that a lot of the time, when governments say government services, they don't mean free medical care.  They mean your ability to register a company on the platform, or your ability to get a passport.  Those are secondary rights. 

     They may be some form of human rights, but they are not your main human right, which is your right to dignity.  So that, to me, is the beginning and the end of the conversation: let's center the person and then judge the system by what it does to the people in the country.

     I do like the suggestion about changing the strategy.  Listen, it is like a computer.  If this is the first time someone is introduced to a computer, they will be so impressed.  You touch this button, the thing comes alive, and there are all of the icons.

     So the government's story is easy to tell: we can come up with a database, it will make identification easy, and that can be demonstrated to everybody.  Our case is much harder because it is based on a lot of hypotheses. 

     So this is where investment in research becomes very, very necessary.  We need to match their demonstration with our own scientifically backed demonstrations, so that when they say the system is good because it produces this result, we can use the exact same machine and say: have you met the dark web, the dark side of that exact same machine? 

     That is how we will be able to get through to the judges as well.  Because right now it looks like one side has an expert painting a good story, and the other has, at best, a worst-case scenario that could happen in a million years.  We know those harms to be real, but we don't have scientific evidence of them. 

     Something else I wanted to zero in on is the whole issue of digital ID: how come, all of a sudden, it is being confused with legal identity?  That is what happened, and it is the natural result of focusing on the data and forgetting about the person. 

     If you are looking wholly at "we need the system," what you end up with are mandatory requirements: to access this, you have to have a digital ID.  That cannot be allowed to happen.

     And how we fix it is by changing the strategy and also changing the nature of the conversation.  First, stop accepting defeat, the idea that we have to have a system like that and that this is why we are having it.  At the same time, ask: what are the bare minimums as a community?  We want respect for human rights, and then we judge any system that comes up from that basis.  Thank you.

     >> MODERATOR: Thank you very much.  We have about three minutes so I'm going to ask that you be very brief in your closing remarks.

     Laura and then Elizabeth and then Thomas.

     >> LAURA BINGHAM: Quickly and picking up on putting the human at the center. 

     I was having thoughts along similar lines and just taking a step back, you know, on that point of conflation between legal identity and digital ID. 

     I think what we also missed when that conflation took hold is maybe two things.  The conflation happened in global spaces where, because you are working within an international relations and international law framework, the underlying assumption is that all states in the international system are fundamentally equal, which I think we all know is a drastic fiction. 

     But that is how the conflation was easy to have happen, and most of the actors that are feeding into it, and feeding off of it, are moving in that space.  And that meant that the legal subjecthood of the people we have been speaking about today was assumed away, and the full panoply of human rights was drastically reduced. 

     I wanted to react to the colleague from the judiciary; thank you very much for putting those questions.  I would love to talk with you more after the session.  From the human rights framework, there are a couple of things from my research, including from the case in Uganda around economic, social and cultural rights, that I thought I could point to, and also from disability rights, which the other colleague brought up. 

     So the International Covenant on Economic, Social and Cultural Rights has a concept of non-retrogression, which is a really under-theorized, underutilized, and not well understood concept.  But it basically means that the states that have signed up to that covenant, which are many, can't go backwards past a certain point.  It is like Mercy was talking about: irreducible minimums.  And a framework of accountability that is real should mean there is a new playing field for the obligations of states that are going to think about using technology, where we really have to define that floor, and define it from the perspective of what progressive realization means now, because maybe it could mean something different if you want to think positively about the new tools, and they really are only tools. 

     And the other point is from disability rights.  For those of us working on nationality law, citizenship stripping, and the denial of analog identity, the disability rights convention, and the whole framework of antidiscrimination around disability rights, is really powerful and incredible.

     And there is a specific article on the right to identity and the right to citizenship in Article 18 of that convention.  And the standpoint of disability rights in general is a structural one from the get-go: look at the way entire systems and the functioning of society work, and how they don't work for a whole diverse subset of the population.

     So that kind of approach to thinking about rights and equality when it comes to digital ID should be the fundamental starting point.  I think it goes beyond privacy, but there are some really promising protections in some privacy cases, in the mechanics of how a fundamental right should be analyzed by judicial actors and therefore understood by other actors in the government system. 

     And that has to do with the people who are affected having a minimal burden, a bare minimal burden, to show that a right is engaged, and then shifting the obligation, the serious obligation, onto the state in this case, or also onto corporate actors, to come back and show with substantial evidence that there are no less restrictive alternatives to the path that has been chosen. 

     And that has to come with some serious disclosures.  And if the actors understand that that is the threshold of accountability, then it actually changes behavior outside of the room as well.

     >> MODERATOR: A common thread I'm hearing is putting the human being at the center of all of this and making sure that they carry the least burden, shifting it to the people who are actually running these programs.

     I'm very much on board with concentrating on the human being; that is what human rights is, the rights of the human.  We need to shift the focus back.  Thomas.

     >> THOMAS LOHNINGER: You asked the biggest question at the end.

     I think we can go back to this: accountability needs to be measured against the goal of the systems.  And someone asked exactly the right question: are these claims actually true?  That frame helped us a lot in the debate today.  Do the systems make us more secure?  Or isn't the purpose of the security camera social control?  Isn't it the sculpting of society, of where people go to protest, and so forth? 

     In a way, we ended up with the opposite of the goal: we were dreaming of the machine-readable state, and now we have the machine-readable citizen. 

     And the second dimension of accountability is the technology that we use.  Privacy by design has to mean there is interoperability, but we also have to have the need-to-know principle: the user has to be in control, has to see who accesses their data, and the system has to prevent observability of the data.  My university shouldn't see where I use my diploma or which potential employers I'm applying to. 

     Nor should the government see where I prove my age to enter an establishment or to buy something.  Unlinkability and unobservability are solvable problems; in that sense, the technology is the easiest dimension to solve.  And the third dimension really is the effect on the citizens, on the population: their human rights, impact assessments, and actually trying to dig deep into what this will mean particularly for the marginalized groups in our society. 

     I think that is always the obligation of digital rights: we have to think things through to the end.  And when there are horizontal platforms like these, that is even more important.

     >> ELIZABETH ATORI: I'm going to make mine really short.

     I believe the digital ID doesn't operate in a vacuum.  It is supposed to work within established systems, and that includes birth registration and making sure that we build infrastructure that supports the digital ID system. 

     To ensure that everyone is included, that is an approach we need to adopt, instead of just taking the system wholesale, putting it there, and expecting it to work. 

     So in terms of that, we need to do more.  Specifically, even within our systems, we need to improve access to information for the user: making sure the people included in the system are aware of their rights, aware of which platforms have been made available for them to air whatever views they need to air, and making sure that we build a system which actually reflects our vision, our goal, of ensuring a more inclusive society for all. 

     And if we are doing that, then we are able to build trust in the system, and we would never have a situation where people refuse to go and register.  Because if we have systems in place, if we have enrollment centers close to the people, if, with a population that is largely illiterate, we are able to break down the system for them so they can understand it more and more, you find that you are building trust in the system.  And the real reason we're having this discussion is that there is a little bit of distrust that comes as the price of security and, more expansively, the exclusion that comes with it. 

     But really, the bedrock of the discussion is the mistrust that the people have in the digital ID system.  And if we want the system to work, we need to tackle that mistrust.  That includes making sure we include everyone in the design, implementation, and operation, and thinking through and strategically asking ourselves the question, as we have asked here: is this the only system we maintain, or do we allow room for other forms of identification to help people access the services that they need? 

     And that is a question we have shied away from.  Even in forums, when it is brought up, it is not a question people are willing to ask, because it seems that the question would in itself bring down the very system which they are developing. 

     But more and more, we have seen that it is a question that needs to be answered.  Do we allow alternatives?  Why are we making it mandatory?  How, then, do we ensure that everyone is included?  These are important questions that expand beyond civil society, that go to the core of government services, that go to the core of the multinational support being put into these systems. 

     So I believe it is something we need to sit down and discuss objectively and find a remedy for.  Otherwise, we will go around in circles, and five years down the road we will be back at the table, highlighting the exclusions but not really providing a remedy for the exclusions that are happening.  That is my take on this: how do we go forward.

     >> MODERATOR: Thank you all so much.  This session has been useful for me, and I hope that it has been useful for all of you in the room.  Thank you for being here and sticking with us to the end.  I know we took a little more time than we intended to. 

     I appreciate you coming here.  I know we are coming to the end of the IGF, and I hope you had insightful sessions before this one and maybe you might attend one or two more.  Again, thank you for your insights and being here with us. Have a lovely day.