IGF 2023 – Day 0 – Launch / Award Event #150 SAFEGUARDING PROCESSING OF SOGI DATA IN KENYA – RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> JEREMY OUMA:  Good morning, everyone.  My name is Jeremy Ouma, and I work for ARTICLE 19 Eastern Africa.

     I'll probably allow my colleague to introduce herself first, then we can get into it.

     Also, another colleague of mine is on the way.  I think he will introduce himself as soon as he gets here.

     Thank you.  Over to you.

     >> ANGELA:  Good morning, everyone.  My name is Angela Mayalo.  I work at an ICT policy think tank based in Nairobi, and I am interested in this topic because data unlocks a lot, but it can also be an area where, if not well regulated, it can lead to further human rights violations.  So, I'm grateful for this panel, and I await questions from Jeremy.

     >> JEREMY OUMA:  Okay.  Thank you, Angela.

     So, I'll probably start with a brief overview of what this session is about.

     In essence, we have done some research, basically a brief by ARTICLE 19 Eastern Africa.  As I've said, I work with ARTICLE 19 in Africa, and we work on freedom of expression, association, and assembly, in both online and offline spaces.  So, specifically for this session, we have this brief, produced as part of the digital voice project supported by the GIZ Digital Transformation Centre in Kenya.

     And the paper basically provides an overview of the processing of sexual orientation and gender identity (SOGI) data, specific to the Kenyan context.

     We will also focus mostly on the privacy matters that LGBTQ people face, as compared with cisgender populations, in Kenya.

     So, what we hope to do with this paper is to increase awareness of data protection among different stakeholders in Kenya, be it regulators, activists, or the general population.

     And by raising this awareness, people get to understand their rights, whoever is part of the community.  That's one.

     Two, people also get to understand the obligations they have as data controllers and data processors, and to adhere to these data protection laws, building the trust needed in the digital economy and promoting freedom of expression and access to information.

     So, I'll briefly go into some of the findings that we had before we get to some questions with my colleague here.

     Finding one is the insufficient legal protection for LGBT people, especially in the Kenyan context.  In Kenya, it is not identifying as LGBT that is outlawed; it is the practice, the act itself, that is outlawed.  This has negative impacts on the disclosure and collection of this kind of information, sexual orientation or gender identity data, be it at hospitals or in banks, and the legal framework doesn't provide protection from this discrimination.

     There is a blanket protection that everyone is equal under the law, but there is no specific protection against discrimination on the basis of sexual orientation or gender identity.

     That's one.

     Plus, the socio-cultural context continues to have a great impact on the treatment of sexual and gender minorities in the country.

     We have this thing that keeps getting thrown around, that it's not our culture.  That's thrown around by Government, by leaders, so it's not a very good environment.

     I think the second finding is about the differentiated treatment of SOGI data under the legal framework currently in place.

     So, sexual orientation is classified as sensitive personal data, but gender identity is classified and processed as ordinary personal data.  One is processed as sensitive personal data; the other is general personal data.

     And Section 2 of the Data Protection Act covers most things under data protection in the country.

     So, this means that data controllers and processors handling this kind of data must differentiate, applying higher levels of protection to sexual orientation data, despite gender identity data also exposing data subjects to similar risks and consequences.

     That is finding 2.

     No. 3 is the restriction of SOGI data collection across both the public and private sectors.

     In the country, if you go, for example, to a bank or to a hospital where they need to collect data, the categories are mostly male/female.  Sometimes you will see intersex.  Sometimes they will just put “other.”  So, this is basically attributable to the failure of the law to recognize other gender identities and sexual orientations.

     So, this paper was guided by contributions from key stakeholders.  We did some key informant interviews, and we had some focus group discussions with key industry players, specific to Kenya.

     So, I think I'll leave it at those key findings, though we also have some other findings, and go to some of the conclusions we drew and then some recommendations.  After that, we can speak to my colleague and have a brief overview of the current situation.

     So, I'll go straight to some of the recommendations.  I've divided them into two groups.

     So, the first group is recommendations for data controllers and processors, and the second is for, let's call it, Civil Society.

     So, recommendation one for data controllers and processors is basically around compliance.  All public and private organizations and individuals processing personal data are required to register with, let's call it the regulator, though we call it the ODPC, the Office of the Data Protection Commissioner, as stipulated under the Data Protection Act.  So, we encourage that for anyone that is processing data, be it Government, a hospital, or a business that needs to process data.  Recommendation two is to implement technical and organizational measures for compliant processing of this kind of sensitive data, including doing a data protection impact assessment prior to processing this kind of data, and also engaging these communities.

     Next, appoint a data protection officer to oversee compliance with the laws, that is, the Data Protection Act and the relevant privacy and data protection laws.

     And for entities in the public and private sectors to prepare and update data protection policies and notices so that they're up to date with the needs of the community.

     Finally, awareness, and that is internal awareness: for these entities to have a privacy-aware culture, so that you're aware of what you need to do when processing this kind of data, and how to handle it in the right way.

     I think I'll go to the very last group.

     This is for Civil Society actors.  One is to build internal capacity and undertake training to understand the frameworks of data protection and the impact they have on the communities we work with.  Two, once you have this knowledge, engage public and Private Sector entities to create awareness, so that they understand the impact this processing has and what their obligations are under the laws.

     And third is to advocate for the Data Protection Commissioner to expressly include gender identity as a form of sensitive personal data.  In light of the risk of significant harm that processing may cause to data subjects, it's important to have this captured and acknowledged by the Data Protection Commissioner, with frameworks to protect this kind of data.

     And finally, to advocate for better laws, removing the discriminatory ones, whether by repealing them or amending them to be in line with international standards.  So, yes, those are the key findings and some of the recommendations we have in this brief.

     I think I'll pause there.  If anyone has any questions up to that point, I'll be happy to take them before we go to my colleague.  Then we can have a discussion about her experience in the country.

     >> AUDIENCE:  [off mic].  So, from your presentation, it seems that there is a kind of embedded contradiction, which is that, on the one hand, diverse gender identities are not officially recognized, right?

     >> JEREMY OUMA:  Yes. 

     >> AUDIENCE:  But at the same time, there is that risk of, like, somehow, you know, banks and medical facilities collecting this data.  That is the kind of focus of data protection.  But in parallel, is there a movement or a drive towards having these identities uniformly, you know, recognized and instituted?  Is there something happening in Kenya?

     >> JEREMY OUMA:  Thank you.  I think that's the only question.

     I think yes, there is a drive to do that.  For the past couple of years there's been a drive to repeal sections of the Penal Code, I think it's two sections, 162 and 163, that criminalize the act, not necessarily the person.

     In Kenya, identifying as diverse is not criminalized; it's the act that is criminalized.  But most times the law has been used to deny registration to organizations, though there's been some good precedent.  I think there's been a long court case going on with the regulator, is it called the NGO Board?

     >> ANGELA:  There was a petition to the constitutional court to declare section 162 of the Penal Code unconstitutional on the ground of discrimination.  That petition was not successful, and the section is still operational in our country.  So, Kenya's policy around it is to act like they don't exist.  Whenever they are asked about it, the Government only says that that is not a priority for Kenya, that we're a third world country, and that we have more development issues to be concerned about.

     So, what has hurt the LGBT community in Kenya is our Penal Code: it is colonial law that we inherited from the British governance systems.  But the Constitution is actually from 2010, very new in terms of constitutional law practice, very, very new in constitution making.  And our Constitution provides the right to non-discrimination very strongly in the Bill of Rights, which enumerates human rights.

     So, when LGBTQ people were denied registration, they went to court, and the case went up to the highest court, which is the Supreme Court.  The courts ruled that whatever the Penal Code and section 162 say about the act, they have the constitutional rights to assembly and association and all that, and therefore when the NGO board refuses to register them, it has contravened the Constitution.  So, it is the robustness of Kenya's Constitution, the right to privacy under Article 31, and now the Data Protection Act, that give you this very progressive human rights outlook, but we still have this elephant in the room, which is section 162, and you can see the contradictions.

     I hope that gives you an idea of what we're working with.  Thank you.

     >> JEREMY OUMA:  And to also mention that there is a lot of pushback from Government.  For example, in that petition, the registrar tried to say that you're not supposed to do this.  So, there is a lot of pushback from Government; they're not ready to have these conversations.  I hope that answers you.

     Thank you.

     Then I'll go straight to a couple of questions for my panelist here.  The other panelist will arrive, I'm not sure when, but he should be on the way.

     So, we can basically start by looking at the legal framework for the processing of data.  Not necessarily just sexual orientation and gender identity data, but in general: what is the framework defining this processing of data?

     >> ANGELA:  As I stated earlier, it starts in our Constitution: the right to privacy.  Then we have international human rights commitments, some of which you know: the International Covenant on Civil and Political Rights, the African Charter on Human and Peoples' Rights.  So, these are the bases and the frameworks for the right to privacy.  And then in 2019, we operationalized the Data Protection Act.

     Like many other African countries, it is modelled on the EU GDPR, and that comes with pros and cons.  One of the pros is that the GDPR provided a very good framework, with the ODPC as an independent office.  I will put independent in quotes, because you can say it's independent, but who is doing the appointing?  So, I won't say independent; I will just say it has a body, and the independence is questionable.  So, we have the Data Protection Act, and it's a very elaborate framework, and I'm not going to focus so much on the downsides, other than to just say that, just like the EU legislation, we expect that when data is being transferred from Kenya to other countries, there are adequate safeguards to provide equal or similar protection for the data being handled in the third countries, or the recipient countries.

     I hope that answers it.  Maybe we can go to sexual orientation and gender identity.

     So, I work at (?) as a gender digital rights programme officer, and last year we also had conversations around gender and data protection, and this conversation cannot be had without talking about sexual orientation and gender identity data.

     Just a fun fact before I delve into the Kenyan ecosystem: there is a very progressive approach to data protection from the Southern African region.  I don't know if you knew this.  What are they called, the regional bloc in Southern Africa, SADC, came up with a model framework for privacy law for the member countries that belong to it, and they put gender as one of the categories of personal or sensitive data.  So, while Kenya just talks of sex, they talk about gender.  And there is a difference.  When you just talk about sex, you're referring to biological sex, and this is what is assigned at birth, so male or female or intersex.  But when you're talking about gender, you're talking about someone's expression, and that might sometimes not align with their biological sex.  So, that is very progressive.  I always try to talk about the SADC model, because it took a different approach from the GDPR.  That is something we want to see more of: regional blocs and countries taking their own approach to data protection in the way that makes sense for them, but also in a progressive way.  So, I would just like to mention that.

     So, in Kenya, you will find that gender identity is not specifically provided for under sensitive data.  It is treated as personal data, and that means that it can be processed, one, when there is consent, and two, even when there is no consent, when the data processor or the data controller can prove that that information or that data is necessary for performing certain tasks.  So, they'll say you entered into a contract, and part of my obligation was to do A, B, C, D, and that means I have to process your data to do that.  So, that is, there is no consent necessary in terms of personal data processing.

     How is this a problem?  It is a problem because we can still see how gender data or gender identity data leads to human rights violations.  It could be in job applications.  It could be in loan applications.  So, it can lead to further violations.

     Now, for sexual orientation, I think we've already set the scene for you to understand how sexual orientation is grappled with in our system: the act is a crime, and yet the same data is still protected in the data protection framework, so you have contradicting viewpoints.  And it's not just sexual orientation data, if I may add; health data has also been one of the things we've grappled with, because health data is dealt with in different frameworks.  So, there are health acts which empower health practitioners to collect the data necessary to perform their work, and at the same time, you are seeing that health data is sensitive data that cannot be freely processed, and that's a given.  So, this is something you will keep seeing in countries where they pass a Data Protection Act but don't review or reform the laws that existed before, so you end up with a very interesting set of laws, if I may say.  So, yeah.

     So, when data is deemed sensitive, it means there are more safeguards towards its protection.  You will find that if there is no consent, then the data controller or the data processor must prove the necessity of collecting or processing this data.  Again, that falls under data minimization: we want you to only collect data that is truly necessary for what you're trying to do.
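     To make the two-tier rule concrete, here is a minimal sketch, in Python, of how a controller might gate processing under rules like the ones described above.  The category lists, the DPIA flag, and the checks are simplified assumptions for illustration, not the wording of Kenya's Data Protection Act.

```python
# Illustrative sketch only: the two-tier treatment described on this panel.
# Category lists and rules are assumptions, not the text of Kenya's DPA.

SENSITIVE_CATEGORIES = {"health", "sexual_orientation"}
ORDINARY_CATEGORIES = {"name", "phone", "gender_identity"}  # the gap the brief flags

def may_process(category: str, *, consent: bool,
                contractual_necessity: bool = False,
                dpia_done: bool = False) -> bool:
    """Decide, under these simplified rules, whether processing is permitted."""
    if category in SENSITIVE_CATEGORIES:
        # Sensitive data: consent plus a data protection impact assessment
        # before processing, as the recommendations above suggest.
        return consent and dpia_done
    if category in ORDINARY_CATEGORIES:
        # Ordinary personal data: consent, or necessity for e.g. performing
        # a contract -- the weaker standard described above.
        return consent or contractual_necessity
    # Data minimization: if you cannot say why you need it, do not collect it.
    return False

# The gap the panel highlights: gender identity passes under the weaker rule.
assert may_process("gender_identity", consent=False, contractual_necessity=True)
assert not may_process("sexual_orientation", consent=False, contractual_necessity=True)
```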

     So, what happens when you treat sexual orientation data and gender identity data separately?  Let me give you an example.

     So, you're seeing that sexual orientation data is protected, right, but gender identity data is not sensitive data; it can be processed like any other data.  But when we create links between data sets, we can tell that this is Angela or this is Jeremy.  Jeremy is male; male or female is not protected, so we can tell: male.  Jeremy is on an app that is for the queer community.  So, while Jeremy's record only says male, we can identify Jeremy's sexual orientation from the apps he's using.  So, we need to have a harmonized protection framework that protects Jeremy both for his identity as male and for his orientation as queer.  And this is, of course, just for example purposes; after this meeting, don't arrest him.  But you get the point: what I like to say is that we need an equilibrium, a spectrum of protection that cuts across and doesn't stop at a certain point.  Yeah.
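     That linkage point can be shown in a few lines of Python.  This is a hypothetical sketch with invented records: each dataset looks low-risk on its own, but joining them exposes the sensitive attribute.

```python
# Hypothetical illustration of the linkage risk described above: two
# datasets that each look low-risk combine to expose sexual orientation.
# All records here are invented for this example.

profiles = [
    {"user": "jeremy", "gender": "male"},      # "just" ordinary personal data
    {"user": "angela", "gender": "female"},
]

app_installs = [                               # e.g. leaked or brokered data
    {"user": "jeremy", "app": "queer_dating_app"},
    {"user": "angela", "app": "weather_app"},
]

QUEER_CODED_APPS = {"queer_dating_app"}

# The join: once records share a key, the unprotected attribute (gender)
# plus behavioural data yields the protected one (orientation).
for p in profiles:
    apps = {a["app"] for a in app_installs if a["user"] == p["user"]}
    if apps & QUEER_CODED_APPS:
        print(f"{p['user']} ({p['gender']}) can be inferred to be queer")
```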

     >> JEREMY OUMA:  Thank you, Angela.  And you've preempted my next question, which is: what is your experience with the practice of processing of data, especially sensitive data?

     >> ANGELA:  Thank you.  I think for this talk to be important for the people in the room, I'd really like to talk about the processing of data in non-profit entities.

     So, for a long time we've been talking about data protection from a company-compliance angle.  Actually, I hate the word compliance, but we have to use it.  The message is regulatory, and compliance is what companies do.  Nonprofits comply when they're sending financial reports to donors; compliance is a very foreign word to non-profit entities.  Yet you will find that nonprofit organizations process a lot of sexual orientation and gender identity data, which, if mishandled, has serious human rights ramifications.  So, we need to understand data protection as something that applies both to non-profit and to for-profit entities, the companies in this case.

     You will find that even in how we have conversations around data protection, it's oh, big tech, and of course I understand why we do this; those are the examples that make the most impact and the most sense to the people in the room.  But we also need to start talking about the processing of data by non-profit entities.  What will happen, for instance, if a non-profit like ARTICLE 19, for example, operates in Kenya, and Kenya is becoming maybe a Draconian state, and the state can have access to your documents?  Do these organizations have a plan?  Do they know how to fight back when this data is requested from them?  We keep saying, oh, (?) is such a good company, they would never comply with requests for information from Governments.  We should ask the same questions when it comes to nonprofit entities.  So, I think it's very important to have those conversations about data protection, even from a non-profit point of view.

     So, from practice, what is very worrying is this idea that data protection is a concern for companies and not a concern for nonprofits.  Yet you will find that the people who handle most of the sexual orientation and gender identity data, doing (?) in these areas and collecting data in these areas, will be non-profit entities.

     Another worrying practice I've seen in my country, and I'm speaking from Kenya's perspective, and during the question and answer forum I would invite you to give us perspectives from your countries, is that there are so many myths and misconceptions around data protection and processing.

     I'll give you an example.  So, last month, let me just say last month because I've lost my sense of time, our Office of the Data Protection Commissioner issued penalty notices to three companies for breaching the Data Protection Act.  One of the entities that was fined was a club, a place people go to have fun, to party.  And they were taking photos of people who were partying.  In Kenya, for some reason, clubs have this obsession with taking photos of revellers having fun at their joint.  I don't know why; I don't know if they won't make enough sales without it.  It's a whole question of the data minimization principle: do you really need it?  But they did, and they ended up being fined because the data subjects complained about them to the Data Protection Commissioner.

     I'm also made to understand that we should not take it for granted that our Government can issue penalties.  In some jurisdictions, the regulator has investigative powers, but the power to issue penalties lacks teeth, so I just wanted to put that as a side note.

     What the other clubs have understood from this penalty notice is to put emojis on people's faces in the photos they take.

     >> AUDIENCE:  [laughter]

     >> ANGELA:  They did this immediately after the penalty.  That is how serious Kenya is.  We use humor to get through.  It's a very tough place to live in.

     Anyway, the point is they think putting emojis on their photos is compliant with data protection.  Exactly; so this is a very pedestrian approach to understanding data protection, because if there are applications that can remove the emojis and de-anonymize the photos, they are not complying with the Data Protection Act, and there is no consent and all that.
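     The emoji point has a precise technical reading: a sticker layered over a face in an editing app can simply be deleted to recover the pixels underneath, whereas redaction that overwrites the pixels in the exported file cannot be undone.  A minimal sketch with Pillow, where the photo and the face coordinates are stand-ins for this illustration:

```python
# Sketch of the difference between a removable overlay and destructive
# redaction, using Pillow. The image and face box are stand-ins.
from PIL import Image, ImageDraw

photo = Image.new("RGB", (200, 200), "gray")   # stand-in for a club photo
face_box = (60, 40, 140, 120)                  # assumed face coordinates

# Destructive redaction: overwrite the pixels, then export. The original
# face data no longer exists in the saved file and cannot be recovered.
redacted = photo.copy()
ImageDraw.Draw(redacted).rectangle(face_box, fill="black")
redacted.save("redacted.jpg")

# By contrast, an emoji sticker added as a separate layer in an editing
# app (or an overlay kept alongside the original upload) leaves the face
# pixels intact underneath: delete the layer and the face is back.
```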

     So, I'll end it at that, because I think it's a light note, and it tells you what kind of problem we're dealing with.

     >> JEREMY OUMA:  Thank you.

     If you have any questions at all, we will take them at the very end.

     I think I'll just throw in a couple more, kind of building on what you have just highlighted.  Are there some sort of (?) impacts that you have seen from this kind of processing of sensitive data, or just any data in general?

     >> ANGELA:  Actually, I'll give an example of an activity we were doing in (?).  Last year we were doing something, a Gender and Internet Governance eXchange, under the Our Voices, Our Futures project by APC, and we had people from the queer community as part of the (?).  Before, we used to do this with women, and we would take photos and post them, you know, as part of the reporting, but also the social media campaigns.  And they told us that some of them are closeted, and putting their photos online in an activity that is clearly for queer people would put them at risk.

     I get comments that data protection should be something very, how can I say this, serious: that it is about penalties of 50 million in the EU, about Facebook and Meta.  But that minimizes the harm that such data breaches can have on normal, ordinary people who are not celebrities, who are not in the EU, and whose complaints therefore cannot attract the penalties that exist in the European Union.  So, understanding that in the context of the stigma and even the deaths we've seen in our country against queer people, that is a serious risk we need to have in mind.

     I'll give another example of the homophobia we saw online once the Supreme Court made a ruling allowing LGBTQ organizations to be registered.  What that was understood to mean was that Kenya has formalized LGBTQ relationships, which is not even the case.  Like, I was like, we wish that was the position.  It is not.  It is not.  And the homophobia, the messaging online, is we will kill them, I'm never going to accept that.  And I kid you not, it was not even just from people online; it was even from the leadership at the national level.  So, when there's an understanding that this is an undesired people among us, it also warrants hate, or justifies hate, or motivates and incites hatred against that group.  So, being queer on its own, existing as queer, is a political act in Kenya, and in certain other countries.

     So, let me just end it at that.

     >> JEREMY OUMA:  Okay.  Thank you.

     I think the final one will probably be: do you have any recommendations or insights on best practice for the collection and subsequent processing of this kind of data?

     >> ANGELA:  Yes.  First of all, I would like ARTICLE 19 to actually publish; I just want to call you out here [laughter].  You need to publish this resource, because they have amazing templates that people can use when processing data.  Data protection is a very complex discipline.  I always have to remind people that talking about it can't cover all the bases in one topic or in a 45-minute panel, but there is a need for more resources.  For-profit entities have enough money to get the DPOs, to get the people to help them comply, but what happens to nonprofits, whose resources are quite minimal?  So, what ARTICLE 19 has done is come up with templates for nonprofits, and it's like a checklist, which is what we need, because this is such a complex process.  It tells you: okay, you have this data, do you have consent?  If you don't have consent, do you have a basis for it?  Have you documented it?  Documentation is so important in data protection, because you need to preempt what can happen in the future.  Will you ever need to provide proof of consent, for instance?  Those are things we might not be thinking about, especially operating as nonprofits.  That is the age-old data protection principle, that you need to be documenting consent, documenting the contracts you have with data processors.

     So, let me just explain this.  Sometimes you will find that there are two entities involved in the processing of data.  There is a data controller, the person who collects the data and also directs how it is going to be processed.  And then you can have another entity being the data processor.  These are the ones that are going to store the data, anonymize the data, analyze the data, and all that.  So, they might have different functions, depending on whether they're the data controller or the data processor.  Sometimes we use these words among people in the data protection field without explaining what the ramifications are.

     To put it more simply, a data processor is like an agent or an employee of the data controller.  So, essentially, at the very end, the person who is going to be responsible for all the data protection issues, the breaches or whatnot, the consent, will be the data controller and not the data processor.  So, having them registered with the data protection authorities in their countries is very important, because it also gives them the justification for having budgets and all that towards compliance.
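     For nonprofits wondering what documenting consent might look like in practice, here is one minimal sketch.  The field names and structure are illustrative assumptions, not taken from the ARTICLE 19 templates.

```python
# A minimal, illustrative consent log for a small nonprofit. Field names
# are assumptions for this sketch, not the ARTICLE 19 template.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    data_subject: str          # pseudonymous ID, not a legal name
    purpose: str               # the specific purpose consented to
    categories: list[str]      # what data is covered, e.g. ["photo"]
    obtained_at: datetime
    withdrawn_at: datetime | None = None

    def is_valid(self, purpose: str) -> bool:
        # Consent is purpose-bound and ends at withdrawal.
        return self.purpose == purpose and self.withdrawn_at is None

log: list[ConsentRecord] = []
log.append(ConsentRecord(
    data_subject="participant-017",
    purpose="workshop social media photos",
    categories=["photo"],
    obtained_at=datetime.now(timezone.utc),
))

# Proof of consent, years later, is a lookup rather than a scramble:
print(any(r.is_valid("workshop social media photos") for r in log))
```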

     So, let's have this resource online, please, because I think for nonprofits it's very important.

     >> JEREMY OUMA:  Okay.  Thank you.  And just a disclaimer: the resource will be published by the end of October, so sometime in November it will be fully ready.  It's ready; it's just that we haven't yet published it.  There is a whole process to go through, but it will be published.  The main aim of this resource is to create awareness about data protection.  So, today we looked at some of the challenges, the impacts that this processing has on specific groups, but the paper looks at trying to create awareness about data protection, and also offers some recommendations on best practices for data controllers, data processors, and also Civil Society actors in general.

     So, yeah, I think I may want to leave it at that, but I'll take some questions if there are any.  We can take them all at the same time, and we can end after that.

     >> AUDIENCE:   [off mic].  Where is the information?  Is it on the Internet?  Is it being [off mic]. 

     Okay.  Sure.

     >> AUDIENCE:  Hi.  I have three questions.  I hope that's okay.  I have many questions.

     My first question has to do with how data from sex workers is being saved and used and protected, right?  Because we know that data from sex workers tends to be more sensitive, and nonprofits also work with sex workers, and maybe in Kenya that happens as well.  So, I would like to know if there's a difference, or if you have any special remarks on the privacy of data for sex workers.

     Then my second question is how the Anti-Homosexuality Act from Uganda has affected Kenya, and the data protection laws in Kenya.

     And my third question is: how is it for us as Civil Society to work with platforms in terms of platform accountability?  Because we know we have the data protection laws, but how accountable are platforms in Kenya when you register a report or an incident; how does it work?  And also together with the police, with the judiciary system, how is it there?  So, those are my three questions.

     Thank you.

     >> JEREMY OUMA:  Any other question?

     >> AUDIENCE:  I do have one but [off mic]

     >> JEREMY OUMA:  Yes, please.

     >> AUDIENCE:  Hi.  It is more actually a clarification, because you were giving the example of the health data sharing scheme.  So, my clarification is: is there a health privacy act?  Okay.  And in that case, when health care providers are getting the data, what is the sharing mechanism?  Can they share with others?  Let's say one hospital is taking you as a patient, right, and there is some kind of electronic health record.  Are they sharing and uploading to that repository so it's shared across facilities, so every facility has access to that data?  Is that how it works, or do they need to get consent from the patient every time?  That's one clarification.

     And second, if there is a health privacy act, and then there is a Data Protection Act, how does the coordination between the two work, and which regime is health data privacy covered under?  Is it covered under the Data Protection Act or the health privacy act?

     Thank you.

     >> JEREMY OUMA:  Thank you.  Is that the last question?  Okay.

     Do you want to take any of the questions? 

     >> ANGELA:  I'll take the health question.

     >> JEREMY OUMA:  Yeah, sure.  Okay.  I'll take the two questions.

     >> ANGELA:  This is a topic I really like, so I hope I don't get passionate and talk too much, but yeah.

     It's very interesting that you raised it, because we currently have a Bill being debated in parliament called the e-health act.  The e-health act talks about telemedicine, but it also talks about the protection of health data, which is very interesting, and I think people need to stop doing this; they should have just called it the health privacy act, because that's what it does.  One, it provides a framework for consent, so collection will be consent-based.  Two, there will be sharing of data across health facilities.  And three, there will be health identifiers, so they are also going to be assigning unique numbers to both patients and health facilities.

     And four, they want it to be portable.  They want to give that control of the data to the patient, so the patient will have all the records in a portable format.  We don't know this portable format yet, but there is data portability.  It's still being debated, but that's what they have in mind.  On how it's going to operate together with the Data Protection Act: most of the time you find the caveats for exceptions saying things like if prescribed by any other law, or on grounds prescribed by any other law.  That is just normally how we interlink laws.  So, if it's talking about prescription by law, we go to any other law, or any other relevant law in question.
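     Since the panel notes the portable format is not yet known, here is one purely hypothetical shape such a patient-held export could take, just to make portability plus unique identifiers concrete.  None of these field names come from the Bill itself.

```python
# Purely hypothetical: one possible shape for a patient-held, portable
# health record with the unique identifiers the Bill reportedly proposes.
# No field here is taken from the actual e-health Bill.
import json

record = {
    "patient_id": "KE-PT-0000017",        # assumed unique patient number
    "facility_id": "KE-HF-0423",          # assumed unique facility number
    "encounters": [
        {
            "date": "2023-09-14",
            "facility_id": "KE-HF-0423",
            "summary": "outpatient visit",
            "consent_to_share": True,     # consent-based sharing, per the Bill
        }
    ],
}

# Portability in practice: the patient can carry or transmit the record
# as a plain, machine-readable file.
print(json.dumps(record, indent=2))
```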

     I hope I answered you, but we can talk about this after the session.

     The question on sex workers: again, sex work is also penalized in our country.  So, of course, we understand that the situation is different in different places, but we also understand that sex work is also becoming digitized.  So, there is OnlyFans and so many other, what are they normally called, web-based apps, where they're still part of the digital economy, and that also means that this data is sometimes being processed outside Kenya.

     Again, the level of awareness is low, even on just data protection accountability in Kenya, so how bad can it be for a user based in Kenya whose data is being processed outside Kenya, i.e., in the EU?  Those are questions we have yet to grapple with in Kenya, but I'm glad that (?) is part of the global project, and this could be part of the research we could conduct to understand how it's being dealt with.  So, that is just to give you context for those two things.

     I'll let Jeremy take the homosexuality laws in Uganda and how they affect Kenya, and platform accountability.

     >> JEREMY OUMA:  Thank you.  I think I'll start off specifically with platform accountability.  And just to mention, first of all, there's an ongoing case at the moment between Meta and some of its, let's call them, former content moderators.  It's an ongoing case about matters around accountability.

     There was, let's call it, good precedent, where they can now be held accountable for their actions within the Kenyan jurisdiction.  That case is still developing, but it's good progress for us; we see it as good progress.  That was just first of all, but there are also one or two things we've tried to do from our point of view.

     First of all, there's a coalition that we've tried to bring together specifically around content moderation.  Basically, we did some work around the current practices of content moderation in a couple of countries, but with a specific focus on Kenya.  I'll talk specifically about Kenya.

     Basically, understanding the experiences and challenges of Kenyans around content moderation, takedowns, and all that.

     So, some of the things that we found: platforms are potentially amplifying harmful content, and there is a lack of understanding of local context.  So, we are trying to have some sort of decentralization in terms of content moderation, so that we can hold them accountable for what happens on their platforms.

     There is also insufficient transparency in content moderation.  And finally, we are trying to bring together, sort of bridge the gap between, the platforms and the local users.  So, that's one of the things we do.  So, we are trying to have ‑‑ time?  Okay.  I'll keep it very brief.

     So, we are basically trying to bridge the gap between local stakeholders, local users, and the platforms, to get some kind of conversation going on how we can make the platforms better.  So, yeah, in the interest of time, there was a second question: Uganda.

     For the case of Uganda, I think it's not somewhere we want to be, and we've recently been hearing of cases of people being prosecuted based on this law.  There are also some very bad cases.  In Kenya, I think it's very similar, but not as bad.  But in relation to how it has affected Kenya, I think there's been some ‑‑

     >> ANGELA:  There is some potential legislation.  We have the culture Bill; it started as a family values protection Bill.  Just to wrap up: these are funded by Eurocentric far-right evangelical radicals, and it's really sad, and it's not Africa; it's actually Western ideals being imposed on Africans.

     We'll end at that.  Thank you so much for attending our session.

     [applause]