The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> SABINA: All right, welcome, everyone, thank you so much for joining us in person here and online for this late afternoon session on data governance for children in edtech, fintech and neurotech. My name is Sabina, I work at the university and I am the co-founder of Tech Legality together with my colleague here, Emma Day, and we are joined by a variety of speakers both offline and online. I will ask the speakers to introduce themselves, and I would really like to encourage participation both online and in the room here.
Be critical, ask questions, there are brilliant people who have possibly all the answers but otherwise we will ask you more questions. I think it will be an interesting session so let's get started and let me hand over to Jasmina, she is online for introductory remarks and setting the scene, Jasmina, over to you.
>> JASMINA BYRNE: Hello, everyone, and good afternoon. I hope you had a productive day of sessions today. Sabina, should I... I'm Jasmina Byrne.
>> SABINA: We can't hear you.
>> JASMINA BYRNE: Hi, I'm Chief of Policy at UNICEF. Well, I hope you all had a productive day at IGF, and I am really sorry I'm not there in person. This is one of my favorite conferences, but you are in good hands with Hannah, Sabina and Steve, one of my good colleagues, and we also have others online. And we have Aki Enkenberg from Finland, who is our key partner. This session today is about rights-based data governance for children across three emerging domains.
Education technologies, neurotechnology and financial technology. We have been working with about 40 experts around the world to better understand how these frontier technologies impact children, and particularly how data used through these technologies can benefit children, but also whether it can cause any risks and harm to children. We all know that globally edtech has been at the forefront of innovation in education. It can help with personalized learning, and we see that data sharing through education technologies can improve outcomes in education, facilitate planning, administration, and so many other things. Other innovative technologies like neurotech are currently being tried in various settings and offer great opportunities for improving children's health and optimizing education. Financial technologies as well allow children to take part in the digital economy through digital financial services.
All of these innovative technologies have also created data-related risks, particularly in relation to privacy, security, freedom of information and freedom of expression. At the same time, as we see the rapid introduction of these technologies into children's lives, the policy debate is lagging a little bit behind. So this is why we hope that this initiative, and the partnership with the government of Finland, will not only help us identify the benefits and risks for children from the use of these technologies and the data sharing through them, but also help us formulate policy recommendations for the responsible stakeholders, in this case ministries of education and finance, consumer protection and data protection authorities, and others. So I'll hand over to Sabina now to moderate the session, and I hope we will have a productive discussion. Thank you.
>> SABINA: Thank you, Jasmina, for laying that out. First we will look at the collection of children's data in these three domains, then at governance models, and lastly at regulatory and policy frameworks. So let's dive right into the first block, and as I mentioned, we want this to be an interactive session, so after each block we will have a Q&A. Emma, maybe I can start with you. As Jasmina was saying, there are lots of risks and benefits associated with data processing in the context of these emerging technologies. Maybe let's zoom into edtech, which is the most obvious domain when you think about data governance and children, and maybe you can talk about examples where edtech may be used for good in the context of children. Thank you.
>> EMMA DAY: Yeah, there's currently a lot of debate, particularly about the benefits for teaching and learning. When we think about data processing, any data that's collected from children must be both necessary and proportionate for it to be lawful under data protection law. So for edtech to be necessary, it must be used in the educational process, and even where that purpose is identified, it's not yet clear what benefits data sharing can provide, from sharing with the school to sharing with the government to analyze for more evidence-based policymaking. I think there's still a lack of clarity around exactly what data would be helpful: what are the questions that we're seeking to answer with these data? There's much debate about the potential for personalized learning. This relies on algorithms which learn from individual children's data and steer their learning to suit their personal learning needs. Data from these kinds of tools can be shared with teachers, and perhaps teachers can identify early which of their students are falling behind. Particularly if they have a large class, they may miss a student, but an algorithm can show them which students are falling behind the rest. It may also help them to look at equity, to ensure that girls, children with disabilities and those in rural areas are receiving the same opportunities as everyone else. And there are some interesting projects looking at how children can have more agency, so they are actually benefitting themselves and are able to share their data for their own benefit in privacy-preserving ways. For example, in the UK the ICO, which is the Information Commissioner's Office, has just started a sandbox project with the Department for Education, aiming to enable children to share their education data securely and easily with higher education providers once they reach the age of 16. So I will leave it there.
I'm sure there are many other benefits and we'll let the audience come in with more a little bit later.
>> SABINA: Thanks so much for laying out these benefits in the context of edtech. Melvin, if I can hand over to you, can you talk about the benefits and risks related to the fintech sector? Melvin, over to you.
>> MELVIN BRETON: I think, similarly to edtech, you can really think about all these technologies that are enabling better data processing as double-edged swords. The most obvious way in which this benefits children is in enhancing financial literacy from a young age, right? The more data, and the better the data collection you carry out as some of these technologies are being used by children, the more you can learn about their money habits and nudge them when they are overspending and need to save, or nudge them towards developing healthy saving habits.
And healthy spending patterns as well, right? So the better the processing using emerging technologies, the better this kind of ongoing, real-time feedback becomes, and it helps kids develop good money management skills. And you can think about using this data to develop purpose-built applications for education in financial literacy. That's on the positive side, and there are many other applications. If you think about the intersection of public policy and fintech, the Convention on the Rights of the Child establishes the right to social security and social protection, and there are a lot of applications of fintech in handling social security, social protection benefits and cash transfers in different contexts. The better the data processing technologies become, the more efficient and agile the social protection applications of financial technology can become. We're looking, in different parts of the world, at issues with dwindling populations and the future of markets, and people are talking about universal basic income. How about universal child benefits? Starting there, and seeing how emerging technologies can enable us to make universal child benefits truly universal, and much more efficient. So that's on the benefits side; there are many more. On the risks side, there are always risks of exploitation: more opportunities for bad actors to target their attacks at children. On the downside, instead of better spending habits you can also push children and young people, not just children, to overuse some of these financial technologies, sometimes to their detriment, and we've seen some alarming cases with, for example, stock trading apps causing mental health issues and harms to children, or to young people rather. There's also manipulation, making children buy things they don't necessarily need, making it available for them to buy products and services that are harmful, and then there's the whole issue of facilitating addictive behaviors through in-app purchases and things like that. So we can get into any of those more, but I will just leave it there for the time being. Over.
>> SABINA: Thank you so much, Melvin, for that. I think it's so interesting, because we may not have thought initially that children's data in fintech also carries these risks and benefits. Thank you for laying these out. Aki, maybe you can share examples and experiences from Finland. Aki, over to you.
>> AKI ENKENBERG: Yes, absolutely, and very happy to be here. Thanks to UNICEF for inviting me to be part of the panel. This is quite a timely issue that does require strong multistakeholder cooperation, and the IGF is a really good platform for taking the debate around these issues further. We also have to keep in mind that the recently approved Global Digital Compact puts issues around data governance for the first time firmly on the global development agenda, and we should be mindful of systematically including a child lens in these discussions going forward. But from the Finnish standpoint, looking at what we've done nationally, a couple of remarks with a specific focus on the education system. We have long recognized that children and youth need to be considered through a specific perspective in relation to digital technologies, AI and data. This kind of perspective has been part of our national thinking around AI policies and data policies, and we've also worked together with UNICEF on these issues, both on AI and data governance, with important benefits for our national policymaking. The tradition has also been that we've had strong multistakeholder cooperation in place to uncover evidence, make informed choices, take informed action, et cetera, in our context. So the realization that children and youth are at the forefront of evolving uses of technology is quite crucial, especially in relation to social media. They are often early adopters of new services, but also potentially less mindful of privacy concerns; they're less informed about data rights, perhaps care less about those rights, et cetera. And in national policymaking there's often a tendency to prioritize the potential and promotion of technology in national AI or data strategies, for example in education or health, with less focus on safeguarding rights, or child rights specifically. Children and youth are often faced with terms that make it difficult to opt out, et cetera, and of course when we talk about young children specifically, they're not in a position to make these choices in the first place, so we have to make those choices for them, but we also have to respect the agency of children and youth and regard them as active agents. This is also something we've considered quite important from the point of view of developing democratic citizenship in Finland. So first, data and AI literacy has received special attention in our case. We've realized the need to update media literacy education for the data age. There are recent approaches for schools and teachers, et cetera, in this field, and the focus of most of them is on making sure that child rights are integrated in how schools work in their daily operations. There are some flagship projects by several universities, also funded by our national innovation fund. There's a project called GENI that focuses on data governance as well. But secondly, besides this, there is the realization that the ongoing datafication of schools and educational settings calls for certification of technologies.
Because when you look at what's going on in the private sector, there's an increased focus on measuring cognitive processes, emotional responses and behavior of children, focusing on student interaction and supporting learning methods. But there's a tendency towards growing, continuous data gathering, where neurotechnology is also increasingly part of the picture: it provides deeper insight into children's processing of information and learning, but it also raises questions around how the data is governed. So as a response, our Finnish education authority is finalizing guidance at the moment, focusing not only on how children should learn but also on what kinds of tools and services should be used by schools and teachers.
The aim is to ensure the quality and safety of digital content and services, and to engage in regular dialogue with the actors involved in producing this content and these services. As I mentioned at the beginning, the belief really is that none of this can be done by governments or authorities alone, but only through active cooperation with the research community, tech companies, schools and parents. Thank you.
>> SABINA: Thank you for sharing the experience from Finland, and you provided the perfect segue into governance models. You said none of the stakeholders can do it alone, and I think that holds true for a lot of topics, but specifically for these new forms of data governance. You also mentioned the Global Digital Compact and how it encourages these multistakeholder governance models. Maybe we can think a little bit about what data governance can look like for these three domains. Of course, when we think about data governance we think about data protection authorities, but it is much broader than that. I would like to hear more about the multistakeholder models that can be deployed to govern these frontier technologies. Melvin, maybe I can start with you: in the context of fintech, what are some of the governance models that are working in this particular space?
>> MELVIN BRETON: Yeah, thank you, Sabina. I think with fintech it's particularly complex. Financial services are a very established area of regulation, and fintech adds a technological layer on top of that which creates intersections with social media and many other environments in which data is being processed. So governance needs to be multistakeholder if it is going to be effective. There are some examples of public-private partnerships that allow companies to opt in to some sort of more advanced data protection regulation in the context of a regulatory sandbox, to see how that might work. And there are other sorts of frameworks, like open banking conglomerates, that allow better sharing between financial institutions.
And the government, which you can also bring fintechs into, to make sure all the information is transparent and complies with data governance regulations. The challenge really is that as you develop these technologies, you are creating new tools and new data that may not be covered by existing financial regulations or by data protection and data governance regulation. And if you have a very wide-ranging data governance regulation but the financial sector is operating in a separate environment, where data is not flowing from the financial system to the broader government, then you run into a problem where you have data regulation in principle, but you don't know what you don't know, right? You don't know what information is being generated through the use of these fintechs; it may be covered in principle, and may go against regulation, but it may not be visible to the regulators on the data governance side.
And maybe not even to the financial regulator, right? So for the multistakeholder model, since this is such an emerging and rapidly evolving area, we're seeing the successful use of regulatory sandboxes, as I was mentioning before, where companies can opt in to see how these processes of sharing information and data can balance issues like privacy, governance and the effectiveness of some of these services. And when it comes to children, right now we are seeing very little in terms of regulatory initiatives in fintech that take children into account specifically. Mostly that's happening at the level of data governance regulations, and that's where children are protected; but fintech per se, perhaps because the regulatory landscape is still maturing, is not yet taking steps to protect data
related to children specifically. So that's something we would like to see: open banking conglomerates, public-private partnerships, and regulatory sandboxes for fintech companies to opt in and work closely with the government to look at the intersection of data governance regulation, financial regulations and fintech-generated information and data in the future. So I will leave it at that. Over.
>> SABINA: Thank you so much. I think we see this is a complex issue, and the more we dive into it the more complex it gets. You highlighted the importance of the sandbox and also of public-private partnerships in this context. Of course, one player that is very important, especially at a forum like the IGF, is civil society; in many contexts civil society is upholding the importance of human rights and children's rights. Emma, maybe you can tell us a little bit more: what role do you see for civil society in these various multistakeholder models for data governance for children?
>> EMMA DAY: Yeah, great question. Before I get specifically to that, I want to go back to the issue of regulatory sandboxes. These come from the fintech sector, as Melvin was describing, but as part of this project on data governance for children that UNICEF is leading at the moment, we're producing a series of case studies, and they will look at the role of sandboxes in data governance for children. I think these are a promising model for multistakeholder governance that could have great potential for the education sector. Now, we see they are usually used a little bit more narrowly by regulators. Often data protection authorities will put out a call for applications to the private sector, and private sector companies will then work with the regulator on some of these kinds of frontier technologies, like edtech or fintech or perhaps neurotech, where it's not clear how the law or regulation applies in practice because the technology is so new. There is a set period of time, and there's an exit report which is usually publicized so that other companies in the sector can learn what the boundaries of regulation are, and the regulator can learn how they maybe should change that regulation and move as the tech moves. But I think what's most promising is what we have seen from an organisation called the Datasphere Initiative. They're looking at the role of regulatory sandboxes much more from the multistakeholder perspective, looking at civil society as the missing piece, working together with regulators and with the private sector on these big questions about how to govern these frontier technologies. Who is still missing, though, is children. We haven't yet seen an example of a regulatory sandbox that involves the participation of children; there are some which are about children, but none which involve children themselves.
And the other innovative aspect being promoted is cross-border sandboxes. Many of these tools, edtech tools in particular, are used across many different countries, often by multinational companies, so it's really not a question for one regulator. It's better if these countries' regulators are interoperable and work together, and also involve civil society as much as possible. So I think this is not yet happening, to our knowledge, within the education sector, but it seems to be a very promising model for the future.
>> SABINA: Thanks so much, Emma, lots of potential. Maybe let us pause here for a second, because that was a lot of content, and some of you listening to Emma may have been wondering what a regulatory sandbox actually is. So maybe before the first block of Q&A, can you explain what a regulatory sandbox is?
>> EMMA DAY: This is usually an arrangement between a regulator, often actually a data protection authority, because these sandboxes are usually about data processing, and a company. The data protection authority wants to work with the private sector to explore how the regulation should be put into practice. To take an example from edtech: say there was a new kind of immersive technology that suddenly became available for education, where children could become avatars and could put on a glove and feel things. There would be some risks and some benefits, and maybe the regulator would want to explore those with the company. There's a trust issue, where the company is worried that the regulator will bring an enforcement action against them, and so the sandbox is a kind of protective framework where the companies can explain the technology they're exploring, and the regulator can interact with them and tell them if it's lawful or if they're going to end up in a risky area. Still, in most countries, regulators will not allow the company to experiment with something that is unlawful or prohibited by regulation. But it's a way for a project, usually one that's still in the development phase, to get guidance from the regulator on how to navigate that space going forwards. I hope that makes sense.
>> SABINA: Yeah, before you unleash a technology on lots of people, let's see from a compliance perspective what we can do to avoid the severe impacts, and then strengthen compliance once the product is on the market.
So let me stop here, and you can ask another question if that wasn't clear. Let me maybe give the opportunity to people in the room. On the first two blocks, regarding these technologies and multistakeholder models, are there any questions from the floor at this point in time? Do we have a mic? Can I take yours? Thank you so much, thank you.
>> AUDIENCE: Emma, you mentioned regulatory sandboxes. Have you seen which countries or regulators are great examples to follow?
>> EMMA DAY: From what I've seen, of sandboxes which include civil society, the focus has been in Africa, on health tech, and there have been cross-border regulatory sandboxes that the Datasphere Initiative has been coordinating. The Datasphere Initiative is a third party, which maybe also makes it easier, since it's not the regulator that is leading the sandbox. The regulatory sandboxes that we see within Europe are generally just the regulator with the private sector. But if anyone has any examples that they know of and would want to share, we would love to hear more about those.
>> AUDIENCE: Hi, my question goes to Melvin, and probably it's also interesting for the person who was talking about edtech. I just think that the data of children in the fintech sector are of huge interest, because children will be the customers of the future. We've been talking about privacy, but what about the security of these data? How do we make sure that these data are not exploited for any purpose that we don't want them to be? Thank you.
>> SABINA: Melvin, if you want to start.
>> MELVIN BRETON: If we knew how to prevent these data from being exploited we would probably be doing it already.
I think there is a real tension between innovation and the development of new technologies and applications in the fintech sector, and the protection of data related to children. It's also not clear-cut, because a lot of the use of financial applications is not necessarily happening in fintech apps; it's happening in social media apps that have payments enabled or where you can purchase certain items, and it's happening in games, where you have in-app purchases and loot boxes and all of these things that you can purchase within the game.
And those don't necessarily require multiple instances of approval from a parent. You set it and forget it, in a way: you have the payment form on file and then you run with it, and so there are a lot of transactions being carried out by children on platforms and apps that hold the parents' information and data. You can think about online shopping platforms where children often have access to their parents' account to purchase this or that item. That's to say, the information that is generated and collected about children, and generated by children, in financial applications and technologies is scattered. I think regulatory sandboxes for fintech applications are a good first step to see how we can develop ways of handling the dedicated information that's being generated in the context of fintech apps and services; we'll see how that develops. Then there are, as I was saying, the other financial applications of technologies that are not necessarily fintech apps, where the conversation is part of a broader conversation related to the data that's being generated and used in those other applications, like the games and social media I mentioned, right? There's currently the debate about the Kids Online Safety Act in the U.S.; I don't know if there's a focus on the financial aspect there. How can we pay more attention to the financial transactions that kids are carrying out outside of dedicated fintech apps, at the same time as we use regulatory sandboxes to try to regulate this within the dedicated fintech apps? I think that's going to be a big question. And that's not even to mention crypto, blockchain and decentralized finance, which is perhaps another can of worms. I will leave it for now.
>> SABINA: I think that was important. Some of you might have wondered how often a child makes a bank transfer, but you were talking about how fintech is embedded in the typical digital environments where children are engaged. That was an important point, to think about all this data processing and the problems that come with it. I have two hands. No, you don't want to? I'm sorry. Okay. He's like, please. Emma, please.
>> EMMA DAY: So I wanted to come back to this point about cyber security, which is a really important point and a big part of this discussion. We've been interviewing regulators around the world, and in every country it's common that at the school level there's a big security breach where children's data is leaked. When we're talking about the benefits of sharing all of this data, the problem is not necessarily something an edtech company can address; it may lie with the school or the government, in terms of the cyber security they put in place. So that's a big part of the picture, to enable a safe and trusted environment in which to implement these new technologies.
>> SABINA: And I guess that comes with accountability for all the stakeholders involved in the deployment of these technologies, and clear roles for who should be held accountable and how. So any other questions on these topics at this point in time, from the floor or online? I don't think we have any questions online. Any other questions from the floor? No? All right, let's move on to the next block. We spoke about governance models, and we can't say governance without saying law and regulation, so let's look at that next. When we are looking at these kinds of emerging technologies, of course, the classic conflict comes up: how does law and regulation keep up, when technology is changing all the time and children are changing all the time? So how can we address this? Maybe, Steven, you can tell us a little bit more about what you see in the context of the legal and regulatory framework. And Steven, maybe before you go into the regulatory context, explain what neurotech is and how it impacts children.
>> STEVEN VOSLOO: Thank you, Sabina. Not everyone knows what it is. Neurotechnology is any technology that looks at neural or brain signals and the functioning of the brain or the neural system. So it could record those functions, it could monitor them, it could modulate them, or even kind of write to the brain, I'm a computer scientist, so I think of writing to the brain, and make some neural changes. And so it could impact children in many ways. I'll talk a little bit later about neurotechnology in the classroom, to help monitor levels of concentration, for example; that's kind of monitoring brain activity, and we see examples of this in some classrooms around the world. Just one other thing on that: the technology is either invasive or non-invasive. The invasive side is for very severe neural disorders, for example quadriplegics who have a chip implanted in the skull on the brain, and with their thoughts they can move a mouse or communicate or interact with computers.
The other side is non-invasive, and this is actually where the space is probably going to go more and impact children more. It is less accurate than the heavily clinical invasive side, but it's also less invasive: it could be a headband that you wear that looks at your levels of concentration and so forth. So you asked about the laws and regulations. Neurotechnology is not advancing in a regulatory void or vacuum; we have existing regulations and laws, including on the rights of the child. The question is: do they apply to this frontier technology? We see, for example, in the UK, the ICO, which is the data protection authority, has done some research looking at existing laws within the UK to see if they provide cover for this new technology, and they're in their investigation phase. The same is happening in Australia: the Australian Human Rights Commission has been investigating whether the existing regulatory framework covers neurotechnology. So then, what is the answer? Countries have fallen into two camps. In Europe, for example, the European Parliament also did an investigation and basically found that they think the existing laws and frameworks do provide enough cover. There's the EU Charter of Fundamental Rights, there's the European Convention on Human Rights, and what we know is that the GDPR in particular broadly applies to neural data. There's also the European AI Act,
which doesn't speak about neurotechnology directly but looks at emotion detection AI in the workplace and in the classroom.
And that would often be captured by neurotechnology. I also should have mentioned that there's a real convergence of technologies, which complicates the space. It's only recently that neurotechnology has made advances, and that's in part due to advances in AI and the ability to process the large amounts of data that are getting captured. Other countries have said no, the existing laws don't provide enough cover and they need to make some changes, and these especially come from Latin America. In Chile, for example, there was a constitutional amendment that picked out neural data, and there was a world-first case recently, or fairly recently, in Chile, where someone bought a neurotech product and was not happy with the terms and conditions, under which you don't know where the data is going. That went all the way to the Supreme Court, and the Supreme Court judged that the neurotech company needed to cease operation there and to address that hole. Mexico is introducing a broad law that will result in 92 new articles, across a range of sectors including health laws, because they didn't think the existing space provided enough cover for the novel kinds of issues around neurotech. And then lastly, in the U.S., two of the states, California and Colorado, have updated their data protection, the kind of personal data privacy protection regulation, to really pick out neural and brain data. And there the FTC, which is a consumer protection body, has gone after some companies, mostly actually for misrepresentation, where companies say this product will help you read your brain data and help you do X, and it can't really; it's all too rudimentary, and so it's misrepresentation. So I will close there, just to say that some countries feel there's enough cover, others don't, and it seems to be landing in different ministries.
Our recommendation is that all countries should do a mapping exercise to look at what exists at the national level, look at the opportunities, risks and emerging use cases from neurotech, and assess whether there's sufficient protection.
>> SABINA: Thanks so much for that explanation and the different examples. You also spoke about convergence of technologies, and I think it's also a convergence of regulations that we see, right? How can they be applied, what gaps do we have, and at the practical level, who is going to lead on the laws? Is it the Ministry of Health, is it a communications regulator, or are these all working together? You mentioned the EU AI Act, and how does the AI Act apply? So I think it's exactly that: the mapping exercise first, to really understand how these regulatory mechanisms all interact. Emma, maybe to say, okay, if we recognize there might be some gaps, even though we might look at the convergence of different regulatory frameworks and pull everything we have together, we still have gaps. How are we going to fix this?
>> EMMA DAY: I think it's hard to find the gaps. In edtech I think it's less about gaps and more about implementation. You can have gaps in the regulations in place, but I think there is a bigger gap in terms of implementing the regulations we already have. In the context of edtech, it's generally still to do with data protection, and perhaps AI regulation, of which we now have quite a lot around the world. Maybe, if we look to the future, there will be neurotech embedded in edtech, and it's going to raise all the issues that Steven mentioned. But I think that's where we need to do the work at the moment: on implementation. And if you think about edtech, education in many, many countries around the world is a devolved responsibility. When it comes to edtech products in schools, it's often the teachers or school management who choose what products will be used at the school level, and they need guidance to be able to make these choices. They have to think about: is this a good tool for education? What about data protection? What about cybersecurity? What about AI ethics? I think this is the kind of guidance they were talking about in Finland. Some of the key tools that can be used for this are procurement rules, where governments decide that if schools are going to procure edtech to use in a school, then it needs to meet certain requirements for data protection.
There can be, like Aki was mentioning, certification schemes, so that an edtech company has to be audited and certified as meeting these minimum standards. Industry can also create standards, and there can be guidance and codes of practice, and we know that some regulators are starting to work on this for schools.
But this is really an emerging area, and I think it's a gap everywhere. There is also room for collaboration, so that every regulator doesn't have to start from the beginning, because there are common themes. For example, the Global Privacy Assembly has been working with UNICEF, with different regulators from around the world coming together to look at what the common challenges are and what the solutions can be as well.
>> SABINA: Yeah, that's a very important point. Usually we think there's a regulatory problem and that we need a neurotech-specific law.
That's not necessarily the solution, because it's usually about application, what you said about procurement rules, and looking at those aspects of edtech.
One of the things could also be, for example, requiring a data protection impact assessment, to get schools to really actively think about the risks associated with edtech and to point them to those risks, and also, as you mentioned, joint thinking through bodies like the Global Privacy Assembly and others on how we can move forward in these kinds of spaces. I see a question, please come in. Sorry, can we have a microphone?
>> AUDIENCE: Yes, I just wanted to refer to Section 508 of U.S. law, which was introduced, I think, twenty years ago, making accessibility a precondition for any procurement. Imagine if we had that kind of precondition, with assessments, for all the technology we have been talking about.
>> SABINA: Then the problem is much closer to the people dealing with it. It becomes a procurement issue, and procurement is what schools deal with; they know procurement and the rules around it. So if you bring the abstract issue of data protection down to their level, it's much more likely that people will think about it. Thanks so much for that point. Any other questions on this particular block around regulation? What's your experience in your country? Do you see frameworks or gaps? What might be required? Any points from the floor? Otherwise, any other examples? Yep. He's moving, very good. Go ahead, please.
>> STEVEN VOSLOO: This isn't actually an example; it's more just to say how challenging this space is. I really like that point, Uta, about bringing in a condition for procurement; in the U.S. the government is such a big buyer of edtech that this really has teeth and can move the needle. This is more a challenge, on your last point about convergence: the thing that governments do so badly is work outside of their silos. We all do it badly, even within departments of UNICEF.
It's a real challenge to all of us. When you get issues of data governance that are an education issue, a health issue, and a data protection issue all at once, it's going to challenge all of us to think outside the box and the silo.
>> SABINA: Emma.
>> EMMA DAY: Just another challenge, I'd say, is that there are different challenges in different geographies in the world.
I think equity is a big challenge. You talk to some regulators and they're trying to make sure that every single person has access to education and access to the internet, and if you're talking about immersive technologies, the infrastructure for this isn't there.
And many regulators don't have the resources to have that kind of oversight.
Often it is oversight of foreign companies who are deploying their products in the country, possibly financed by development aid as well, so it becomes quite a complicated picture. I think that is where we also need to look at this multistakeholder governance model and think about the actors we need to include. This may involve donors as well, so there are different actors who need to be brought into these discussions, I think.
>> SABINA: There are competing interests, right? From my experience, what I've heard is: data protection issues, yes, there might be risks, but it's not something we can prioritize because there's a much more tangible issue, which is access. And this loops back to the problem that children's data governance is not something a lot of people really see or understand. I think that's why it's easily pushed aside rather than really considered under the CRC as an equally competing interest. Oh, yes, please interrupt me at any time. Go ahead.
>> JASMINA BYRNE: Thank you so much, I was just listening to this discussion about regulatory frameworks.
And myriad stakeholders. I wanted to say that sometimes the policies or strategies that come from different divisions and departments in government could also help us advance any potential work on data governance, and I'm now thinking about digital public infrastructure, an approach being adopted by so many countries, which actually streamlines government services through a layer of platforms set up on this digital public infrastructure, including payments, data sharing and digital IDs.
When different governments, in collaboration with the private sector, are developing these strategies, this is where we also need to be vigilant and think about how these data sharing practices can have an impact at all levels. There are currently about 54 such strategies in place, and there's a big push for the adoption of digital public infrastructure across the world. So, to answer your question, Sabina, about where the good examples are: I think we probably need to look much more closely at how to engage with those stakeholders who are advancing DPI in their countries and regions, to get them to think about data governance as well, across different domains. Thank you.
>> SABINA: Thank you so much. We have another question in the back. I think we don't have a microphone, thank you.
>> AUDIENCE: Thank you. Just building on from what Jasmina has just said, and following on from what Emma said as well: where are the best practices? That's important. Another area I want to emphasize is operational activities like skills and capacity building when it comes to educators. How do they know what good looks like? And then when we look at strategy, that's at a different level altogether that we need to think about. I don't have the answer, just an observation. And it differs in different parts of the world. I come from Australia, and Australia has been strong enough to advocate for child rights and stand strong against Meta, but not all countries can do that. So it's an interesting, or challenging, area.
But it's an area where we all have to collaborate, so I think that collaboration piece plays a very strong role, as well as identifying where the best practices are. Thank you.
>> SABINA: Thank you, Emma do you want to speak about this?
>> EMMA DAY: I think what you're saying is right, and it comes back to the question of resources.
No regulator can oversee every tech company operating in its country; it's just impossible, really. But I think that's why we're looking at innovations in data governance, to try to say: what are some examples of how you plug those gaps? We will publish next year a UNICEF collection of innovations in data governance for children, and there are some examples: we mentioned the regulatory sandboxes, but also certification schemes. Certification schemes are generally led by a nonprofit, or even by a company itself, and they're a way of, I suppose, outsourcing some of that oversight. You always have a tension, because you can get commercialization of the certification schemes, so it has to be done properly, and we're trying to look at some examples, and this case study will try to look at some of the considerations. It's quite difficult to find shining examples of best practice. We often start looking for those and then end up looking at promising practices, taking a little bit of what seems good from different examples. So in these case studies I think we will look around the world, and if anyone has any ideas along these themes and wants to contribute, we'd love to hear from them. The other case study we're looking at at the moment is on children's codes, for example the UK Age Appropriate Design Code. (No audio).
>> MELVIN BRETON: I think it's back, it went out for a little bit. And now it's back. Could I come in very briefly?
>> SABINA: Yes, please, go ahead.
>> MELVIN BRETON: On the theme of authority: we have all these different domains, and one issue that cuts across all of them, which is data governance and data regulation. I think something we could work toward is empowering the data regulation authorities a lot more within government, because, if I'm thinking about fintech, you have very strong financial regulations in many countries, and financial regulatory bodies, but it's not so clear that they take advice from data governance authorities. And those data governance authorities often have such a wide remit that it's very difficult for them to give direction that is tailor-made for areas like fintech. So encouraging collaboration between the financial regulatory bodies and the data protection authorities, to develop more tailor-made regulations on data governance for fintech, for neurotech, for edtech, whatever the case might be, might be a good step, and once those are well established, making them more binding. It's one thing to regulate fintech, but those rules may not apply the norms to children's data: data needs to be encrypted, needs to be anonymised, but beyond that it's not super clear that the data protection regulations are specific to children's needs across all of these domains. Over.
>> SABINA: Thanks so much, you want to add something, Emma?
>> EMMA DAY: I think the enforcement side of things is interesting, and different regulators have different approaches to it. Some see themselves as collaborators with the private sector, balancing promoting innovation in their own country's tech ecosystem against making sure that the tech companies don't overstep the mark too much. Often, from that perspective, the regulator will meet with the companies and warn them verbally first. In other countries it's more about bringing enforcement actions.
And they're not very approachable, and there are pros and cons to each. In other countries, particularly like we were discussing before, it may even be a foreign company that is the problem in the country. There are few resources, and it's very difficult to know how, technically, this would happen: where would the jurisdiction be, and how will they hold this company accountable in their own country? There are definitely issues related to enforcement and accountability as well, which deserve a whole other case study just to try to unpack.
>> SABINA: Thanks, this was a rich discussion. What does a gap look like? Do we have a gap in the convergence of technologies, of regulatory frameworks, or implementation problems? And, Emma, what you said about best practices, promising practices, and maybe only practices, I think. Yeah, so we're changing the bar as we go. It's a learning space, and we need to think outside the box, all of us. After looking at the risks and benefits, governance models, laws and regulations, which was very much looking at the status quo, maybe we can close the session by looking ahead a little bit, at the next ten or fifteen years, and these different technologies: edtech, neurotech and fintech. What do you think might be the upcoming issues in terms of data governance? Because of course we already need to think ahead, predict things and find solutions as we go forward. Maybe, Aki, I can start with you, just some concluding thoughts on that.
>> AKI ENKENBERG: Yes, thank you. I think it's been a very interesting discussion so far, and many of the issues related to the future of these fields, and how they should or could be governed, have already come up, so maybe we can build on those.
In this final segment. I do agree with Steven, who raised this issue of convergence earlier, which makes it quite difficult to make predictions about where neurotech or edtech or fintech will go in the next five to ten years, because they interact and merge with each other. Out of those combinations, different fields will emerge, different problems will emerge, and so on.
That's definitely one key point to watch. Secondly, we can think about technology on its own, and often it's very useful to make these kinds of predictions, but we should also keep in mind that it doesn't evolve autonomously. It's also governed and constantly being steered by governments and other stakeholders in the process. So we should also think about whether we want the technology to evolve in a certain direction, and how we can be part of that process. On neurotech, quite an interesting field, I think we'll see a lot of unexpected things even over the next five years. In addition to these advances in measuring brain or neural activity, there will definitely be a growing focus on acting on the brain, on stimulating the brain, and on new interfaces for that. How these technologies eventually come into schools and classrooms, to monitor learning and behavior but also to stimulate learning and certain types of behavior, is quite interesting but also quite controversial, I'm sure. The downside, at the interface of neurotech and AI, is this risk of unconscious influencing, for political purposes, for commercial purposes, for marketing and advertising, or for changing people's minds, influencing them while their brains are still developing; in the case of children and youth this is extremely, extremely important to keep in mind. As Steven mentioned, the EU AI Act already recognizes this danger, and when it comes to regulation, at this point in time it seems wise to focus on the risks posed by specific uses of technology. It will be very difficult to prohibit certain technologies or allow other technologies per se, but it will be possible to govern how they are used and applied; the EU AI Act is a good example of this approach. On fintech, finally, in my mind at least there's a kind of financialization of everything.
The embedding of financial services, of a financial angle, in every other type of digital service: we see it more in games, entertainment, social media, et cetera. So we are definitely moving from a situation where we regard fintech primarily as new means for making payments, saving and investing, and in the future more and more about lending, to a world where financial services will be part of everything else we do. Of course, this is combined with the very likely scenario where everyone can be quite easily identified online through digital identity systems. The KYC, know-your-customer, problem will be less important than it is today: people can be recognized online, their identity is known, and they're conducting financial transactions everywhere they go and through different means, not only through specific apps or banks and so on. And then, finally, we'll definitely move into a world where not only our visible behavior, choices and actions will be measured and tracked, but also our bodily activity and brain activity, more and more, and this will become a focus for data governance too. When we think about how AI is developing, we're trying to create these independently acting AI agents that are currently learning from what exists, the data that is available online, but in the future there will be a need for these systems to learn from humans directly, from their activities, behaviors and thoughts. So our data, our bodily data, our brain data, will become commercially crucial for this endeavor. This really highlights the importance of personal data and bodily data in future data governance. And then finally, I think it was Jasmina or Emma who mentioned this issue of the global divide.
As we in the global north try to keep pace with technology and tackle some of the issues we see, we have to keep in mind the need to develop a level playing field globally, and to address not only the technology divide but also the regulatory divide. So these are my thoughts, thanks.
>> SABINA: Thanks Aki, Steve, good luck following that. Any thoughts?
>> STEVEN VOSLOO: Thank you, Aki, that was excellent. I don't have too much to add; Aki eloquently highlighted the technological use cases but also the broader issues. Maybe I will just pick out one quick thing. On neurotech, there is this move from neurotechnology beginning in the medical space, which is highly regulated and has ethical oversight, into the consumer space, and in many countries consumer electronic devices don't have that level of oversight, so there's clearly a gap there. From a data governance and data protection perspective, that's a huge area to focus on. In terms of where the space is going, on the consumer side we will definitely see it in the education space; that's come up a lot, and this isn't just me speaking, this is through consultations we've done with neurotech experts from around the world: neurotech in the classroom to support learning, with the opportunities and risks that come with that. But in the home, cognitive enhancement is also an area to watch. This is not a case where you have a neural disorder and get treated; this is where you are healthy but want to perform better. In our consultations, people from certain countries that are highly competitive said: well, really, you pull all the levers you can to advance your child, whether it's through tutors or otherwise; you look at all your options, and if neurotechnology promises that, they will look at it. How do you compete in the global south against your peer in the global north who is just performing so much better? So that touches not just on treatment but also on enhancement. In one of the consultations, one of the participants, from Zimbabwe, said something really striking: you may get a future world where you have the treated, who use neurotechnology for disorders, and the enhanced, who are healthy; and then we added, in the group, the naturals. And this could be the future. We'll leave it on a controversial note.
>> EMMA DAY: I think mine will be controversial in a different way. I would like to go back: there's obviously a trajectory to the development of technology, but we are governing how it continues into the future, and I think sometimes there is a kind of inevitability in how we hear about the direction technology will evolve in, that we're all going to end up with chips in our brains. But these are decisions that we make, and we can decide what is in the best interests of children for their education. We can put guardrails in place and maximize some of the benefits being promised here, and we can also decide not to end up with chips in our brains if we don't want to; that's really the extreme end point. Just focusing on edtech, I think some of this is also to do with the geopolitics of how it develops. We're seeing at the moment quite a monopoly by American and Chinese tech companies. There are a couple of big American tech companies who deploy their edtech, really more infrastructure, around the world, and then at a national level you see in most countries an ecosystem growing of apps that plug into those big company platforms for things like languages and mathematics, and they're more culturally and linguistically appropriate, and maybe those ecosystems are going to grow more. You also see, within Europe, the Gaia-X project, which is led by the German government, and the aim there is to find European-level solutions based on secure and trustworthy exchange of educational data, so that they don't have to use the big tech companies for edtech. So it depends on how all of that plays out, and we don't really know what direction it's going to move in, but it's likely to have an influence on the kinds of technology we see and the values that underpin those technologies as well.
>> SABINA: Thanks very much. Melvin?
>> MELVIN BRETON: Thank you, Sabina. I think it's useful to think about this in terms of the extensive future and the intensive future. In terms of fintech, I think Aki already alluded to some of the extensive future, in the sense I'm using it here: we're seeing fintech across an increasing range of domains.
We started with just a web or app layer on top of financial services, and now we're seeing it get into gaming and into social media, where there are obvious financial applications that are relevant for children, and that we have not yet completely come to grips with in terms of regulation and data protection, beyond maybe encryption, which is still not applied across the board, but we know those things are important. Then we're getting into other things like the metaverse, which is maybe an extension of games into social life in a parallel world. There will also inevitably be transactions there; we're already seeing things like NFTs and digital land that you can purchase, and what implications does that have for children and data? You're also seeing in social media that financial transactions are becoming public, another source of information about the lives of children that is becoming more prevalent, so what are we going to do about that? I think those are very much open questions. Not even to mention neurotech; the intersection of neurotech and fintech is, I think, scary to think about, but something we need to keep in mind nonetheless. Then there's age detection through AI, which is getting a lot better. I think companies now say that they can detect a person's age to plus or minus one year, roughly, just through the use of AI. But that opens the question: what else does it know about you, in terms of your financial life and the transactions you are likely to make, and what potential does that open up for manipulation and exploitation of children? There's also the AI and fintech intersection. The algorithms are getting a lot better, for example, at deciding who to lend to, and banking services are using that to process more applications for loans and things like that.
That enables more financial inclusion for families that maybe didn't previously have access to services; the technologies allow people to be more integrated into the financial system, so that's a plus for financial inclusion. But those same tools, the AI or machine learning algorithms used to decide who does or doesn't get a loan, can also have another edge that leads to financial exclusion, because it's a lot easier to see who carries risk. The equity aspect here is important. Also worth mentioning: any applications that require connectivity will compound the digital divide that already exists, so that's something to think about. On the positive side of applications, I think social protection and cash transfers are going to benefit immensely from these new technologies, becoming more efficient, with fewer data requirements and more points of entry, and as things like central bank digital currencies and stablecoins become more prevalent, it's going to be a lot easier to expand and scale up social protection systems and transfers, again with the caveat that we need to be conscious of the digital divide. And then on the education front, financial education is going to become... sorry, just wrapping up.
>> SABINA: Yes, thanks.
>> MELVIN BRETON: Financial education is going to allow for a longer financial life. Starting earlier on your financial journey and becoming more savvy is going to be beneficial for children, but again, with a pinch of salt: you need to be careful about the risks. Over.
>> SABINA: I love how you brought in the metaverse. Yes. Emma, Jasmina, answer all the questions in the last two minutes.
>> JASMINA BYRNE: It's been a great pleasure listening to all of you, with so many fantastic contributions and ideas. We talked about the integration of technology and multistakeholder approaches to these issues. When we talk about the future, we need to think about how these technologies are going to evolve. As the tech is already much more mature, the challenge is going to be the size of the market: how do we capture everyone who is introducing edtech tools to the market, and also the piloting of new technologies that is happening? How do we work with those companies who are testing and piloting new approaches and new technologies? In the financial sector, as we heard from Melvin, that includes the integration of blockchain, crypto and so on, and basically AI integration into everything, which we are going to see more and more of in the future. I think what is going to be a big challenge for all of us is the global fragmentation of regulation, which can lead to uneven safety standards, and standards for children in particular. That fragmentation can potentially lead to a lack of trust in these technologies, and in their adoption and application for good; as we said in the beginning, there are so many benefits. So the question for those of us working on this (captions will end in two minutes) is how we each shape the future of technology, how we use this knowledge and this understanding of the implications for children to shape its development.
Someone also mentioned standards or recommendations for the procurement of some of these technologies, and maybe going even further back, towards the development of these technologies, and integrating child rights principles into their development.
We also need to think about the future of regulation.
What will be the future approach to regulating technologies, and how do we strike that balance (one minute left of captioning), ensuring that future regulation and policies actually create and maintain that balance, allowing for innovation while at the same time safeguarding children? I want to end on a child rights note. We haven't mentioned children's rights so much, but many of you, particularly online, have worked over the past several years on really integrating child rights into all kinds of tech policy. And we heard from Aki about the opportunities under the Global Digital Compact to integrate more effort in relation to children's data governance. So I would just like to remind everyone again that children's rights are.