The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> MODERATOR: We are going to start in one minute.
>> MODERATOR: Does it work?
>> JUTTA CROLL: Can you please bring the Zoom on the screen? Not the slides. Not yet, please.
Okay. It's the top of the hour. Welcome to Children in the Metaverse. My name is Jutta Croll, I'm the chairwoman... you can't hear? I'm the chairwoman of the Digital Opportunities Foundation.
Can you hear me?
Yes. It's only Torsten. So he will need a new headset.
Okay. Some technical issues, but that will be solved in a second.
And we start again with our workshop Children in the Metaverse. Welcome to those people who are onsite in Riyadh and also those taking part online. As I said before, my name is Jutta Croll and I'm chairwoman of the Digital Opportunities Foundation, and I prepared this workshop proposal together with Torsten Krause and with Peter Joziasse from the Netherlands.
The immersion of digital media in everyday life has reached a new level of development.
The concept of virtual reality dates back 35 years, like the World Wide Web and like the Children's Convention. So we have a coincidence of certain developments, but nonetheless virtual reality has only come into our lives during the last several years. And what we already know from 20 years of children using digital devices and being online on the Internet: they are the early adopters, and they are those who will also be the first inhabitants of the virtual environment. So that is why we have come together to consider how children's rights can be ensured in virtual reality, in the Metaverse, and I'm really happy to have these esteemed speakers around me and also online. I will introduce the speakers to you once it's their turn to speak, and we will begin with Michael Barngrover. He's already to be seen in the Zoom. You can see him.
Michael is the managing director of XR for Europe, which is an industry association with a mission of strengthening collaboration and integration between industry stakeholders across the European continent. So he's coming from the technical side, but I know that he has children's rights on his mind, and I will hand over to you, Michael. Your slides are all ready to be seen. Just start the presentation. Thank you.
>> MICHAEL BARNGROVER: Thank you very much. Just to confirm, can everyone hear me all right?
>> JUTTA CROLL: Good.
>> MICHAEL BARNGROVER: Okay. Then yes, I'm the managing director of XR for Europe, and we support those who work with XR all over the continent of Europe, researchers as well as companies and policy makers, helping and hoping to make the future virtual worlds of Europe ones that we want to have, in which future generations can be healthy and productive. So we can go to the next slide. I'd like to focus on scoping virtual worlds.
The next slide. I don't have control over it, do I? No, I do not.
>> JUTTA CROLL: Can you move next slide?
>> MICHAEL BARNGROVER: Thank you. Virtual worlds, it is a challenging term because it's very broad and very encompassing. So there are at least four broad concepts of virtual worlds that I think are very relevant. The first one being VR, virtual reality, so you're completely immersed and taking place and sharing place often with others in a digital environment, a fully digital environment. Of course, it's not fully digital because you are there and you have a digital and non digital existence.
Then we have mixed reality, which is becoming more prominent with new devices and headsets that can do this, and almost every device going forward will be increasingly capable of mixed reality, which brings the digital and non digital together in an integrated experience. That means that in the same way that you are present in this digital environment, this digital medium, you're still present in your physical environment, interactively. So you're having to interact with both.
But both of these are built off of another kind of virtual world which is much more mature and much more common, which is basically your traditional 3D gaming and even 2D gaming. These images here are Fortnite, where hundreds of millions are active, including children. So this is the space where we start to think about the future, and really what we're actually concerned about right now. We should be able to draw lessons from what we're trying to do today, making these environments safe and healthy, and apply them to future worlds that will be mixed reality and even virtual reality. Even a lot of these 3D worlds follow a lot of the experiences of 2D, non immersive environments, like social media platforms. Social media platforms are virtual worlds in that they're populated and active and interactive, but they are not virtual. They don't have a direct correlate with the physical world except through us, the users.
Can I go to the next slide? Sorry. It's back a couple slides, the cognitive loads of multipresence. A couple slides before this one.
Sorry. One more.
>> JUTTA CROLL: One more. It works, Torsten. One more. Just click one.
>> MICHAEL BARNGROVER: Yeah, one slide before this, cognitive load.
>> JUTTA CROLL: One slide before, please.
>> MICHAEL BARNGROVER: Yes, this one. So there is a cognitive load to be thinking about and managing your activities and your presence, and thus the activities and presence of others, in multiple worlds, the non digital world, but also the digital virtual worlds. When we have many virtual worlds that we're active in, that's an additional load. So already it's something we're doing right now with social media platforms but also true of 3D platforms that are persistent. The traditional gaming platforms are always there. Our friends may be there. When we're not there these worlds are still active and we may still be thinking about them, particularly children who generate a lot of their social equity through their activities in these platforms.
As these become 3D immersive spaces, it's actually more challenging because there's more to think about. We have to think about how, as we move around, the physical space maps onto and correlates with the virtual space. For example, in a mixed reality environment, the table, the chair, those things are in the virtual environment. Whether they are represented there or not, they're in my physical space, so they're also in my virtual space even if they're not virtually represented there. So even from a safety perspective we have to manage these things, but when we start putting more people in there, that becomes more complicated as well.
So cognitive load for everyone, but especially children, is something to really be concerned about.
Can we go to the next slide please? The avatar slide. When we talk about being in this world together, we have the question of how we're viewed there. I'll go a little quicker because I'm running a little over time for my part, but on avatars it's just to say that avatars come in many different forms, and it's not possible for any one company to provide you the full range of representation, whether cultural or even age and ideology.
So there's a big question of whether people should be able to make their own avatars and bring them into these virtual worlds, or whether they should limit themselves to the options that are provided to them, and these choices, of course, have tremendous consequences, particularly for younger people as they start to spend more time building social equity while using avatars to represent themselves to others. Their avatar does not mean that's what they are; they make choices, together with peers, about their identity. So avatars are just a tool in the virtual world.
Very quickly, once we're in there and represented... sorry, a couple slides back there.
There's a slide about policing.
But basically, once we're in this environment, it is still something of a free for all in virtual worlds, because we do not have something akin to policing or criminal justice. When crimes happen in these virtual environments right now, it's very hard to arrive at a consequence, an acceptable consequence for them, and it's also very difficult to know what a suitable consequence is. A virtual crime is a crime by definition, but we don't know how much it's the same or similar to the same crime that would take place outside of a virtual realm. This is not unique to virtual worlds; just like with social media, it's something we're trying to legislate and understand.
I'll close there. Thank you very much.
>> JUTTA CROLL: Thank you, Michael, for giving us our first insight as to what virtual worlds can be, will be, or are already.
We will now go to our next speaker to tell us about technology and the Metaverse. Deepali, are you there? Deepali is sitting right beside me. Deepak, are you there?
>> DEEPAK TEWARI: Can you see me?
>> JUTTA CROLL: Okay. We can hear you and now we can also see you. Would you please introduce yourself, and then we will have a look at whether your slides will work? Because there are some technical issues. It's the first day of the Internet Governance Forum and we are in a really busy, busy room, and the technicians are doing a great job, but sometimes things go wrong. We will try.
>> DEEPAK TEWARI: It's all right. Even with no slides I can conduct myself. I'm Deepak Tewari. I run a company called Privately based out of Switzerland.
It's a technology company that for the last decade has been developing and deploying various technologies to keep minors safe online. This includes technology that identifies threats and risks online, but also age assurance technology. Age assurance is essentially the technology behind being able to identify whether someone is a child online.
And hopefully I can get my slides up; I would have really wanted to show you how it's working and what it is like. But going back to the subject at hand, the Metaverse: the previous speaker was mentioning avatars and the state of the Metaverse and virtual worlds in general. I'd like to throw in... oh, wonderful, we have my slides. That's essentially where I'm calling in from. I'm sitting very close to that lake you see there. It's very beautiful and I'm very happy to be here.
If you can kindly go to the next slide.
This summarizes what Privately does. We have safeguarding and privacy enhancing technology, but pertinent to today's discussion is the technology behind being able to identify whether a participant online, and especially in the Metaverse, is a child.
And can we do this identification without actually identifying the person? So there is no personal data involved. And that's something I'm going to talk about.
But as I said, let's back up and talk about the Metaverse. A couple of notes that I took: as of the latest statistics today, what roughly categorises as the Metaverse has roughly 600 million monthly active users.
And according to some reports that I have seen, particularly from Wired and another agency called Data Rate, 51% of the users of the so called Metaverse are under the age of 13, and 84% are under the age of 18. You might ask me why that is the case. Because most of what we know as the Metaverse or virtual worlds is made of gaming environments, Roblox, Fortnite and so forth, which explains why so many of the participants are actually minors.
One more interesting piece of data, 31% of U.S. adults have never heard the term "Metaverse."
So that just tells you how big the divide is. Essentially this virtual world we're talking about is actually full of children, and this has been our experience as well in developing the technology, which takes me to the next slide, please.
Yes.
I don't know if you can get this working. Yes. If you can get it working, then I'll let you see it and then comment after. There's audio behind it, if you can play it.
Well, if the volume is not playing, I'll tell you. First of all, can we get the audio playing on that one?
>> JUTTA CROLL: It's playing but it's very faint.
>> DEEPAK TEWARI: So this is our technology at work. It's a VR headset, and first an adult is speaking. When the VR headset detects that it is an adult who is speaking, it gives them access to adult content. As you saw, the advertisements were meant for adults.
But after that, if the audio was playing, you would now hear a child speaking. As soon as the technology detects that it is a child who is speaking, the environment changes. And the only content, in this case advertising, that is shown to the participant is content related to children.
So what I wanted to illustrate with this: you will see, this time it is the child, so it goes inside and you see all child related content.
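As a rough illustration of the gating logic in this demo, the experience keys its content catalogue to the currently detected age band. The sketch below is a toy, not Privately's actual implementation; the catalogue entries and band names are made up for the example.

```python
# Hypothetical content gating keyed to a continuously updated age band.
# Catalogue entries and band names are illustrative, not from the real product.
CATALOGUE = {
    "adult": ["car advert", "finance advert"],
    "child": ["toy advert", "cartoon advert"],
}

def select_content(detected_band):
    # Unknown or missing bands fall back to the most restrictive
    # (child-safe) content set.
    return CATALOGUE.get(detected_band, CATALOGUE["child"])
```

The safe default matters: if the detector is unsure, the environment should degrade to the child-appropriate experience rather than the adult one.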
The point we are trying to make here, well, there is a QR code here, so you could actually watch this video on YouTube directly as well. The point we are trying to make is that technology exists today from companies like Privately which have developed fully privacy preserving, CRPD compliant solutions for detecting the age of users.
As you can imagine, the detection of the age of users is the cornerstone of keeping users safe online. The demo you just saw, we actually had implemented for a very, very large social media network.
It was in trial. In the end they decided not to pursue it, for reasons best known to them.
But as you can imagine, the technology exists today, and we are seeing more and more of these privacy preserving technologies that run in the background but are able to differentiate in real time, just like we as adults do. As human beings, we say: this is a child, or this is an adult. This can be running in the background, and in doing so it can ensure the service or virtual environment is able to deliver an age appropriate experience.
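To make the idea concrete, here is a deliberately simplified sketch of voice based age estimation. It assumes nothing about Privately's actual models: it estimates the pitch of an audio frame with a zero crossing count and maps it to an age band, emitting only the band rather than any raw audio or identity data. The 250 Hz threshold and the method itself are illustrative assumptions; real systems use trained classifiers and far more robust features.

```python
import math

def estimate_pitch_hz(samples, sample_rate):
    # Count rising zero crossings; for a clean periodic tone this
    # approximates the fundamental frequency in Hz.
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings / (len(samples) / sample_rate)

def age_band(pitch_hz, child_threshold_hz=250.0):
    # Children's voices typically have a higher fundamental frequency;
    # a single fixed threshold is a gross simplification used here
    # only to illustrate "band out, no identity out".
    return "child" if pitch_hz >= child_threshold_hz else "adult"

def tone(freq_hz, sample_rate=16000, seconds=1.0):
    # Synthetic stand-in for a captured microphone frame.
    n = int(sample_rate * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate) for i in range(n)]
```

The point of the design is that only the coarse label ("child" or "adult") ever leaves the function, which is what makes a continuous background check compatible with a no-personal-data approach.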
This is essentially my message. If you need to know more about the tech, those are the QR codes I've left behind. I'm very happy for you to either contact me or look at the codes. Here in the showroom you can even test the technologies. Thank you for your time. That's me. Over and out.
>> JUTTA CROLL: Thank you, Deepak, for giving us an insight into how children, or maybe all users, might be better protected in virtual environments. We now have seven to eight minutes for questions, either from the floor here on site or from our online participants. Do we have any people who want to come in with their questions? You can ask technology related questions, or any further questions you have so far. Yes, we have a hand raised. You will need to take a microphone.
>> AUDIENCE: Hi. My name is Batina from Slovakia. I would like to ask about your privacy solution for children. And is it not true that a child's voice will change into an adult voice?
>> JUTTA CROLL: Deepak, please, go ahead.
>> DEEPAK TEWARI: Yes, look, this is real time. This is not happening on a one time basis. This will become a feature of the microphone, so each time you're speaking, the microphone can detect whether it's a child or an adult. It's not like there's a one time check done and you're categorised as an adult or a child. This is real time. This is continuous. And this is a feature of the device itself. Very much in a similar way, we've now also created age aware cameras. If you go into a shop and the camera looks at you and detects that this is an underage person, they will not serve you alcohol or any restricted item. You have to think of it as being continuous and not done once forever, or for a long time.
>> JUTTA CROLL: I see she's nodding, so the answer was accepted. Since you're speaking about age awareness, it gives me the opportunity to mention that we will have another session on Wednesday at 10:30 on age and the Internet of Things, so probably put that in your schedule.
We have another question from the floor. The microphone.
>> AUDIENCE: What if an adult is using AI to convert his voice from an adult's to a child's? To deceive the programme and get into the children's room?
>> DEEPAK TEWARI: You are right, that is a threat, because these days there are voice conversion programs and generative AI. You have to go a little bit into depth on this. There are two kinds of attacks. One is called a presentation attack, which is where you play a recording of a child's voice, for example. That's one way. The other is where you actually inject a child's voice into the programme, which is a little bit more difficult. It's called an injection attack. Obviously, there are technologies to detect both, and you could always argue it's a battle between how good the detection technology is vis à vis the technology that is trying to fool or spoof it. But an interesting thing here is the fact that this is continuous. In some use cases we've seen the technologies being used to detect anomalies: the same person is talking like a child to one person and talking like an adult to someone else. That produces an anomaly in the system, and that could be used to detect that there is a person who is probably mala fide in that group. So there are ways and means to detect these things, but like you rightly said, there's a contest between technologies. These technologies to detect spoofed voices do exist.
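The anomaly idea described here can be sketched very simply, with the caveat that this is an illustrative toy rather than the actual detection logic: if continuous background checks ever assign the same participant to more than one age band across their conversations, that participant is flagged for review.

```python
from collections import defaultdict

def flag_inconsistent_users(observations):
    """Flag participants whose continuous age checks disagree over time.

    observations: iterable of (user_id, predicted_band) pairs, e.g. the
    output of repeated background voice checks during different chats.
    Returns the set of user ids that were ever assigned conflicting bands.
    """
    bands_seen = defaultdict(set)
    for user_id, band in observations:
        bands_seen[user_id].add(band)
    return {user for user, bands in bands_seen.items() if len(bands) > 1}
```

For example, a user whose checks return "child" in one chat and "adult" in another is flagged, while users with consistent predictions are not. This is exactly the advantage of continuous over one time checks: a spoofed voice has to stay consistent across every interaction to avoid tripping the anomaly.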
>> JUTTA CROLL: Thank you, Deepak. One more question?
>> AUDIENCE: Thank you, Deepak. This is important work, especially when governments are now contemplating whether children under 16 should have access to platforms or not. You talked about working with some big platforms, but is there ownership of this kind of work at the most strategic level? Because this means fewer consumers of their content. And also, with these big tech platforms, you can just say you're "under 16"; they don't check your voice or anything by which they could detect it. Is there any more work by Privately that you can talk about, about how to stop the exploitation of children by big tech platforms?
>> DEEPAK TEWARI: Thank you for your question. It is true we are seeing pushback from big tech platforms, and I hate to say this, but it is because age assurance comes as a threat to their business model. If you imagine you're advertising to everyone while saying "I'm only showing these ads to adults", but part of your users are kids, then if you just follow the money, age assurance is not in their interests. It is not in the interests of big tech to support age verification, which is why you see stalling, and fear, uncertainty and doubt being sown by pretty much all the big tech platforms. We're fighting a case in the Supreme Court in the U.S. right now, actually, so we're active everywhere. The technology exists, but big tech does not want to do it directly. And it's always about liability. Who has the liability for this? As long as that is not settled, and as long as there are no strict fines, unfortunately, big tech will keep pushing this out a little bit. So we, as a small company, are trying our best, challenging in court, doing thought leadership, going into games and showing it works and is functional, and we've certified all the technology publicly. But there's a business model contradiction with big tech, yeah, and that is a problem there.
>> JUTTA CROLL: Thank you, Deepak. We already have big tech in the room as well, so we will come to that question afterwards, but first I would like to refer to the lady from the floor who referenced children's rights and their right of access to digital media, to digital communication, and that's the point where Sophie Pohle from the German Children's Fund comes into play. If you have not heard about that, we have the document printed out here so you can take it, and you will also find it on the website www.childrens-rights.digital. (laughs.) Thank you. Sophie, over to you please.
>> SOPHIE POHLE: Thank you, Jutta. I hope everyone can hear me.
Welcome from cold and rainy Berlin, Germany. I'm Sophie Pohle. I'm from the German Children's Fund, and we've been collaborating with the Digital Opportunities Foundation for years, including on the GC25, the General Comment number 25.
Now I'll set aside my role as an online moderator today of the session to give you a brief overview of the General Comment 25 as a framework for our discussion.
Let's start. The UN Committee on the Rights of the Child published General Comment 25 in 2021, so three and a half years ago, and it focusses on children's rights in the digital environment. This document guides States Parties on implementing the Convention on the Rights of the Child in digital contexts. It was developed with input from governments, experts, and also from children, and serves as a practical framework to ensure comprehensive and effective measures are in place.
Our session today explores the Metaverse, a topic which is not directly named in General Comment number 25, however the GC highlights that digital innovations significantly influence children's lives and rights, even when they do not directly engage with the Internet. And by ensuring meaningful, safe, and equitable access to digital technologies, GC 25 aims to empower children to fully realize their civil, political, cultural, economic and social rights, in an evolving digital landscape.
Let me briefly introduce the four general principles of General Comment 25, starting with non-discrimination, which emphasizes ensuring equal access and protection for all children.
Second, we have the best interests of the child. Prioritizing children's well being in the design, regulation, and governance of digital environments, including of course digital worlds and the Metaverse. Thirdly we have the right to life and development, that means to ensure digital spaces support children's holistic growth and development.
And last but not least, we have the principle of respect for the views of the child, very important, which means to consider children's perspectives in digital policymaking and platform design.
What are the central statements or key emphases of the GC25? The General Comment recognises children's evolving capacities and how user behaviour varies with age. It highlights opportunities and different levels of protection needs based on age. It also stresses the responsibility of platforms to offer age appropriate services for children.
So, the GC25 calls on the States Parties to support children's rights to access information and education using digital technologies. It also urges to ensure that digital technologies enable children's participation at local, national, and international levels.
And it highlights the importance of child centric design, which means integrating children's rights into the development of digital environments, for example with age appropriate features, content moderation, easily accessible reporting or blocking functions, and so on.
On the one hand we have those opportunities and participation calls; on the other hand, the GC25 also calls on the States Parties to identify risks and threats to children and to cooperate in addressing them.
And it also calls for solid regulatory frameworks, to establish clear standards and international rules on the one hand, and to implement legal and administrative measures to protect children from digital violence, exploitation, and abuse on the other.
Also, GC 25 encourages co operation among stakeholders like governments, industry and Civil Society, to tackle dynamic challenges.
And last but not least, it aims to promote digital literacy and safety for children, parents, and educators to ensure informed participation.
in my last minute I'd like to give a very brief insight into children's perspectives that were collected in the consultation process of GC 25. The principle or the principles I laid out in GC 25 are directly informed by the needs and expectations children have voiced globally during the consultation process. I think it was more than 700 children globally that were consulted.
And to conclude, I brought some key insights, on a very general level, from young people on how we can better support them in the digital world, before Maryem afterwards takes us through the children's perspectives in way more detail.
So, what do children want? They want equitable reliable access to digital technology and connectivity. They also wish for age appropriate content and safer digital experiences where platforms protect them against harassment, against discrimination, against aggression, and rather enable them to participate and express themselves freely.
Children themselves demand greater privacy and transparency about data collection practices.
And they also want more digital literacy education, also for their parents, by the way.
And they want recognition of their right to play and leisure, which is also crucial when we talk about the Metaverse.
So much for now. I think my time is up. That was on a very general level. Yeah. Thanks a lot for your attention and I'm happy to answer questions.
>> JUTTA CROLL: Thank you, Sophie. Thank you so much.
>> SOPHIE POHLE: Welcome.
>> JUTTA CROLL: Can you hear me? Okay. It's working. Thank you so much for already touching upon digital literacy because we will also come to that point to discuss how the Metaverse will open up opportunities for training of digital literacy for children as well as for their parents and other adults. First we go to Maryem, who is a very young person and holds a bachelor's degree and a master's degree, so I'm very impressed with you. But please introduce yourself to the panel. Your slide is on the screen.
>> MARYEM LHAJOUI: Thank you, Sophie. I'm very happy to see so many people in the room, actually. Thank you for being here. My name is Maryem and I'm here on behalf of the Digital Child Rights Foundation. We have a youth platform, a youth expertise centre, based in the Netherlands, but we focus internationally. At Digital Child Rights, our main goals are to highlight the importance of digital child rights and to really have children's opinions included in decisions about what is safe for them.
We do this in different ways; we create playful tools for children. We also gather all their, not all but most of their, views on how they see things, through our child rights panel. And then for youth, we have many youth ambassadors in the Netherlands but also in different other countries.
So we really want to give youth the platform to connect, through connection challenges.
So Sophie, thank you again. Very clear overview of the General Comment 25. You might see some overlap here. So at Digital Child Rights, these are our themes, and they're based off the General Comment 25.
So while the Metaverse that we're talking about offers great opportunities, for children, for youth, to actively participate, there's many chances in ways we didn't know before. Right? At the same time, we also have to remain critical and acknowledge that there's challenges. Right?
We have to prioritize safety and privacy and fairness in the best interests of the children and that's what we really focus on. When it comes to privacy, as already mentioned before, who is collecting my data? Where is my data collection going?
Safety as also addressed very clearly by my colleague Deepak. When it comes to age verification and I pretend to be way older than I am or can I pretend to be way younger than I actually am? We know the dangers that come from it, but what is actual action that should be taken? And how do children even view this? Do they know it's possible to meet other people online that might not have the best interests, unfortunately?
So this is where the rules are very important. Right? So there are rules, also, as outlined in the General Comment 25, like Sophie explained.
So how do they view it? And what are we going to do about it?
And then, also really important at Digital Child Rights, and I heard it in many other sessions too, is digital inclusion. Right? Can everyone participate? We're based in the Netherlands. We speak with a lot of youth there, but it's different in other countries. Can everyone participate? So at Digital Child Rights we actually conducted some interviews at Gaza Camp in Jordan this year, where we spoke to Palestinian refugees in Jordan about access to the Internet, access to the Metaverse, what the opportunities are for them in the Metaverse, what's important to them. So this digital inclusion is very important, especially when it's so closely interlinked with education.
So education plays a very great role.
And then, also very important, we frame it as "opinion": can everyone give their opinion online? Are children free to say what they want?
And can they do that within the framework of non-discrimination, as outlined in GC25? So it's very important that it's equal and that they can give their opinion, and we gather a lot of their opinions.
And so, yeah, at Digital Child Rights we're always looking to connect with other youth platforms. So I do invite all of you sitting here to talk to me afterwards about opportunities to enhance this, to really strengthen the voice of youth and children, not only in this room, and not only in the Netherlands or Europe. That's why we are here. I would love to meet many of you.
Yeah, the question for us, what can we do in the best interests of children and how do we really include them in it?
So I hope to talk to many of you after this.
>> JUTTA CROLL: Yeah, we also have now about ten minutes for you to take questions, as well as for Sophie. Do we have any questions here in the room? Or otherwise in the online chat?
Anyone who wants to raise their hand? Yes? Okay. Emma, please.
>> EMMA DAY: Thank you for the presentation. (Audio going in and out).
>> MARYEM LHAJOUI: I think the question was for me, right? I think it's important to think about how much knowledge is in the room when talking about these issues. When it comes to children, we create playful tools, like a card game, and my colleague is holding it. Perfect. It's a card game. It's a playful way, but really also a way to talk about what's connected to it: can everybody participate? What does that even mean? And also, for example, with safety, what does safety even mean? Then it's also, yeah, it's like we also wrote it here. You can take a look at it. So that you're aware that you can ask for help when you're in danger, and that there are many opportunities for you, chances. So there's a card game. We also offer different kinds of workshops. There's also a workshop with masks, where children make their own mask. This also really portrays, right, this age verification, and also, as outlined in the previous presentation by Michael, the avatars. Who am I? Even in the online world. Is it a different person than I am in the real world? We really encourage them to think about this before they give us their opinions. And then, still, there's so much to learn. Through these playful tools, and also through these connection challenges that they do with other platforms, we encourage them to learn more, and then we can learn more from them, and then they can also give better opinions. I hope that answered your question, Emma.
>> JUTTA CROLL: We have another question here and then I would also turn to Sophie to tell a little bit, if you're able to do so, about the children's participation for our General Comment 25 which pretty much refers to what you have said. So we'll take those two questions. Yes.
>> SPEAKER: I'm the youth ambassador from Hong Kong from the Founderration.
I want to ask a question, one of many questions on the Metaverse. How can you prevent children from getting addicted to it and being cut off from their reality?
>> JUTTA CROLL: Okay. It's a question regarding addiction to the Metaverse. You're talking about addiction to the Metaverse, right?
>> AUDIENCE: Yes.
>> JUTTA CROLL: Okay. I'm not sure whether this question should go to the panel or whether we can come to it afterwards. To whom would you want to pose the question? To Maryem. Okay.
>> AUDIENCE: Just now she said there are lots of playful tools. Yeah.
>> MARYEM: Can you hear me, Winston?
>> AUDIENCE: My name is Winston.
>> MARYEM: Very good that you're here, right? Very nice. Thank you for your very great question, actually. There's many things in the online world, and I must admit maybe I also get a bit addicted sometimes, or a bit lost, you know, when you're scrolling. What can we do? This is a question for you and me at the same time. What can we do to help each other and help our friends of the same age to really not get lost in that? So what we at Digital Child Rights do is, like I said in the beginning, the Metaverse, yeah, we can get a bit lost in it, but it's also, you know, we're also moving with the times. It's also a place with many great opportunities, as long as we know how to handle it. Right? So we really try to make young people aware of the dangers. Yeah? The challenges. And also the nice things.
So we really try to tell you, if anything happens, then there's always some kind of help offered, and we make sure you're really aware that you have the right to be safe. Right? Because, Winston, if I tell you, yeah, you should not spend any time on your phone, that's a bit crazy, right? Maybe. I don't know. If you want. But we need to regulate it. We can't spend too much time, but it's also interlinked. You can learn a lot. It can help you with your education. So to answer your question, sorry.
>> JUTTA CROLL: Time out.
>> MARYEM: We can talk later, Winston.
>> AUDIENCE: I wanted to ask Maryem and Sophie both. She talked about literacy, and you're talking about the children. How do you educate parents when it comes to children and the Metaverse? Because it's crucial to have them on board when we talk about children's engagement with online spaces.
>> JUTTA CROLL: I will give that question to Sophie, but only a short answer, Sophie, please, because we're running out of time.
>> SOPHIE POHLE: I didn't really hear it well. What was the question about how do we educate parents?
>> JUTTA CROLL: Yes, how do you educate parents when it comes to digital literacy for their children.
>> SOPHIE POHLE: Yeah, that's a question we have discussed a lot in Germany for a few years now, and I think the key is to involve the schools more, to reach every child, because every child goes to school, and there we reach the parents, too. So that's key, I think. We also need to think about how we can reach those parents who are not already sensitive to this topic, because often we see that parents inform themselves only when they see, ah, there's a problem. But we also have a lot of parents who do not have access to this information, who do not have the resources. We need to think more about how to involve them and, yeah, how to reach them directly. It's a very complex question, to be honest. Very difficult for a short answer.
>> JUTTA CROLL: Okay. We will follow up with that, as well.
>> SOPHIE POHLE: Yeah.
>> JUTTA CROLL: One last question. Very short.
>> AUDIENCE: It's not a question. It's a proposal to the Digital Child Rights Foundation. I'm Salvatone from Bangladesh. We are working for teenagers in Bangladesh. In Bangladesh we have a help line for 13 to 19 year olds, a cyber teens help line. Any teenager experiencing cyberbullying or anything like that, we can help.
I would like to work with Digital Child Rights foundation. We need mentors. Thank you.
>> JUTTA CROLL: Thank you so much. So we leave it at that for now, because we already have our next speaker on my left side, and Hazel is also in the Zoom room. But we will start with Emma Day, a human rights lawyer and artificial intelligence ethics specialist. She is also the founder of a company focusing on human rights and technology.
Emma, to you.
Now we talk about regulations. When we set up the proposal, the Australian law keeping children under the age of 16 off social media had not yet been passed, but I'm pretty sure you will be able to address it.
>> EMMA DAY: (Her microphone is not able to be heard)...we specialize in human rights and technology and I'm going to talk to you about...(microphone going in and out)...applies to the Metaverse and maybe where some of the gaps might be. The Metaverse is an evolving space. Some of what we're talking about is a little bit futuristic and may not actually exist yet, but we're talking about a kind of virtual space where unprecedented amounts of data are collected from child users and adult users as well.
Already today, we have apps and websites which collect lots of data about users: where they're going online, how they're navigating their apps. But in the Metaverse, companies can collect much larger volumes of data and much more sensitive data, things like users' physiological responses, their movements, and potentially even their brainwave patterns. This can give companies much deeper insight into people's thoughts and behaviours, and it can be used to target people for marketing, to track them for commercial surveillance, or even be shared with governments for government surveillance. So it takes us to another level when thinking about data and governance.
Then we have people's behaviour in the Metaverse, both children's and adults'. If someone says something in the Metaverse, is it the same as posting content online? What rules should apply there? If I use an avatar online, am I responsible for what it does? And if my avatar is wearing some kind of haptic device, so that a touch to the avatar is something I actually feel, how do we deal with that? These are questions that we don't really have answers to yet from regulators.
But there are some regulations, as we know, that apply. For example, the GDPR still applies in the Metaverse. This is the European law on data protection, and there are many other data protection laws around the world that apply as well.
That means the UK Age Appropriate Design Code...(microphone going in and out)...which is guidance on how the GDPR applies to children, would also apply.
But it may be difficult under data protection law to determine who is the data controller and who is the data processor. The data controller is the entity responsible for deciding how the data is going to be used and for what purposes, and they are ultimately the most liable and accountable for that. But if you have lots of different actors in the Metaverse space who are sharing data between them, it may become quite confusing.
Then the data controller, before they process data, particularly children's data, should be informing people, giving them privacy notices. How many privacy notices can you have in different parts of the Metaverse? Say a child is walking along a street in a virtual town and stops in front of a bakery, looking in the bakery window. Maybe the bakery's advertising system can see they're hungry and target them with food advertising. So it's a different way of advertising than we currently use on websites.
And then of course there's the question of which users are children, and not just whether a user is a child or an adult, but precisely how old that child is for data privacy purposes.
And then there's also online safety. We have online safety acts, like the one in Australia that prevents children under 16 from using social media, which may also apply to the Metaverse. But then in the U.S. there's Section 230 of the Communications Decency Act, which you may have heard of, which provides online platforms with immunity from liability for third-party content.
So that may change in the U.S. as the Metaverse develops. This is a very political topic in the U.S. We don't really know what direction that will go in.
So we have a very diverse global regulatory framework and in fact not a lot of very specific regulations and not a lot of enforcement currently.
But I think the main takeaway for me would be that the common thread globally is human rights and children's rights. What we do have is the UN Guiding Principles on Business and Human Rights, endorsed by the Human Rights Council in 2011, which are there to prevent human rights harms connected to business activity. They're aimed at both states and business enterprises. They call on states to put in place measures to protect children's rights, including in the digital environment, and on businesses to respect children's rights, and this includes tech companies. When we think about tech companies we also need to think about safety tech and age assurance tech. These are tech companies too.
So both the platform providing the Metaverse and any technologies used within that platform need to carry out risk assessments where they look at all of the different rights that could be impacted, for both children and adults. In a mixed audience you need to look at both, and make sure that all of the stakeholders are engaged and consulted, so that something introduced to protect children doesn't then have an adverse impact on other rights. We need to make sure that all rights online are protected. The UN Guiding Principles provide a methodology for stakeholder engagement, for risk assessment, and then for identifying how to mitigate those risks in accordance with children's rights and human rights. If tech companies carry out their due diligence and do their child rights impact assessments, then they should be in good shape to comply with regulations that may be coming down the line. I'll leave it there.
>> JUTTA CROLL: Thank you very much, Emma, for giving us insight into the regulations so far. We know they don't address the Metaverse at this time, but we know it will go that way. And now I will hand over to Deepali, who is coming from Meta. I don't think you'll be able to react to everything that has already been said about service providers like Meta, but I would like to refer to something that Michael said at the beginning: that social media platforms are already virtual worlds, and that they are becoming part of our own behaviour. That is something your company is working on. Can you explain your position a little?
>> DEEPALI LIBERHAN: I think it will be useful to talk a little bit about what the objective is here. The objective that we have is to make sure we have safe and age-appropriate experiences for young people on our platforms. Irrespective of regulation, we've been working to do that across the apps, whether it's Facebook, Instagram, or our VR and AR products. We've adopted a best interests of the child framework that our teams use to develop products and features for young people, and while I won't go into all those considerations, there are two important ones I do want to talk about. The first, and it was lovely to hear from you, Maryem, is exactly that: engagement with young people and families who are using our products.
And it's really important to engage not just teens but also parents. We've done that: in the last couple of years we've rolled out parental supervision tools across our products, including Meta Quest in the Metaverse. This is important exactly because, to the earlier point, parents don't necessarily know how to talk to their kids about the Metaverse and virtual reality. We held consultations and engagements with parents and teens sitting in the same room to help design these supervision tools, and we designed them in a way that respects privacy and promotes autonomy for young people, but also gives parents some amount of oversight and insight into their teens' activities, especially in the VR space.
For example, you can see how much time your teen is spending in VR. You can set daily limits. You can schedule breaks. You can also approve or disallow apps. These are some of the things built into our parental supervision tools. And I think it's really important that, along with these tools, we make resources available in our digital hub so that parents can get guidance on how to talk to their kids about virtual reality and AR and about how to stay safe online.
This is one consideration that we have.
The second is building safe experiences. Irrespective of whether you have parental supervision or not, and I think parental supervision is very important, the other question is what we are doing to make sure that children who are using the Metaverse products we offer are safe. The answer is essentially a set of built-in protections. For example, 13 to 17 year olds have private profiles by default. If you have used Instagram or Meta Quest, you know a private profile is very different from a public profile. You choose who can follow you. Not everybody can see what you're doing.
We also have a personal boundary on by default. I don't know if you have used it. It's a personal bubble around you, on by default, to protect against unwanted interactions when you're in an avatar space engaging with other avatars in Meta Quest.
We also limit teens' interactions with adults they are not connected to on the platform. You might want your teen to use the Metaverse to connect with an aunt in another country, but you don't necessarily want them to engage with strangers. We have restrictions in place to limit those interactions.
The other thing is, we believe in having really clear policies. All the apps on the Meta Store, for example, have an age and content rating, like a film rating, and teens are not able to download apps which are inappropriate for their particular age.
So there's a lot more that we are doing in terms of building those safe and age-appropriate experiences on our platforms. The other thing that I do want to point out, and I know somebody said earlier that a majority of teens online right now are using it for gaming and the Internet, is that at Meta we also feel the potential of the Metaverse for immersive learning is really immense. I remember when I was in school we used to read about interesting places like the pyramids of Giza or the Eiffel Tower. You can have young people actually visiting those places in the virtual space, and people from different economic backgrounds able to study together. Under any kind of legislative regime, it's also important to protect and promote that kind of innovation. Meta has actually set aside a 150 million fund just for immersive learning, and there are many projects that I could talk about.
I don't know how much more time I have.
>> JUTTA CROLL: Time is up. Do we want to take more questions? I'm looking around the room. There we have a question. And we have another question over there.
>> AUDIENCE: This question is for you. Can you hear me now? We heard about Privately, I believe the company was called. And we know there's other technology out there to verify children's age on platforms. What is Meta doing about age verification on its platforms?
>> DEEPALI LIBERHAN: We have a number of ways to assure age on our platforms. One of the important things to understand is that it's a fairly complicated area, because we want to make sure that we're balancing different principles, and I'll talk about a couple of them. The first is data minimization. Years ago people said it would be really easy to give people digital IDs, but you don't want to do that, for multiple reasons. At the very least, no one would like us to collect that data. So what is an effective way to assure age which balances data minimization with effectiveness and proportionality? We have a number of ways that we do age assurance. For example, we have people trained on our platforms, for example Facebook and Instagram, to identify underage users, and they're able to remove those underage users.
We have also invested in proactive technology that looks at certain kinds of signals, for example the kinds of accounts you're following or the posts you're posting. Those are approaches we've developed, and it's a proactive technology that keeps developing, that we're using to remove underage accounts. I don't know if you've heard about Yoti, but it's a third-party organisation, like Privately, that has come up with a privacy-protective way to identify an age range just based on a selfie. What we have done, and you will see it if you use Instagram, is that if we find a young person is, say, 12 years of age, that person is not allowed on the platform. But if, for example, a 15 year old wants to change their age to 18, one of the options to verify their age is to provide an ID. Or you can use Yoti: you take a video selfie, which is kept for a short period of time and then deleted, to estimate your age.
>> JUTTA CROLL: Time out.
>> DEEPALI LIBERHAN: So there are a variety of ways that we are working to ensure age.
>> JUTTA CROLL: We will be around later, maybe, to explain Yoti.
>> AUDIENCE: Just briefly, to pick up on age verification: there are plenty of available options to do age estimation or verification without collecting any data. None of the social media platforms employ those properly. I think with Meta it's clear, and whether it's Meta or any of its competitors, none are really doing that seriously. So the limits you talked about, restricting access to different content by age, are lovely. But if you don't have effective age verification, then they're also, sadly, meaningless. They could do far better, and at the moment they do the bare minimum; there's a lot more they could do.
>> DEEPALI LIBERHAN: Just to respond to your question, and I'm available after this if you want to have a broader discussion: we are working on exploring options to do age assurance in a more effective way. Yoti is one such organisation that we work with; it's been recognized by the German regulators, and we're working on ways to use it at more points of friction on our platforms. The other thing I would say is that it's also a broader ecosystem approach. One of the legislative solutions we've talked about, which we think strikes a fine balance between data minimization and making it really simple for parents, is to have age verification or age assurance at the app store level or OS level. That allows parents to oversee and approve their teens' apps in one particular place and also confines the collection of data to one place, and a number of third parties have also said this is one of the ways we can approach this from a broader ecosystem perspective. Those are wider discussions; at the same time, we're working to make sure we have more effective ways to do age assurance on our platforms.
We've also recently launched teen accounts in certain countries and will roll them out to the rest of the world. Teens on Instagram will be rolled into a protective setting which has certain built-in protections, and we're also investing in proactive technology to give us the right signals to make sure teens are not lying about their age. Because, as you know, they will lie about their age.
>> JUTTA CROLL: That they are doing already. (laughs) Thank you so much. We're a bit under time pressure. That's why I'm going now to the next and last block of our session, and that is the question: what can Internet Governance do to help keep children safe in a virtual and digital environment? I'm handing over to my colleague Torsten, who will give us an overview of the Global Digital Compact and what rights it gives to children.
>> TORSTEN KRAUSE: Thank you very much, Jutta, and thank you all for sharing all these thoughts. I want to take a closer look at the Global Digital Compact, which was adopted this year as part of the Common Agenda at the United Nations and shapes the digital environment. It's not a legally binding document like the Convention on the Rights of the Child, but it expresses further hopes for the digital environment.
What does it mean? The GDC has several objectives. Among them, it aims to close all digital divides and accelerate progress towards the development goals, to expand inclusion and reap the benefits of the digital space for all, and to foster a digital space that protects and promotes human rights. And children's rights are part of human rights, as you all know.
And it declares the aim to create safe, secure, and trustworthy emerging technologies, including AI, with a human-centric approach and effective human oversight.
So everything that is done should, in the end, be controlled by humans.
When we take a closer look at what is mentioned about children's rights in the GDC, and I'm not aware of how closely you followed the process, in the first draft no children's rights were directly expressed. Several organisations took their stances and gave their perspectives and comments to put children's rights into this compact, and in the end there are several points we can touch on. The biggest area or field of child rights is around protection rights.
So the states are asked to strengthen their legal systems and policy frameworks to protect the rights of the child in the digital space. Every one of you can ask your government how they do that, how they put their policy frameworks in place. States should also develop and implement national online child safety policies and standards, and the compact calls on digital technology companies and developers to respect international human rights principles.
We heard some of that in this session.
All of them, the companies, developers, and social media platforms, should respect human rights online and implement measures to mitigate and prevent abuses, including effective remedy procedures, in line with General Comment 25, as Sophie also mentioned.
And a broader part concerns countering all forms of violence, including sexual and gender-based violence, hate speech, discrimination, misinformation and disinformation, but also cyberbullying and child sexual exploitation and abuse.
Therefore, it is necessary to monitor and counter child sexual exploitation and abuse and to implement accessible reporting mechanisms for the users.
When we take a closer look at provisions with regard to children's rights, the GDC mentions that states are responsible for connecting all schools to the Internet.
It refers to the joint initiative of the ITU and the UN Children's Fund in this regard.
With regard to digital literacy, it says that children, and of course all users, should be able to use the Internet meaningfully, securely, and safely and navigate the space; therefore digital skills and lifelong access to digital learning opportunities are very important.
So states should establish and support digital skills strategies, adapt teacher training programs, and also provide adult training programs. With regard to your question, Winston:
they have in mind that it's not only necessary to teach the children, the users, but also the responsible persons around them, so that they can support and protect them.
When we look at participation, it will be very short. It's mentioned that meaningful participation of all stakeholders is required, but it's not said that children should be part of that. In that regard, it's all the more important that children and young people also take part in the Internet Governance Forum at the multistakeholder level, to bring in their voices and perspectives so that they are included in this meaningful participation process of all stakeholders.
That's a short overview of the GDC, and I hope it was meaningful.
>> JUTTA CROLL: Yes, thank you so much, Torsten. We would also like to remind everybody: if you haven't had a look at the Global Digital Compact, please do so. It's open for endorsement by individuals as well as organisations, companies, and so on.
The more people endorse the Global Digital Compact, the more weight all these recommendations will carry.
So eventually, Hazel, thank you for your patience waiting in the Zoom room. We're happy to have you here to give us the perspective of children with a special focus on the Asia Pacific area. Over to you.
>> HAZEL BITAÑA: Thank you, Jutta.
I would say that to keep children safe and help them reap the benefits of virtual environments, Internet Governance should uphold the child rights principles found in GC25, as Sophie presented: the best interests of the child, taking into consideration children's evolving capacities, and nondiscrimination and inclusion for children in all their rights.
These are the underlying messages from children.
They came through, for example, when my organisation, Child Rights Coalition Asia, held our meeting in August in Thailand, where we had 35 child delegates representing their own national or child-led groups based in 16 countries in Asia. Although we focused our discussions mainly on emerging technologies like generative AI, in line with civil and political rights, the recommendations gathered from this platform can be applied to emerging technologies like the Metaverse.
To summarize their inputs: one of their recommendations is an approach that recognises children as rights holders and not just passive recipients of care and protection. Children want to be involved and empowered in discussions as part of policy-making processes. Having child-friendly versions of the terms and conditions and privacy policies that they agree to is another of their key recommendations.
Involving children in decision-making processes allows us to have a holistic perspective. We get to learn how children are leveraging these emerging technologies to create positive change, which is not usually highlighted in discussions. When we talked to children about generative AI, they said that it is beneficial for their advocacy work as child rights advocates and child human rights defenders, for their recreation and right to leisure and childhood activities, and for their education.
And I think these are important in the Metaverse as well. By getting children's perspectives you also get to see the impact of these emerging technologies on a number of children's rights. There is already evidence of sexual exploitation and abuse in the Metaverse, and at our regional children's meeting children were concerned about generative AI in connection with climate change. This could be a concern when we talk about the Metaverse as well.
Another concern is in relation to the right to privacy and informed consent, especially considering the unique privacy and data protection issues posed by generative AI and the Metaverse, as mentioned by a number of speakers today.
And echoing one of the points from Maryem earlier, a key approach to Internet Governance is ensuring nondiscrimination and inclusion in a number of aspects. One is access: with the virtual reality hardware and devices and the Internet speed and bandwidth required, the Metaverse is widening the digital divide. Another is freedom of expression, including gender expression; the Metaverse can provide a platform to enjoy this freedom, as Michael expounded earlier.
But at the same time, without safeguarding, these positive potentials could instead deepen discrimination.
Cultural diversity should also be taken into consideration to keep children safe in virtual environments.
In the Metaverse, harmful body language, gestures, and nonverbal communication are additional aspects that should be included in the list of violations children can report through reporting mechanisms in the Metaverse. This brings me to my next point, regarding the importance of child-friendly reporting mechanisms and effective remedies in the Metaverse. Consider the diversity of languages, especially in the Asian region, which always feels left behind because our languages are not the popular ones in the digital environment; what more now that body language and gestures are included in the context of the Metaverse? So it's important to have specialized regional or local guidelines.
>> JUTTA CROLL: Hazel, I need to cut your time short because we now have two interventions from the online participants and I want to give them a bit of time too. Maybe they have questions for you or for anyone else. Thank you so much.
Sophie, will you hand over to the online participants? Or will you read out their questions from the chat?
>> SOPHIE POHLE: I had one raised hand when we were talking about children's rights, in the children's rights block. It was from Angou. I'm not sure if they're still with us, because I cannot see them in the participants list anymore. So maybe the participant with a question is already gone. If not, feel free to raise your hand again. But I think they're gone.
And I have another question, from Marie-Eve Dador. I can read it out. She has a question for the Meta speaker: the OECD report highlights the impact of virtual reality on children's development. What prompted Meta to lower the age requirement for the Quest? And how does the company address the potential risks to children?
>> JUTTA CROLL: Over to you, please.
>> SOPHIE POHLE: Thanks for the question.
>> DEEPALI LIBERHAN: As I said before, we worked with parents and young people as part of our best interests of the child framework. A lot of parents themselves want their kids to start experiencing Meta products in a very protected way. So we've done two things. The first is that 13 to 17 year olds are allowed on our platform with certain default protections, as well as the ability to have parental supervision. But below that age, those accounts are actually managed by the parents. We did this similarly for Facebook as well: we have Messenger Kids, which is managed by parents, and it gives parents the opportunity to really effectively manage their child's presence online and deliver all the benefits of that in a very, very protected way.
So we've done this in consultation with parents as well as with experts.
>> JUTTA CROLL: Thank you, Deepali. We now have Ansu in the room, not online. So I can give you one minute.
>> AUDIENCE: So the question is, and anyone can answer this: have you considered design principles for a governance framework? I'm a researcher in this area. Thank you.
>> JUTTA CROLL: So you're asking for the design principles of regulation?
>> AUDIENCE: Yeah, design principles when developing a governance framework. Has anyone considered that? If so, what?
>> JUTTA CROLL: Okay. Will you be able to answer that? So I think she's talking about privacy by design, safety by design? Safety by design, child rights by design? Do you just mean in general?
>> EMMA DAY: There are safety by design and privacy by design regulations, but I think also the theme of the Internet Governance Forum is multistakeholder governance, right? One of the things I've been working on with UNICEF, actually, is related to this type of work. What we've been looking at, as part of this, is the use of regulatory sandboxes. There's an organisation called the Datasphere Initiative. They've been looking at trying to create a multistakeholder sandbox, bringing in regulators but also Civil Society, who are often the missing piece, to look at these frontier technologies.
We have a panel here today at 4:30 local time, where we're looking at governance in fintech and other technologies, if you want to hear more about that. Thank you.
>> JUTTA CROLL: Yes. Thank you so much. Thanks to all of you who have been in the room or have taken part online. I now have only three minutes to wrap up, but I will try to do my very best.
I do think, as we said at the beginning, that virtual worlds are already inhabited by children. We heard from Deepak that 50% of the users are under the age of 13, even though there are age restrictions, and that is mostly because we find their use in the gaming area.
But nonetheless, we also heard that social media platforms are already virtual worlds, because we are inhabitants of these social media environments and we are giving them our data, our profiles, and thus various pieces of information about our identity.
That led us to the question of data minimization, which stands in a certain tension with knowing the age of the users, whether by age estimation or by age verification, both of which go hand in hand with collecting data about the users. Meta representative Deepali also told us that they are trying to balance that and to minimise the data.
We have some regulations; in particular, the GDPR was mentioned, which is applicable only in Europe but has been copied in several regions of the world and gives us an orientation as to what the principle of data minimization would be and how it could be applied to the Metaverse. But then we also learned that larger amounts of data, and more sensitive data, are collected there. That is why we need another level of data governance for the Metaverse, especially considering that we already have a regulatory gap when it comes to virtual reality.
To conclude, I would like to return to children's rights. We have learned that there are protection rights, provision rights, and participation rights. When we're talking about virtual environments, we always focus a bit more on children's safety, but we have also seen that there is a huge opportunity to build on the evolving capacities of children, to provide them with education and peer education in virtual environments, and thereby to ensure their right to participation.
Finally, we heard that children want to be empowered and involved and that they understand generative AI as an instrument for children's advocacy.
So I think that is a very important and future-oriented message to receive at the end of our session, and it is the message I will conclude with: let's focus on the opportunities that the Metaverse will provide children with, without neglecting their right to be protected.
Thank you so much.