The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
SPEAKER: I will give you headphones. So we can hear each other.
PANELIST: Thank you.
SPEAKER: We have headphones?
GIACOMO MAZZONE: Okay. Can you hear me? Through your headphones? You need to put on your earphones. If not, you will not hear what happens in the room.
SPEAKER: I can hear from the computer.
GIACOMO MAZZONE: Okay. Can you put on your earphones? Thank you very much. We'll start in a few seconds.
SPEAKER: Test.
GIACOMO MAZZONE: I see Eric, Delphine, Benjamin, Viola, you are all connected. You can share your slides. I would say that we will start. Can you hear me? Yes. Okay. Thank you. Welcome to the session. This session is about the year of elections, the year 2024. It has been a year of elections; the last one was a few days ago in Romania. All over the world, more than two billion people voted this year. Of course, as is now possible in most elections, there has been interference coming from social media and the Internet. In this session we will try to discuss what happened in the various experiences. We cherry-picked some European experiences, the U.S. experience with Benjamin, who is with us, and the African experience with Philile, who is also with us. You cannot see him, but he's there somewhere. What is the assumption of the session? The assumption is that measures have been taken in certain parts of the world, in Europe for instance, but also elsewhere, to counter attempts to impact elections and their regular processes. The intention today is to see how the regulation has worked, whether it has not worked, and which problems were encountered, et cetera, et cetera. The European model is probably the most complex and most articulated, so I would like to ask Alberto Rabbachin from the European Commission to start and explain to us how the model works. If he has slides, we can bring up the slides. Before I give the floor to you, I would like to thank Paula, who is moderating with me, and Eric, who has the difficult task of summarising what will be discussed today in order to make a report from the session. Thank you. Alberto, can you take the floor? Can you start with your slides?
ALBERTO RABBACHIN: Yes, thank you, Giacomo. It is a pleasure to be here. Let me share a few slides that I have prepared for the session.
GIACOMO MAZZONE: We don't see the slides.
ALBERTO RABBACHIN: They should be coming.
GIACOMO MAZZONE: Okay. Something happened? Yes.
ALBERTO RABBACHIN: Now I'll make it full screen.
GIACOMO MAZZONE: Perfect.
ALBERTO RABBACHIN: Okay. Very good. As I said, thank you very much for inviting me to present the framework that the European Commission has in order to safeguard the electoral process. As Giacomo said, it is quite a complex mechanism composed of several elements. I'm going to give you an overview of the elements that are very much related to the work that I do; of course, there are also other activities carried out by other parts of the Commission. So it was a year of elections. In Europe in particular, we had the European elections and several parliamentary elections at national level. It was an intense year. As Giacomo said, and as you are probably all aware, the new digital ecosystem poses new challenges to democracy and in particular to the electoral process. That's why the Commission adopted quite a complex ecosystem of rules and initiatives. But first of all, it is important to acknowledge that this problem, the problem of disinformation and election integrity, is a problem that many European citizens see as a very serious threat: 38% of Europeans consider disinformation a serious threat to democracy. That's also why we are doing what we are doing. It is not just an institutional initiative; it is a need coming from the citizens. Which are the pillars of this framework? Well, there is, of course, a new regulation, which is the first global standard for preventing illegal and harmful activity online. This covers illegal activities; we'll see later how. We have the Code of Practice on Disinformation, which is an agreement taken with the relevant actors to limit the impact of disinformation. We then have a bottom-up initiative, which is the European Digital Media Observatory, financed by the European Commission, that puts together actors that fight disinformation on a daily basis: researchers, fact checkers, and other stakeholders. The new entry, an act that has been voted but is not yet fully implemented, is the regulation on political advertising, which regulates political advertisement only. So, the Digital Services Act. As I said, the Digital Services Act targets very large online platforms, those with more than 45 million users in Europe. It obliges very large online platforms and very large online search engines to put in place risk assessments and risk mitigation measures to address specific risks. How is the Digital Services Act related to elections? Of course, as Giacomo introduced at the beginning of the session, social media and online platforms in general also pose a risk to the course of the electoral process. Therefore the risk needs to be addressed, and mitigation measures need to be implemented by very large online platforms and very large online search engines. In preparation for the European elections, and also for national elections, the Commission ran roundtables that put together the national authorities, the major online platforms, and the key stakeholders, who all together discussed the risks related to the electoral process. Ahead of the European elections, the Commission also issued electoral guidelines on the mitigation of systemic risks for elections. This is basically what was done in 2024. These guidelines are very important. They include recommendations for very large platforms.
These recommendations basically come from the best practices that have been observed to mitigate risks to the integrity of the electoral process. What are these recommendations? For instance, to provide users with contextual information, like fact-checking labels if they are used by the systems; reducing the prominence of disinformation; providing information about the electoral process; and, very importantly, demonetising disinformation. This is the legal framework that obliges platforms to take mitigation measures. Then there is the Code of Practice on Disinformation, an agreement with the Internet giants and other key actors in order to limit disinformation. There is quite a wide set of commitments and measures included in the code that address specific issues related to disinformation, like demonetisation, increasing user empowerment, enabling fact checking, and also a lot of transparency. There is also a task force, composed of all the signatories, that discusses the implementation of the code on a weekly basis. Here you have the list of the signatories. Of course, here we have key platforms like Google and Microsoft and TikTok, and trade organisations that also represent the sector. X was a former signatory, and they decided to leave the code after the new ownership came into place. The rapid response system was activated. It connects the signatories of the code with the very large online platforms and very large search engines that are also signatories of the code. This system allows fact checkers and others to flag content that was considered relevant for the electoral process. On the other side, the platforms made the commitment to analyse the flags and decide, on the basis of their terms of service, whether an action needs to be taken. This system was activated for the European elections, for the French elections, and for the Romanian elections. A similar mechanism was also put in place. Going towards the end of my presentation, I just wanted to have a focus on the Romanian elections. We know they have been very difficult; there was a lot of foreign interference in this election. This was shown also by the number of flags the rapid response system has seen, with more than 1,000 flags exchanged, coming from civil society organisations and fact checkers to the major platforms. Yes, I'm concluding. Very quickly, I mentioned the European Digital Media Observatory. This is financed by the Commission; we have put more than 30 million euros into the initiative. It has 14 hubs and a system for monitoring disinformation across the EU and carrying out investigations. I'm sure that Giovanni will give you the details.
GIACOMO MAZZONE: Thank you very much. The questions will be at the end. Giovanni, you've been asked to complement the picture.
GIOVANNI: Yes. I'll show you the presentation on the screen. We'll have a couple of moments of silence.
GIACOMO MAZZONE: Something is happening.
GIOVANNI: Good afternoon. It is great for me to present our work in such an important venue as the 2024 Internet Governance Forum. My time is short, so I'll dive right in. First of all, I would like to present a couple of key pieces of information about the observatory, which is usually known through its acronym, EDMO. Alberto mentioned it briefly; since I have a couple of minutes more, I'll try to present it. The observatory was established in 2020 as a project co-funded by the European Union. It comprises universities, fact checkers, policy companies, and media literacy experts. Besides the coordinating core, there are 14 hubs connected to EDMO, each covering a country and sometimes a larger area of Europe. The concept behind the hubs is to cover the variety of languages and the specificity of media landscapes across the Union, since the challenges posed by disinformation are clearly different from Slovakia to Portugal, and so are the strategies to tackle them. EDMO is able to carry out many different activities, such as monitoring disinformation across the continent and coordinating and promoting media literacy activities such as the one that I'll present in a minute. It also supports the research landscape in the field, with special attention to promoting data access for researchers. On this we have the newsletters: thanks to this work, journalists and experts in the field are kept updated about disinformation narratives and the most important issues of the day, providing the connecting work which is usually made difficult at the European level, for example because of the great linguistic variety of Europe. We had items about how disinformation tried to exploit problems with voting in Spain on election day, as detected by Spanish fact checkers; another one about a candidate's campaign being amplified by coordinated behaviour by accounts, with a link to the report in the newsletter; and a third one about the rise of disinformation targeting the 2024 Paris Olympics, which was spotted by the French hub. At the same time, we published weekly reports and we promoted a Europe-wide media literacy effort with the hashtag #beelectionsmart. The Be Election Smart campaign was an initiative to support citizens in finding reliable information ahead of the election itself. It ran from the 20th to the 29th of April and covered all EU member states. Most of the activities were new ideas that had never been tried out at European scale. Some of these activities turned out to be successful pilot projects that we have since adapted to new scenarios. For example, the Romanian elections that were mentioned before: in November of 2024, they made headlines in Europe and beyond when a previously little-known candidate was able to finish first in the popular vote. Citing interference and tampering with the regularity of the campaign, on December 6, 2024, the country's constitutional court annulled the results so that the elections would be held again. They will have elections in June. In the context of the Romanian elections, we activated the rapid response system, a mechanism through which members of the community can proactively flag suspicious and troublesome cases to very large online platforms. It is up to the platforms to take action or not according to their terms of service. However, on the EDMO website we translated and made available to the community three analyses of the Romanian elections. Just yesterday we made available to two Romanian fact checkers a tool based on AI, which is currently in its testing phase.
It is quite good at spotting networks of social media accounts that carry on coordinated campaigns of dissemination of suspicious content. This is the result of cooperation between many actors: those who developed the tool, who are part of the EDMO community; one of the hubs; and EDMO acting as coordinator and facilitator of the exchanges. It is tiresome, intensive, and also very rewarding work. Let me conclude with some advice and give you some food for thought. First, disinformation is an issue that naturally crosses disciplines and fields; it is crucial to build a network. Second, it is also necessary to find means to connect that network together, such as the newsletters. Third, to help you communicate the results and attract new forces for your effort, you will need to produce shareable outputs. There are many; you will find all of them on the edmo.eu website.
Finally, I'll invite you to get in contact with EDMO to know about our experience, difficulties, few successes, and many challenges. We are very happy to share what we know and to learn about what we do not. Thank you for your attention.
GIACOMO MAZZONE: Thank you very much, Giovanni.
PAULA GORI: Thank you for sharing the work that we're doing at EDMO. I think it was an impressive presentation, and I liked how you concluded: we are trying to learn, and we are open to learning even more. That's why events like today's are actually very important, also to learn from other experiences. But now I would like to give the floor to Delphine, the spokesperson of the European Parliament. In June we had the elections; it was an important moment for the Parliament. The floor is yours, Delphine.
DELPHINE COLARD: Thank you. Thank you for the opportunity to join you remotely and to speak here today. Indeed, the Parliament has been active in this area since 2015. As you heard, it has been pushing for legislation, the legislation that my colleague outlined, to protect citizens from the detrimental and harmful effects of the Internet, promoting freedom of speech and ensuring consumers have access to trustworthy information. This was at the core of the priorities in the past legislature, and it will remain at the core of the legislature that has just started. If we look at the European elections that took place last June, the European elections are conducted hand in hand with the 27 EU member states. An important part was to counter and prevent disinformation from harming the electoral process, and the idea was to anticipate potential disruption that could be expected in connection with the European elections. We cooperated to this end with the other institutions, from the colleagues of the Commission that you just heard to the European member states in a rapid alert system, and with the fact-checking community, to get the complete picture. I have to say the in-depth analysis that was provided under the European Digital Services Act was really worthwhile. With the parliamentary and national elections, we knew there could be attempts to sow distrust in the electoral processes, alleging that the election is rigged or spreading false voting instructions. What we wanted was to make sure that European citizens were exposed to factual and trustworthy information, to empower them to recognise disinformation, and to give them some tools to tackle it. We were inspired by many good practices in the different member states. Following a similar example from Lithuania, we set up a website on the elections explaining the technicalities of the elections. I hope you see the slide. I don't.
GIACOMO MAZZONE: Yes. We can see.
DELPHINE COLARD: Perfect. The website presented the measures put in place to ensure free and fair elections. It was there to inform about the different aspects of election integrity, from information manipulation to data security, and also to equip voters with tools to deal with disinformation. This is one example. But, of course, we also developed a series of pre-bunking videos explaining how to avoid common disinformation techniques, for example taking advantage of strong emotions, polarising attempts, or flooding the information space with many versions of the same event. Thanks to the services of our partners in foreign policy, the videos were also available in non-EU languages, including Ukrainian, Chinese, Arabic, and Russian. We have a short version that we can show for 40 seconds now.
GIACOMO MAZZONE: Thank you.
DELPHINE COLARD: And it is coming.
(Video playing).
>> Disinformation can be a threat to democracy. People who want to exploit us use anger, fear, or excitement. When we feel strong emotions, we're more likely to hit the share button. By doing this, we help spread disinformation to our friends and families. What can you do? Watch out for very emotional content, such as sensational headlines, strong language, and dramatic content. Question what you see and read before you share. Things you would question if somebody told you them face to face you should also question online. Take a step back and resist the reflex to react without thinking.
DELPHINE COLARD: This was an example. This is what was spread throughout the period before the elections, and it was shared via social media, including TikTok. We offered briefings for youth, educators, journalists, and content creators. The idea there was to engage different audiences, providing tips and tricks on how to avoid disinformation. We also reached out to the Members of the European Parliament, with a specific guide. One of the things we're proud of concerns European voters: what we call the European Parliament ambassador programme, which is for students. Because, you know, raising awareness of the threats is a key and long-lasting solution that requires the involvement of society as a whole and starts with education. These were two or three examples. One element that I also want to highlight is the importance we placed on having strong relationships with private entities and civil society organisations to convey the importance of voting, so that this was spread as widely as possible. That included tech companies and other companies, as the Code of Practice mentioned; it was instrumental to have them on board. I also want to mention that strong, independent media are instrumental in this fight. Several pieces of legislation were passed at the European level, such as the European Media Freedom Act and a directive to protect journalists against abusive lawsuits. We tried to support media in their work, from briefings to invitations or grants. We saw a lot of things during the last elections. Maybe not the tsunami that was potentially feared, but there was an increase in information manipulation attempts targeting the European elections. We have not detected any that seemed capable of seriously distorting the contest. This is what we shared with the other institutions, also in the context of the Digital Services Act. We have to remain vigilant and continue to monitor. The effect of disinformation is not only around the big moment of the election; it is a slow dripping. We look at what recently happened with the Romanian elections; this is something we are scrutinising. Members are eager to have more information and have passed several resolutions calling to step up efforts in the area. There is a new dimension for the Parliament: there will be a special committee that will focus on the European Democracy Shield, really to assess existing and planned legislation related to the protection of democracy. They are also asking to deal with addictive design in social media platforms. There's a lot of activity. Next week in the plenary, there will be two debates specifically on disinformation, and especially disinformation during the electoral period. Maybe to conclude with some learnings. First, information manipulation and information manipulators: they really see elections as an opportunity to advance their own goals by exploiting existing political issues and undermining credibility in the democratic system and its institutions. Second, good intentions and voluntary reactions are not enough; legislation and regulation play an instrumental part, and the Parliament has been and will remain a key actor as co-legislator to shape laws that are fit for the digital age. Third, in the same way as Giovanni already underlined, it is really important to continue to implement a whole-of-society approach, learning from each other's practices and programmes, and to double the efforts to help society. This is what we have done in the European Parliament. Thank you.
PAULA GORI: Thank you, Delphine. As you said, maybe it wasn't the tsunami, but, as you rightly underlined and as EDMO has said, disinformation works rather drop after drop. We cannot just focus ahead of the elections; it is a longer process. Right after these two presentations there are so many keywords: media, journalists, fact checking, media literacy, emotions, addictive design, digital platforms, and so on. You understand why the whole-of-society approach and the multidisciplinary approach are needed. For example, we know emotions play a certain role thanks to specific research in that field, and what we know about fact checking comes from another field of research. That's why institutions like EDMO, in collaboration with other organisations like the Parliament, the Commission, the platforms, and other organisations, are so important. Now, after these many words about the EU, we thought it would be interesting to focus on other parts of the world. As we were saying, this was a year of elections. I'm happy to give it over to Benjamin Schultz, who will focus on the U.S.
The floor is yours.
BENJAMIN SCHULTZ: Awesome. Thank you, Paula. I'm going to share my screen really quick, if it wants to go. Okay. It looks like that was a success. Can everyone see? Okay. Lovely. Thank you all very much. It is wonderful to be here, speaking with such a geographically, and in so many other ways, diverse crowd from all over the world and from different backgrounds, all here to talk about disinformation and how we make the Internet a better, safer place. Thank you for having me.
Obviously, in the U.S. we just had an election. Putting it bluntly, I think it is a result that surprised a lot of people. Really, this year's election in the States was a weird one. In many ways we regressed. I'm going to explain this further on in my seven or eight minutes; this is just a taste of what's to come. We're in a different place in the States than four years ago, and even eight years ago, in terms of taking action on disinformation and in terms of platforms playing an active role in content moderation and trust and safety. It is quite different from how things have gone in Europe. It is an interesting phenomenon taking place. With that, I'll jump into the slides here. This gif is one of my favorites. Despite being funny, I think it really accurately describes the current U.S. approach to disinformation of all kinds, online and elsewhere. Whack-a-troll is what we at the American Sunlight Project call it. As you can see, the cat is trying to tap the fingers as they pop up. As the cat taps them, they go away, and the cat continues to do this for as long as I have the slide up here. This sums up where we are at. In the last year, really two years, we have seen platforms pretty much just give up entirely. This is not specific to any particular platform, although I will say that some certainly are doing more giving up than others. Not wanting to impugn anyone's integrity, I won't name the platform; we can all take a guess. This is really problematic. We've seen a massive proliferation of hate speech and of false information of all kinds, from false polling place locations to attacks against elected officials and doxxing and things like this. This has become commonplace in the last year or two. The political system is about as toxic as it has been, certainly in my lifetime, and it has certainly become worse in the last couple of years. This has been buoyed by platforms going along with it, which I think is very different from how things have played out in Europe. In the States, we do not have regulation such as the Digital Services Act or anything of the sort. As you know just from the headlines, there has been a decimation of trust and safety teams at the platforms. We've also seen the rise of a narrative of censorship from the far right, and also from limited sections of the far left. They have started to claim that any content moderation or any action against false and malicious claims is a kind of censorship. In the States, we have the First Amendment, which protects the right to freedom of speech, expression, and assembly. This is a very American right, something I think every single American supports: the right to freely express yourself.
Where we're seeing some conflict politically in the States, between platforms, the government, and the incoming administration, is where that boundary lies, right? For instance, in the States right now, my organisation, the American Sunlight Project, just released a new report last week about how deepfakes have affected members of Congress. One in six women in Congress in the States is currently subject, right now on the Internet, to sexually explicit deepfakes. What we've seen is that the Senate has already passed bills that would regulate these deepfakes and make it a criminal offence to spread them when they are used to denigrate someone and portray them in pornographic material. "This is a violation of free speech" is how we've seen the objection framed. I'm not arguing one way or another; this is painting a picture of where we're at in the States. And sorry, I should have skipped ahead earlier on the slide here. We've seen this kind of democratization, in the most extreme way, of artificial intelligence and deepfake technology, and of other types of content too. We've found plenty of evidence of foreign bot networks that have played a significant role on various social media platforms in the election cycle. The extent to which they changed voting behavior is impossible to measure, but we have plenty of evidence, not just us but every organisation working in the field and in the States, that this type of content has been pervasive and has been getting into people's feeds, whether it is on TikTok or X or Instagram or Facebook or whatever. We've seen that content that is emotional or gets people riled up makes them want to click more and scroll more, and algorithms favor this kind of content. We've seen plenty of malicious, fake, and actually also illegal content making its way into people's feeds. So again, to what extent did that change voting behavior? Not sure. Certainly people have been exposed to false and malicious content in the election cycle. Going back to the deepfake issue: not just in terms of sexually explicit deepfake material, we've also seen numerous instances of election officials and even Joe Biden himself being spoofed and imitated. Without taking a political position, just describing where the United States is at: it is completely legal to make this material and to create and distribute it. The headline, the third one on the right here, New Hampshire officials investigate, et cetera: there was a criminal investigation here, because it is illegal to interfere this blatantly in an election. There are plenty of other instances of this type of behavior that haven't been prosecuted. We have zero regulation on deepfakes and pretty much no election integrity measures of any kind, which again is pretty much the direct opposite of how Europe has approached the issue. Certainly in the States our institutions are structured differently; it is much harder for us to implement these kinds of measures, like the DSA and GDPR, et cetera. Nonetheless, this really just highlights the issue.
GIACOMO MAZZONE: Thank you. Can you go to the close please?
BENJAMIN SCHULTZ: Yes, we'll close it up. One thing, and I'll wrap it up very quickly: Kamala Harris ran for President in 2020. Again, not getting political, she has been the most attacked person in terms of gender and identity in the last four or five years in the States. One study my boss authored found that, of all gender- and identity-based attacks, Kamala Harris received 80% of the attacks. This goes back to the toxicity of the American political system; it highlights that. It makes it difficult to agree on a set of regulations and rules for technologies and platforms, et cetera, et cetera. With that, I'll wrap it up. For the immediate term, the outlook in the States is not so good. Hopefully we get united and improve the feeds and the political system. Thank you very much.
PAULA GORI: Thank you, Benjamin. Since we're running a little late, I'll give the floor immediately to Philile. We are moving again geographically. The floor is yours.
PHILILE NTOMBELA: Good day, everybody. I want to share my screen as well. Then I'll get started. If it does share. Can you see my screen at all?
GIACOMO MAZZONE: Not yet.
PHILILE NTOMBELA: Wait here. Can you see it now?
GIACOMO MAZZONE: Yes.
PHILILE NTOMBELA: Fantastic. Thank you for having me. I'm Philile Ntombela. We have fact-checking offices in Nigeria and Senegal. It is held in mind and often a political one; the people shared no information whatsoever. There are patterns across the continent that we've seen through the election year. Targeting journalists, the judiciary, and all of the other bodies was a powerful tactic. Journalists were accused of bias. We had an elections coalition, which included newsrooms in Africa, who would either try to fact check themselves, having been trained beforehand as part of the organisation's training systems, or we helped them to fact check. Often they were accused of bias. We had rumors. One party took it to the constitutional court, which is the highest in the country, stating they were being marginalised and basically receiving unfair treatment. The constitutional court ruled against that; it wasn't true. It was more of a publicity stunt to make people wonder whether the constitutional court and the electoral commission are independent. This is the last part, which again leads back to the court case with the constitutional court. People are more connected, but voters are still running across misinformation, and a lot of people don't have connectivity. Language is a barrier: there are more than 2,000 languages across the different states, and those are only the official languages. Sorry, moving back here quickly. We found a report by the International Telecommunication Union which showed that, even in 2022, Africa suffered from the digital divide: we had the lowest level of people with Internet connectivity. That means that whatever information those people receive, they don't have the opportunity to look it up or double check it or send it to a fact checker. Finally, platform accountability. We found that, and I don't know if I'm supposed to mention names, on several platforms people were able to share disinformation, and the information goes unchecked. One of the platforms obviously has a notes feature, but it doesn't curb the spread of the same content or the same posts that show disinformation. We are part of the fact-checking programme, and on that side we found that we were able to help them spread. In Africa, it is very difficult for people to access information in other ways. If I find information that is incorrect but I believe it, I share it. Somebody who has no idea whether it is true but finds it important will share it too, and if it seems true, you'll find the information spreads even faster. That informs the algorithms: they need to be able to pick up common phrases for disinformation, particularly in election years, but also in general and more broadly. The biggest allegation, the biggest piece of misinformation this year, was the voter fraud allegations. It started with a politician known to be a charismatic character, who has had a lot of problems legally and with the electoral commission, starting with the claim that the election would be rigged before the election season even started. The media ended up not fact checking it or framing it as "this person said"; it was more just shared on the basis of what he said. This was trending on social media. They took the statement made by the politician and made it the headline. That embeds the idea rather than creating the sense that this is just what somebody said. On the right, we named it the big lie. We found that between 25 May and June, this cloud was basically taking over social media, with the biggest players and the biggest causes being the ones in purple and later the ones in turquoise.
GIACOMO MAZZONE: Sorry, we are running long. We have one last speaker, and the next session is starting. Thank you.
PHILILE NTOMBELA: Sure. I'll speak about regulations on misinformation. They can be quite damaging: governments turn to Internet restrictions, and the laws target misinformation instead of disinformation. We had the Accra declaration, which was created this year at the Africa Facts Summit, basically about reaching offline audiences and expanding access to reliable information, protecting fact checkers from facing harassment, and collaborating with partners to innovate. This for us was more a collaborative space rather than the legal space, and that is all I have to say. Thank you so much.
GIACOMO MAZZONE: Thank you, Philile. Sorry for that. The last speaker now is Claes.
CLAES DE VREESE: Yeah. I'm going to do something radical, because the next session is starting. I work at the University of Amsterdam. I was going to talk about specific risks around Generative AI and elections in 2024. Let me just give you my take-home message rather than the whole presentation at this point in the panel. I think we're somewhere between relief and high alert. If you look across the elections of 2024, it is hard to identify an election where that risk materialised. On the one hand, that is a big relief. That's very different from the expectations going into 2024; there were true and genuine fears about elections being overturned through Generative AI. That didn't happen. At the same time, all of the evidence, and that's the evidence that would be in the slides that I will now skip, shows that there has also not been a single election in 2024 worldwide where Generative AI and AI-generated material has not played a role. That's really the take-home message of this discussion on AI. There is a certain sense of relief that 2024 did not become the absolute catastrophe year, despite the absence of regulation in the area and despite the technology being available and deployed, though maybe not with the effects that were expected in 2024. Does that mean that the AI discussion is over as we move into the big election year of 2025? Absolutely not. It is important to look at the impact of Generative AI, and we will continue to do so in 2025 as we see the elections that take place in that year. It is important not only to look at the persuasion of voters, but to see what kind of role AI is playing in the whole ecosystem of elections, whether that's in the donation phase or in the mobilization phase, whether it is spreading disinformation about your political opponent, and whether or not this is igniting and fueling already existing conflict lines. Let that be the take-home message for 2025. 2024 did not become the AI catastrophe that was predicted, but I believe that as we move into 2025 there is every reason to continue the work to see how these technologies are being deployed across elections, and this is something that we should do collaboratively, also with centers, researchers, and civil society from outside the European Union, to really get a better grasp on the impact of AI on elections.
GIACOMO MAZZONE: Thank you very much, Claes, and thank you for sacrificing your time. Just one question to the two non-European speakers. Do you think, Philile and Ben, that if you had a framework like the one in Europe, it would make your life easier or not?
PHILILE NTOMBELA: I'll go first. For us, no, it wouldn't, for the reasons in the presentation. There's a huge history of censorship and suppression. Once you create the rule, one person is able to use the law against people that are trying to spread proper information. It can go wrong in so many ways. This is why we came up with the declaration at the Africa Facts Summit. Fact checkers signed up to say they would rather collaborate to try to fight disinformation and misinformation through media literacy, outreach, and international cooperation, still within that collaborative perspective rather than through laws, because of how laws can be manipulated, as we've seen with other laws in the country already.
GIACOMO MAZZONE: Thank you. Ben. 30 seconds.
BENJAMIN SCHULTZ: Generally, yes, I think it would help us in the States. Implementing something totally identical to the DSA would be difficult to deal with legally; I think it would face a lot of chopping down. But particularly around the protection of researchers and data access for researchers, this would be helpful and would enable civil society to do what we were doing in 2020, which is analysing content from platforms and reporting on it, which we can't do today.
GIACOMO MAZZONE: Thank you very much. Thank you to the speakers, and thank you to the people in the room for their patience. Apologies to the speakers in the next session; we ran over. We are not taking questions, but we will be outside the room if you need help, and you can raise any point there. Thank you.