IGF 2024-Day 1-Workshop Room 4- WS133 Better products and policies through stakeholder engagement-- RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> JIM PRENDERGAST: Hello to our participants online. We're waiting a few more minutes after the lunch break so we're going to let folks trickle back into the venue and join here in the room. Hang with us and we'll get going in a few seconds.

(Pause).

   >> JIM PRENDERGAST: Okay. For those in the room, you'll have to put your headsets on; there is no piped audio. Sorry to those online if I'm shouting. I will moderate my voice now that everyone has put their headsets on.

   Thank you and welcome to our session. My name is Jim Prendergast, and I'll be serving as your moderator today, both online and offline, so I will be handling questions here in the room as well as questions online. In case you haven't figured it out, you're in the session titled Better Products and Policies Through Stakeholder Engagement. We've got a great group of panelists for you this afternoon, and it's great to be back at the Internet Governance Forum, an annual event that really does embody stakeholder engagement.

   If you're not on channel 4, that's the channel you need to be on in the room. Sorry about that.

   The digital age presents us with incredible opportunities, but also significant challenges. As technology evolves, it becomes increasingly clear that technologies can't be developed in isolation. They require collaboration and input from a diverse group of stakeholders, from Civil Society, academia, policy sectors, and communities directly affected by these innovations.

   Stakeholder engagement is more than gathering input; it's about fostering dialogue, building trust, and creating solutions that are robust, socially inclusive, and aligned with human rights principles.

   When done well, it can lead to innovation that is sustainable and impactful. Today we're joined by a great panel of experts who have been at the forefront of these efforts and are going to share their insights.

   First we have Richard Wingfield, who is the director of technology and human rights at BSR. He's going to discuss how stakeholder involvement can align products with societal values.

   I'm going to hit pause because can we get the presentation on Zoom? There we go. Thank you.

   We're having a little bit of trouble with the slides. Just go to the next slide. We thought we had all the technical issues worked out yesterday, but clearly we did not.

   I'm going to continue to introduce our panelists and we'll get to the next slide. After Richard, we'll hear from Thobekile Matimbe, senior manager for partnerships and engagements at the Paradigm Initiative. She will share insights into the role of external expertise in policy development.

   And after that we will have Charles Bradley, who is manager of trust strategy for knowledge and information products at Google. He will share lessons learned from consumer trust engagement. And finally Fiona Alexander, who is in residence at American University and here in the room with me. She brings the governmental perspective from her time with the U.S. Department of Commerce.

   So before we dive in, I want to encourage everyone to actively participate, whether you're here in the room or online. We think that, you know, your perspectives, your thoughts, and your questions will make this a much better session, a much more robust session.

   So let's get started by hearing from our panelists. First I'm going to turn to Richard, who will help us set the stage for this session by discussing the framework that he and BSR have developed.

   Richard.

   >> RICHARD WINGFIELD: Great. Thank you very much, Jim. Really good to be part of this session. I'm sorry that I can't be with you all in person there.

   So my name is Richard Wingfield, I'm the director of technology and human rights at BSR. And BSR is a global sustainability organization. We work with hundreds of the world's largest companies on a spectrum of sustainability topics, everything from addressing climate change, environmental impacts, to human rights. And I sit on our human rights team and lead our work with technology companies on how to act responsibly as a company and to build products and services that align with international human rights standards.

   So really pleased to be part of this conversation, because stakeholder engagement is a critical part of the way we work with companies at BSR.

   And the approach that we take is very much in line with a number of existing frameworks and standards that exist in relation to stakeholder engagement. And whether for companies in the technology sector or any other sector who are looking to be responsible businesses, to build trust and confidence, to align with what being a responsible business means, one of the most critical frameworks that's used and where we draw our information from is the UN guiding principles on business and human rights.

   So the idea that states have human rights responsibilities is one that is well established in international law, in various international treaties, but in the last few decades there was increasing concern from a number of external stakeholders that businesses also should be taking a role in making sure that the human rights of people affected by their businesses were respected as well.

   And that resulted about 15 years ago in the endorsement by the United Nations Human Rights Council of this framework called the UN guiding principles on business and human rights.

   So this is the framework that sets out what business and human rights looks like. It sets out various obligations imposed on states in terms of the regulation of businesses. It imposes responsibilities on businesses to respect human rights. And it also sets out expectations as to how individuals who have been adversely affected can seek a remedy for any harm that has been suffered.

   And the most critical part of the UN guiding principles as a framework when it comes to businesses is that pillar that is specifically about how businesses should respect human rights. Now, the reason why I'm sort of mentioning this framework in a conversation around stakeholder engagement is because the UN guiding principles on business and human rights make regular and explicit recognition of the importance of stakeholder engagement and meaningful stakeholder engagement when it comes to companies behaving responsibly.

   And this exists in a number of different aspects. One of the things we do a lot of at BSR is work with companies to think through the ways that they have human rights impacts at all. And those can be risks to human rights. For example, the way the use or misuse of a particular technology might cause harm to somebody. Perhaps restrictions on their freedom of expression or impacts upon their rights to privacy.

   We also look at the way that companies can advance human rights, the way that different technologies can be developed and used in ways which advance the societal goals. For example, supporting freedom of expression or enabling education or improving health care.

   So looking at the way that the actual technology products and services can both improve but also create risks to human rights, but we also look at the way that companies' own policies are also relevant here.

   And obviously company policies are instrumental in the way that technologies are designed, the way they're used. These can be everything from a company's AI principles which might govern the way it develops and uses AI. If we're looking at social media or online platforms, the rules that they impose as to how people can and cannot use the platform in different ways.

   So there are a range of different ways that companies can have impacts upon human rights. And what the guiding principles say is that in trying to understand those impacts, you need to speak to the people who are actually ultimately affected. So when we're working with a company at BSR, we put a huge amount of effort, alongside the company, into talking to and working with stakeholders to understand the risks that might be connected to a particular company, that is, what's happening in practice in different parts of the world and with different communities.

   The opportunities that can be provided, so the way that different technologies are being used in ways that can create benefit for communities.

   But also the company's own policies. So the rules the company sets relating to how they develop and use technology, or the way that users can and cannot use their technologies, and how these might also themselves be having impacts upon human rights.

   So what does this look like in practice?

   So one of the complicating aspects of technology as a sector is that so many people are affected. And often the impacts of technology are global in nature.

   So if you're looking at a large online platform that might be used by people across the world, potentially hundreds of millions or even billions of people across the world, that's a huge number of people who might potentially be affected by the way that that service operates or by the rules that that company imposes.

   We really try to prioritize our stakeholder engagement with the communities most at risk. We know, for example, that there are particular groups that are particularly vulnerable to human rights harms. We know that people with disabilities have been historically marginalized and may be especially vulnerable to hate speech, or to discrimination by AI systems because of a lack of data connected to that group when those systems were created.

   We try to prioritize our stakeholder engagement working with those communities and groups that are going to be particularly affected by the risk or particularly vulnerable to that risk.

   So that might mean working with women's rights organizations. It might mean working with organizations that support persons with disabilities. It might mean working with groups representing those who are vulnerable to discrimination within different societies.

   But we also know that the ways that technologies are used and the ways that the rules that companies impose can be felt very differently in different parts of the world. There are different cultural contexts. There are different language issues depending on the company, the primary language that it uses.

   There are different levels of digital literacy in different parts of the world, so familiarity with technology and the way that it can be used or misused.

   So we also try to make sure that we take a global approach to our engagement with stakeholders, and that we talk with groups that can either speak to the experience of people in different parts of the world, or in some cases we talk directly to groups in certain parts of the world where their experiences may be different from elsewhere.

   And so that's the kind of approach we take to stakeholder engagement: particularly when you have hundreds of millions or billions of people who are using a platform or affected by a technology, prioritizing our engagement with those groups who are going to be most vulnerable or most at risk, but also making sure that we are geographically and culturally diverse so that we hear the full range of experiences and can provide recommendations that are appropriately nuanced.

   What the UN guiding principles don't give a lot of detail on, however, is the actual mechanics of stakeholder engagement. Yes, they talk about the importance of talking to a diverse range of stakeholders and meaningfully using what they tell you in the way that the company develops its technology or creates or modifies its rules. But they don't really tell you how to do it in practice.

   We use a range of different methods depending on the company we're working with and the issue in question. So it can be something like organizing one on one interviews. We might simply organize a number of one on one interviews with different organizations around the world. We ask some questions, talk to them about their concerns. We might get into some specificity about a particular rule or product, depending on the work that we're doing.

   We might also organize workshops where we bring together a broader range of people and sometimes that can be helpful because then you have a diversity of opinions within a room and people are able to counter each other or raise different perspectives or push back. So you get much more of a dialogue. Sometimes we'll use workshops or those broader interviews as a way of seeking engagement as well.

   We also know that stakeholder fatigue is an issue, with a lot of stakeholders constantly being asked to participate in interviews and meetings, so we also try to use existing spaces where people are already talking about the issues that concern them. The IGF is a great example of that, as are RightsCon and TrustCon; there are many existing spaces where NGOs and other stakeholders come and talk about the issues that are most important to them, including the impacts of different technologies and different technology companies.

   And so quite often we will come to these events, run sessions ourselves, participate in other sessions, and use that as an opportunity of hearing directly from people on the ground.

   So we use a variety of different tactics and techniques to ensure that we are not only talking to a broad range of people, but also not adding additional burden and time to them, but using existing spaces wherever possible.

   What the UN guiding principles don't give a lot of guidance on is how you incorporate that feedback back into company decision making. I'm sure some of our company colleagues on this call will speak to this: decision making at a company is not a straightforward exercise. The design and creation of new technology products, the way they're launched, the way policies are modified, these are complex processes; you can't always take the feedback from one workshop and quickly make changes as a result of it.

   Stakeholder engagement and the feedback received will be one of a number of different sources of input into the ultimate decision making of a company. So what we at BSR try to do is make the feedback that we get from stakeholders as practical as possible. We try to make sure it's very clear exactly what stakeholders would like to see from the company and how that can be measured and assessed over time.

   One of the things that we also try to do is build long lasting relationships between companies and stakeholders. Simply bringing in one organization for one interview at a point in time and then never speaking to them again does not encourage a longstanding and trusting relationship. So we often try to make sure that companies are providing updates to stakeholders on what's happened, involving people in later decision making, and trying to create relationships rather than something which is merely transactional.

   So the UN guiding principles as a framework are a really helpful starting point in setting out that companies should engage with stakeholders, that engagement should be used to understand the company's risk profile but also where there might be opportunities, and that there should be a diversity of opinions in the range of stakeholders that you talk to. We then try to add a bit more practicality to that framework in terms of what the engagements are like in practice, making sure that they're meaningful and impactful rather than transactional.

   So that's the approach that we take. Jim, maybe I'll pass back to you for our next speaker at this point.

   >> JIM PRENDERGAST: Yeah, great. Thanks, Richard.

   So yeah, turning to our next speaker, Thobekile Matimbe. I know you're heavily involved with DRIF, the Digital Rights and Inclusion Forum. Could you share with us your concept of stakeholder engagement and anything that might be relevant to our discussion today?

   >> THOBEKILE MATIMBE: Thank you so much. Hi, everybody, I'm Thobekile Matimbe, and I work for Paradigm Initiative, an organization which promotes digital rights and digital inclusion across the Global South, where I serve as senior manager for partnerships and engagements. This is a very important conversation around stakeholder engagement, and I'm happy that Richard was able to unpack the UN guiding principles on business and human rights and what they say with regards to corporate responsibilities, which is critical.

   And one of the things that points to corporate responsibility is obviously stakeholder engagement, which is critical to ensuring that we have better products out there that take human rights into consideration.

   As I reflect on this topic of stakeholder engagement and better products, it's important for the private sector, within its quest for due diligence, to think about what stakeholder engagement looks like. And from where I'm sitting, what matters is meaningful stakeholder engagement, not stakeholder engagement that's just ticking the boxes, but engagement that becomes more and more meaningful.

   Thinking about it, it's so important for companies to think about how they can meet the community where they are, as opposed to scheduled meetings that happen once, as a kind of transaction; Richard mentioned transactional engagements. More proactive is meeting the communities where they are. That's what we do with the Digital Rights and Inclusion Forum, DRIF, which we host annually; we're looking forward to hosting the 12th edition in Zambia next year, from 21 April to 1 May. What happens at DRIF is we have multiple stakeholders coming into the room to discuss trends and developments in the digital rights space. We have governments come in, media, technology companies as well. But we've not seen as many companies coming on board to engage with the community.

   This year alone, we held the forum in Ghana and had almost 600 participants from around 40 countries, not just from Africa but from across the Global South and beyond.

   It's a rich platform where any product designer would want to be, to engage with the community directly and discuss products.

   But thinking about it, the key players with regards to better products are those who use the products. That's why I'm saying it's important for companies to think proactively about engaging with those who use the products, to hear what users think as they design products and how they can improve them. And what better place than platforms where different stakeholders can input into the design of the technology.

   I'd also highlight one critical thing in the design process, which is the do no harm principle in the context of human rights, and the question of who bears the brunt of bad products unleashed on the market. It's the users of those technologies, marginalized groups, minority groups. And their voices can only be heard in spaces where human rights are discussed, including discussions of persons with disabilities and what their challenges are.

   And this is why I think it's important for more and more companies to find themselves on platforms where these conversations are happening, and DRIF is one such platform.

   One key thing when we're looking at policies themselves: take community standards, for instance. If we're looking at social media platforms, you'll find that they come up with community standards. It's important to circle back to the community and say, this is what we have, is it still applicable? Technology does not wait, it's always evolving, so it's important to do that proactively.

   I'll give an example. Just recently we had a very interesting engagement with one telecommunications company that reached out to us after seeing one of our reports on surveillance in Africa. They were so keen; they laid their hands on this research and literally reached out and requested a meeting with us. That was a proactive action, as opposed to a reactionary stakeholder engagement process.

   Perhaps we could have reached out to them and said, look, there's this challenge, we've seen this happen in this country based on your product. But instead they were proactive and said, let's have a conversation. It was one of our best engagements with the private sector around community standards.

   I think as policies are being developed by different private sector actors, it's important for them to figure out ways to reach the communities that use their products and get feedback on what they're turning out, so that they put out something that's good, robust, and rights respecting, while also mitigating human rights impacts.

   It's also important to reflect on the outcomes that have come from the Digital Rights and Inclusion Forum. Every year we come up with community recommendations, and we gather these from the people who attend the forum from underserved communities across the Global South, and they give input.

   I think for the private sector there has been that need for engagement around policies and how they're developed, to better strengthen security and safety. When we're talking about trust and safety, it's something that's critical in the context of products, how they can be better and better serve the users themselves.

   I think one thing I would highlight that's also come up is the importance of having policies that ensure that vulnerable groups are not left behind. You have human rights defenders or media who feel that sometimes, when policies are being developed, they're not really addressing some of the lived realities that they face.

   So reflecting more on the do no harm principle is something that I want to echo; it's really important, and it should be embedded at every point of the product design process. It's critical that we continue to have this conversation and also hear from colleagues within the private sector with regards to their views on proactive stakeholder engagement, as opposed to stakeholder engagement that is reactive or just a ticking of the boxes, and what they are doing to ensure that they are able to meet the community where the community is.

   And something I'll highlight as I conclude my reflections: due diligence is something that demonstrates corporate responsibility. It's important primarily as a corporate practice so that better products in themselves respect human rights, and I echo the importance of mitigating human rights impacts.

   I think I'll leave it at that for now, and I'll post in the chat the link to more information about the Digital Rights and Inclusion Forum. We currently have a call out for session proposals, so that's a good opportunity for those in the private sector who would want to engage around their products, or discuss more about them with the community, to consider being at the Digital Rights and Inclusion Forum in Zambia next year, so that we continue to have meaningful, proactive stakeholder engagement.

   >> JIM PRENDERGAST: Thank you very much to both of you. Now I'm going to turn the perspective a little bit away from product development to policy development. Fiona, who is sitting across the table from me here in the room, you spent a long time at the Department of Commerce. I've known you for a long time, and you were heavily engaged in stakeholder engagement; the U.S. Government has been a leader in that aspect.

   So can you share with us sort of what you found worked with stakeholder engagement when it comes to developing government policies and maybe what didn't?

   >> FIONA ALEXANDER: Sure, happy to. Let me turn one ear off    maybe I'll take both off. It's hard to hear yourself when you're talking. Thanks, Jim, for inviting me and to everyone remotely, you're missing a beautiful venue. Sorry you're not here to join us for today's conversation.

   I was in the U.S. Department of Commerce for about 20 years. In terms of today's conversation about better policy through stakeholder engagement, from my perspective it's important to know that, at least in the U.S. Government system, there are a couple of different ways and processes that are used.

   So for regulation under our regulatory regimes, our legislature will pass a law, but our independent regulators or other parts of the government will do a lot of stakeholder engagement to produce the specifics of how a law is implemented through regulation. And we actually have a pretty prescribed process for that through the Administrative Procedure Act, where they'll put a notice out, they'll have to do draft rules, and there's something like 45 or 90 days of stakeholder feedback, that kind of stuff. That's on the regulatory side in the United States.

   And that's across sectors, it's not just technology sectors, but all of our sector regulatory processes work that way.

   Where it's different is with respect to broader policy setting. The agency I worked at in the Department of Commerce, NTIA, is a big proponent of the multistakeholder model and has been for a long time. But we talked about government as a convener. There's the idea of similarly seeking public input or stakeholder engagement on what should be the priorities or policies of your office or administration. Sometimes you do a public meeting. Sometimes you do a notice of inquiry and ask for written feedback. And the reality is it's government policy or priority setting, and it affects what the team does or how it happens across the world or in bilateral engagement.

   But then there's government as a convener in terms of trying to set policy or participate in policy. I had the great experience of being involved in and responsible for the U.S. Government's relationship with ICANN. I was very much involved in the IANA stewardship transition, which was a multi stakeholder decision making process, not just a multi stakeholder discussion process.

   But something that's not as well known is that we tried to deploy this domestically, and it was more challenging than it was globally. The example I give is that we were trying to implement some sort of baseline privacy rights without Congressional legislation. In the absence of that, we tried to convene stakeholders and said, okay, what should we be talking about, what do you all want to talk about, and what policies do you all want to set?

   I'll say the very first meeting of that was very strange for a lot of people, because they were much more used to what I described at the outset, the Administrative Procedure Act, where government comes in and says, here's the particular problem we're trying to solve, here's some of our initial thinking, what do you think?

   In this case we were like, no, no. My boss at the time was like, we're going to let the stakeholders decide what they want to do and what rules they want to set. Some of those processes yielded voluntary codes, for example on mobile app transparency. But a couple of those stakeholder processes fell apart.

   It was a learning experience for the team but didn't yield policy outcomes, because the stakeholders themselves didn't have a particular focus they wanted to talk about. I think when we're talking about better policies through stakeholder engagement, one of the lessons learned, at least from my experience, and setting aside our regulatory approach, is that if you're going to deploy a multi stakeholder process or stakeholder engagement, it's better when you have a targeted question you're asking people.

   Just as with developing a particular product, if you're developing a particular policy, a focused question tends to be more useful. There also has to be government will to take this approach, because there are people who will challenge the approach, and people who, when they don't get what they want from it, will try to go around you to get it. So a strong commitment and political will are important.

   Someone else mentioned this as well: it can't be a check the box exercise. You have to always be talking to people. It can't be, I have this particular problem, so I'm going to talk to you now. You've got to build relationships, sustain them, and keep working with people so that you understand each other and can talk.

   There have also got to be enough resources, and not just in the sense of stakeholders being able to participate, which can be a challenge if you want a broader range of stakeholders, since not everybody's resources are the same. The same is true of governments: you need enough staff and resources to run these processes.

   And then I go back to my experience that better policy through stakeholder engagement has occurred when the questions and the problem set have been more focused. Otherwise it lends itself to inertia sometimes. The other thing that helps is having a deadline. A clear deadline drives people to particular outcomes.

   That was kind of my takeaway from my experiences. Maybe I'll end there and keep the conversation going.

   >> JIM PRENDERGAST: Thanks, Fiona. As you were speaking, I could see Charles reacting on screen, to deadlines and political will. Let's flip it back to product development. Charles, I saw you reacting to a lot of what Fiona said. Why don't you respond to that and tell us about this process that you've undertaken at Google.

   >> CHARLES BRADLEY: Absolutely, yeah. Hi, everyone, I'm Charles. I'm the manager for trust strategy for knowledge and information products here at Google. Just a bit of context on what that means: knowledge and information products are Search, Maps, News, anything that connects people with information, rather than hardware or our cloud work.

   And as for manager of trust strategy, our role on the team is to shape our products and our product strategy so that we continue to build trust with users. It's a function that was built about three or four years ago, and a fundamental part of it is stakeholder engagement.

   We built a program called our external research program which is all about ensuring that we get meaningful expertise into the product development lifecycle in a company that's moving at a million miles an hour at all times.

   Having been on the other side of this conversation for many years, I totally understand some of the challenges that have been raised by my fellow panelists. I was one of the stakeholders who was fatigued by being asked the same questions by different companies over and over. I was also one of the stakeholders who would come to consultations and say, I have no idea what you're talking about; you've been talking about this for three years and you're asking me to split a hair on something in 15 minutes. Maybe you could have helped me understand this a bit more.

   I think that may be why I was hired in the first place: to bring some of that stakeholder perspective into the product development lifecycle, which at Google is run by product managers and engineers who are trying to build and ship products to millions and billions of users.

   When we come along and say, hey, we need to be speaking to a wider range of expertise, often flags get thrown: that's going to slow us down, how do we get to product-market fit faster, et cetera, et cetera.

   So the program was built as a way of showing that if we do this right at the beginning, our products will be more successful and we'll build greater trust when they launch, rather than having to build that over time.

   And I want to talk about two sort of examples that we've done in 2024, which has been quite an exciting sort of year for us in this space.

   The first was Circle to Search. Circle to Search is a new feature on Android where, in any app on an Android device, you can long press the button at the bottom and circle a bit of your screen, and that will send a search up to Google Search.

   Why is this useful? Well, people are finding information in many different ways, and they're looking for access to information not just by coming to Search directly anymore, but from different platforms. And we thought it was a great way of meeting users where they're at.

   So if you're on a video somewhere or you're on some other piece of content and you want to know a bit more about that, why don't you just circle it and off you go.

   Well, there were a number of key risks to launching this product, including some of the privacy risks, as you can imagine, associated with it.

   So our product manager who is leading this is very familiar with some of these risks, and forced an opening in the product development lifecycle to ensure that we went out and got expert feedback. And we got expert feedback through a number of one on one consultations to start with. Thinking through what Richard was talking about in terms of formats, the format of engagement has been very important to ensure that we can get direct and specific feedback from individuals, as well as group feedback.

   So we went out and spoke to dozens of experts in human-computer interaction, as well as privacy and human rights experts. We had a few one-on-one engagements with these experts, and we brought them together in a group setting also.

   And we came back with five key things, which actually led to amendments to the product. So the first issue that we heard was how do you prevent unintentional, like, feature activation or sharing of data?

   So if you don't want to have this product on your phone, how can we stop it from unintentionally opening and sharing your data?

   We ensured that there was an explicit user action to launch and activate it, rather than it being on automatically. And we provided access in the search itself to delete that search, because we wanted to make sure that people had the closest possible control over deletion.

   We also heard: how do we ensure that users can access the controls over this information? So we integrated a delete-the-last-15-minutes control, which is something we're trying to do more broadly across a number of our products. We understand that deleting your whole search history may not be what you want to do, but you might have searched for a present for someone and want to quickly delete your last 15 minutes of searches. So we integrated that as a feature.

   Then meaningful disclosure: what on earth is going on? How do we ensure there's meaningful consent, and how do we educate users?

   So in the first launch of the product, we provide a clearer plain-language explanation of what this product is and how it works, and we provide much clearer control and consent over how we're using your data.

   And one risk that also came up, the fourth of the five points, was around facial recognition technology. We use visual search a lot in this; it's like our Lens product, which you may have come across before. And people were very worried that we were going to be using biometric technologies for this.

   We don't use biometric technologies; we're using a similar-image matching service. So if a picture is available on the open web and it's indexed, it will be able to find you a similar copy of it. But we don't know who that person is. We're not taking a photo of Charles Bradley, saying, I know that's Charles Bradley, and returning other photos of him. It's purely a visual match, and we explain that to users.

   And then what information are we using and what data are we sort of storing when we    when this product is being invoked was one of the key points as well. So the whole point of the circling part of it is that users can precisely select a part of their phone that they want to search for, nothing outside of that search is used or collected in the process.

   And what we're doing here is, if it's an image, we're turning it into text, and that text is stored as part of your search history, but no other information is stored. So we're not taking the photo and storing it against your account or anything else. We're just using the text that we've generated from it.

   So these were five really critical things that came up from those engagements. I think the team had a good sense of some of these issues, but not the level of priority of some of them. And through the stakeholder engagement we were able to more clearly develop escape hatches, or solutions, which met users' needs: providing control front and center in the product as you're using it, rather than back in a setting or some account profile, which is often how products provide you with control over search history and everything else.

   So it really fundamentally changed the way in which we launched this product, has resulted in a really good launch for us, and the product's been used quite a lot since.

   So that's one example where we've done a very specific product development engagement. To some of Fiona's points, we had a very clear deadline, a very clear problem statement, and a clear scope: we were looking to launch this product, and for ways in which we could build it more sustainably and more suitably for users.

   Another example, which I'll use just to show you some of the other strategies that we have, is our work on AI overviews. So now if you go to Search, you may see an AI overview, where we generate a response to your query using generative AI and then below that provide you with ten blue links. This is something that we launched about 18 months ago in Labs, which is our experimental service on Search, and recently rolled out to over a hundred markets.

   But when doing so, we knew that there were going to be a number of broader challenges around sensitive queries: things that may not have a very straight-line, factual answer. Obviously we apply our policies to ensure that we don't trigger an AI overview on anything that's policy-prohibited, if you think about illegal activity or hate speech, et cetera.

   But there are obviously a number of gray-area queries where we could, in Google's voice, from our point of view, provide a less than suitable answer. And to address this, we didn't really have a clear sense of what the product strategy should be or how we should do it, because it's such a new and evolving space.

   So we built a panel of experts that we are now in the second year of engaging. We work with them on a monthly basis, either through one-on-ones, online virtual calls, or in-person meetings. They give us higher-level advice on product strategy and direction, as well as clear guidance when we have quite specific questions to ask: whether we should respond in a certain way, or what frameworks we should be using to train our models to respond.

   I think the benefit of this has been that it's such a complicated space that by asking an expert for just one or two one-hour calls, we would be really underutilizing their expertise. There was quite a lot of ramp-up to build a clear and consistent understanding amongst our experts of what our ultimate challenges were.

   How is the model actually being trained, and what different strategies do we have at our disposal within our model and product launch strategy? Then going iteratively across a number of months, looking at different verticals of sensitive queries, stack-ranking them, and working through some of those strategies has been very, very fruitful.

   And I know that the experts we've worked with in this program have found it very rewarding, because not only can they see some of their work directly being integrated into the product and launched, and we've now had many billions of queries trigger overviews, but they also get to learn about the different strategies that we're focusing on. Some of this is actually based on their expert work, but they never before had the opportunity to integrate it within a business context.

   So those are two examples of things within the program. We've done about 30 studies this year, and we're focused on a number of areas for next year. We can always do a better job, but we think we're moving in the right direction to provide clarity on how we integrate experts into product development.

   Pass it back to you, Jim.

   >> JIM PRENDERGAST: Great, Charles. Thanks a lot. You can really see how the product development lifecycle took into account the outside expertise and feedback. I can only imagine your engineers looking at you saying, are you kidding me? You want to do this on the front end?

   But as you said, it probably saved time and a lot of aggravation in the long run.

   For those in the room, we're moving to discussion and question and answer. We do have a couple of microphones up here. I can play Phil Donahue, for the Americans who understand that reference, and move the microphone around. There's one on the table. I'll sort of get the conversation going.

   I know, Richard, you engaged in the chat about a five-step approach toolkit that you've developed. For those who aren't in the chat, do you want to give an overview of what that is and how folks might be able to access it?

   >> RICHARD WINGFIELD: Yes, absolutely. So the approach is linked in the chat, but you can also find it by using the search engine of your choice and looking up BSR stakeholder five step approach.

   It's a toolkit that we've developed which helps companies think about how to approach stakeholder engagement. The steps are, first of all, developing a strategy. Basically setting out what you want to do as a company in terms of your vision for stakeholder engagement, your level of ambition, maybe reflecting on existing stakeholder engagement. This is obviously something that will vary depending on the resources of the company and what it wants to achieve through stakeholder engagement.

   Secondly, stakeholder mapping. I was talking about the breadth and diversity of stakeholders and undertaking a mapping of which groups or organizations or individuals you need to speak to and maybe where those relationships already exist.

   Third, preparation. This is coming back to some of the points that Charles and others have made around making sure that stakeholders are able to engage in that process with confidence and with an understanding of what's happening. So that's everything from building those relationships, to thinking about the logistics for those meetings, to preparing beforehand so that the people who come to them can genuinely participate in a helpful way.

   The fourth stage is the engagement itself, and we've provided guidance on how to manage difficult situations: for example, making sure all voices are heard, and dealing with barriers to stakeholder engagement related to language or accessibility.

   And fifth, setting out an action plan as to how you're going to use the inputs for that engagement, either to make changes or just to make sure that the people are kept in the loop about what's happening.

   Those are the five steps of the approach, and the toolkit is available via the link in the chat or just by searching BSR stakeholder engagement five step approach.

   >> JIM PRENDERGAST: Thank you, Richard.

   For many companies, Africa is a market of opportunity, a growing market where they want to do business, but there are unique challenges to it. What would you say are the challenges for stakeholders across the continent, and what are your recommendations for how they might overcome them?

   >> THOBEKILE MATIMBE: Thanks. Africa is a great place where there's room to engage with Civil Society on what they're facing and what the challenges are. But I think the challenge has really been getting more willpower from private sector actors to actually want to meet the community on the ground and engage on key challenges.

   As mentioned, we host the Digital Rights and Inclusion Forum, and there are definitely companies who will be in the room; we have Google in the room, Meta there to engage. But we feel there's a whole lot of other private sector actors we would want to be in the room.

   We really have had several actors come through and engage. But I think what is important to highlight is that the environment we operate in on the African continent is marked by repressive governments that obviously make their own demands on companies, and they might want to impose certain orders on companies as well. That's the kind of challenging environment that companies face when they come to the African continent willing to engage.

   But I think there's a way around it. Proactiveness in stakeholder engagement will ensure that even when there's been a challenge, and companies have been forced to do certain things, or have not been able to respond effectively to certain situations according to their policies, they can still have a space to engage with actors on the African continent, to ask what else can we do, and to support other strategies that Civil Society actors might be using to address some of the challenges that we face.

   So with regards to stakeholder engagement, there's a willing Civil Society space, and it's open, because the ways we've been engaging and the formats of engagement can always be adapted to context. There's room to engage.

   I think what we need to see is more willpower from private sector actors to meet the community where the community is.

   >> JIM PRENDERGAST: Great. Thank you very much. That's good insight and good advice. I'm going to look at Fiona here and at Charles virtually, and ask you each: Fiona, from your perspective as a former government official speaking to other governments, what one key piece of advice would you give to those looking to engage? And Charles, what advice would you give to private sector entities going down this path?

   >> FIONA ALEXANDER: I might say, as I listen to others speak: if you're a government making a policy, it's natural to think, I know best. I'm going to sit in my office, talk to my team, and decide. That's just a natural human way of thinking. It's really important, though, to take a step back and realize that even though talking to people might take more time, and a multi-stakeholder process is probably a little more messy, at the end of the day you're going to get a better product and policy, and you'll have buy-in, if you take the time to talk to people in a meaningful way. That's my advice: take that step back. I don't have my computer in front of me, but you mentioned that Aubrey was on, and it makes me think that stakeholder engagement almost needs ambassadors to make the case for why this is actually the better way to do policy and make products, to convince people it's the best way to do it.

   I think it's natural to think, I know best, I'm going to make my own choice. But we realize the outcome of that isn't always the best.

   >> JIM PRENDERGAST: Great.

   Charles.

   >> CHARLES BRADLEY: Yeah, I sort of agree with all of that on the knowing-best point. People do know they need to do it, but there are so many other pressures on time, and some skills are needed to be able to do this. I think there's a confidence issue as well with some people who are very familiar with engaging internal stakeholders but not external stakeholders, and a concern about what they might hear or how they might get that feedback.

   I totally agree with Fiona's point around champions. I think the smartest thing that my boss did was turn this into a formal program at Google, with high visibility and structure to it, so that we could build champions underneath that program. And champions not just among the staff on this program, but also in different parts of the business who have utilized the program and delivered greater products.

   We get all sorts of challenges from other product areas, as Jim alluded to: you're going to make us do this beforehand, rather than down the line when we know what risks or harms there are.

   And I think we've got a bunch of case studies with ambassadors, and the inbound requests for this have started to appear. We have a number of expert engagements underway at the moment which came to us saying, I really want to make sure that my product lands in the right way, and I know that you're a team that can do that, and can do it at pace, inside the infrastructure of the business. So internalizing it, creating a formal program, and building champions so that you can drive up demand: that would be my advice.

   >> JIM PRENDERGAST: Great. So you've created almost a little cottage industry within Google on how to engage on this. Maybe a profit center someday.

   Turning to the audience, I know we have a question here. If anybody else has a question, let me know. I don't have eyes in the back of my head, but I don't think we have any back there.

   Just to be fair, please identify who you are and if you have a question directed to one of our panelists, just let them know. Thanks.

   >> LINA: Thank you, everybody. I work for Search for Common Ground, which is an international peace organization, but I co-chair the Council on Tech and Social Cohesion, which brings together peacebuilders, academics, and policy influencers to influence tech design for social cohesion.

   And listening to this panel, each of you is saying things that are true. But I feel like there are some other truths that also need to be put on the table, and I'm curious to hear some of your thoughts about those, right? And I want to say, Charles, just dovetailing from where you left it: the rest of the industry has completely depleted its trust teams. It's extraordinary that you built yours up, and you said that your senior leadership is actually trying to continue to incentivize this.

   This is the first thing I want to say: there is a disincentive for this kind of engagement, even when organizations are bringing forth the harms to these companies, right?

   They're basically saying it's not going to be prioritized over profit. We're looking for growth, we're looking for engagements. And to you, Richard, I wonder if you also feel like there is a real changing narrative in the sort of business and human rights space when it comes to big tech today.

   And the things that are really leading to most of the changes, again not in any way discounting Charles' excellent examples of how you've made change, but in most cases the changes that tech companies are making in their products are due to litigation, fear of fines, reputational damage, and things like that. And even with really good multi-stakeholderness, the companies are not interested in making these changes.

   I'll go one step further: in Africa there are places where these companies are even trying to damage the reputations of organizations that are pointing out the harms of these products.

   They're using money to fund other groups that may be saying what they want to hear, and they're actually, you know, damaging the other organizations that are being more critical. So even with multi-stakeholder engagement, there's something going really wrong when we look at big tech. And it's why, and I'll end with this, we still see a number of products, whether it's the chatbots or other newly launched things, a whole range of products coming out on the market each week that are not doing an upstream test on safety and that are not being transparent.

   And without the transparency, again, what kind of stakeholder engagement are you really looking at, right? When you ask people for the consultations, you're not subsidizing them to give you all those consultations, right?

   I'd like to hear from the panelists: are we recognising that we're in a different time here, and that even with the skills and the consultations, there's still a real issue on the table?

   >> JIM PRENDERGAST: All right. Who wants to go first on that one? Maybe Richard, you want to take it from the high level?

   >> RICHARD WINGFIELD: Yeah, I'm happy to.

   I'm hesitant to generalize too much by saying that all technology companies do or don't do something. I think there's huge variation in terms of maturity and attitude towards the importance of being a responsible business, with some taking that responsibility a lot more seriously, for sure. But I can understand why there's a feeling that overall the sector still hasn't done enough on this, and still isn't doing enough.

   I think one of the real challenging things, I don't have a solution to this, is that meaningful stakeholder engagement takes a long time. And it requires organizations to be brought in at a very early stage. If you're a company with potentially thousands of different products that might be developed, some of which would never make it to market, you often don't know until the late stage which ones are ultimately likely to launch or not and by that point it's difficult to bring stakeholders in unless you want to exhaust them by constantly asking them about all of the different options that there might be at every single stage.

   And of course technology moves so fast. When you've got companies, and we think about generative AI and the rush for companies to make sure that they are leading on this new technology, bringing in stakeholder engagement slows the process down. That's not to say we shouldn't do it, but there are ways in which the technology sector faces unique challenges when it comes to meaningful stakeholder engagement, because it does run contrary to a number of other business interests.

   I think there are solutions to that, though none of these are silver bullets. One is regulation. We're seeing more regulation, particularly in the EU, such as the Digital Services Act, the AI Act, and the corporate sustainability directives.

   Second is to make it easier for stakeholders to become engaged. And that might mean more sector-focused engagement. For example, at BSR we're doing a human rights impact assessment of generative AI which covers the sector as a whole, not just individual companies, working collectively to try to reduce the demands on stakeholders.

   But there are some of those, you know, what you might call disincentives that are hard to work around, for sure. I'm not going to pretend there isn't a problem there. But I do think there are still huge variations in the approaches that different companies take, with some doing it better than others, for sure.

   >> JIM PRENDERGAST: Anything to add?

   >> THOBEKILE MATIMBE: I would just say, from my earlier reflections, that what I mentioned about willpower on the part of companies to meaningfully engage is something that we as an organization are still looking forward to experiencing more of.

   And more specifically, engaging where the community is, especially at community convenings, is something that we'd love to see gain more traction as well.

   I think one thing I'll say is that what we've experienced, especially in engagements with private sector companies, is that with the few that are willing to engage with us, it's usually side meetings, closed meetings. It's not out there where we are engaging with the broader communities that we represent, support, and stand for. It's more of: okay, who are we going to engage with on the continent? There's this organization and that organization.

   But proactive stakeholder engagement is looking further than that and saying: hey, Paradigm, you're working on the African continent, we'd like to meet the community, where can we meet the community? And if we open it up to broader actors on the continent, I think those would be much more enriching conversations around the challenges that communities are facing with regards to products, as well as better inputs into how to shape policies, even for big tech companies as well.

   I think what we'd definitely love to see is more interest in engaging where the community is, meeting the broader community where it is, and not cherry-picking organizations so that a company can say: we were in the room, we engaged with Paradigm, we've done our part.

   But we need it to be more meaningful and address the concerns of the broader community on the African continent.

   >> JIM PRENDERGAST: Thank you. Charles, obviously you can't speak for the industry, but what's your take from the Google standpoint?

   >> CHARLES BRADLEY: Yeah, I mean, I'm glad that it's been raised.

If it was very easy and if it was all going swimmingly, we probably wouldn't have our jobs trying to do this.

   I think there are two ways of seeing some of the harsher government actions over the last few years. The increasing regulation has been welcomed, and it's important to ensure that decision making about the way in which products are developed and deployed to users is much more democratic and organized by nation states and regional bodies. We've seen that reinforced by the stakeholder engagement that different national governments and regional bodies have done.

   As well as, as you mentioned, part of that has been that we can either continue to wait for these fines and more regulation that may or may not be fit for purpose, or we can engage more with stakeholders to ensure that our products are more aligned with the expectations and the values that we're trying to embody.

   And that's got a lot of traction at the leadership level, and that's not true for every company. We're fortunate to take a long-term view on this. But it's been a part of the business's DNA for a long time to engage with stakeholders and bring that expertise into product development. I think we're just getting much sharper at doing so in a more meaningful way.

   By actually showing impact at the product level, there are hundreds of success stories where products have never ever been launched because we have spoken to external stakeholders and experts who have given us very clear guidance on what the risks would be, which are way beyond the threshold that we were able to accept.

   But internally we didn't see those issues; we didn't understand the trends there.

   So there are different viewpoints from different businesses. Obviously our viewpoint is that with the increase in harder government action in this space, we're going to end up with a greater need for stakeholder engagement and for building trust and safety into our products, rather than the opposite, racing to get things out the door and products to market.

   >> JIM PRENDERGAST: Thanks to all three of you. I'm looking around the room to see if there are any other questions. I'm not seeing any hands. Fiona.

   >> FIONA ALEXANDER: I might just respond a little bit to this one as well, because I think it's important and I get the perspective that you're bringing, but there's no universal solution and there's not a single path that will fix all of these things.

   All companies are slightly different, all products are very different. And not all products or policies are equal in terms of their purpose and their impact. This is why frameworks, like the one I think Richard mentioned, can be useful: you can talk about how to implement those frameworks, how to incentivize action, and how to get to that outcome.

   I will say that the idea that regulation is going to solve all these problems is, I think, slightly misguided. We've seen a lot of regulation emanating from Brussels in the last five years, and it's unclear what its implications will be: how damaging it's going to be, how effective it will be, or if it's going to be good. I think the jury's out on all of that.

   I think if GDPR is an example, it's not going to help, at least coming from Europe. But again, we'll wait and see.

   And I think a lot of this culturally depends on where you come from: from my perspective, a policy and regulatory perspective, but I guess also from a company engagement perspective. And this gets back to the point: do you deal with something once it's out and there's a proven problem, or do you map out all the potential issues and then decide whether to do it or not? I think that goes for products as well. You have to decide your risk factor and what you're willing to do and not do. A lot of that comes from culturally where you are.

   The American philosophy and the European approach are very different. And I think a lot comes out of that.

   >> JIM PRENDERGAST: Thanks. Frameworks plus impact assessment, Aubrey put in the chat. The combination of the two will yield some effective outcomes for sure.

   I don't see any questions online or in the room, so maybe just a quick sentence or two as a wrap up to sort of bring us to a close. I know everybody's got busy schedules and if I could give ten minutes of your time back, you could probably use it to get in line for the restrooms before the rest of the sessions wrap up.

   Let's go to you, Charles. Why don't you kick us off.

   >> CHARLES BRADLEY: Yeah, I think this is such an important topic, and one where we need to move to very specific good practices and frameworks that can be used across industry. And I'm really glad that Richard and the BSR team are doing that for the industry.

   And we've sort of tried to bring the whole industry along on this journey. It's not going to go away. It's going to get more and more complicated.

There are going to be more unintended consequences or unforeseen utilizations of new technologies as the pace increases. And you know, I'm excited that this is a space that continues to be a space where we can learn from each other and build some sort of, like, common understanding of how this is done well.

   So that our colleagues from Civil Society and academia are not being asked to provide input into things that don't go anywhere.

   >> JIM PRENDERGAST: Thobekile Matimbe, please.

   >> THOBEKILE MATIMBE: Thanks. Thanks a lot. I think my last reflection is really that, going forward, we are open as part of the initiative to engage and, you know, connect any product designers to the broader community on the African continent, within our networks. And of course the Digital Rights and Inclusion Forum is a multistakeholder platform, looking to have at least 800 stakeholders from the Global South in attendance. It's a great place to continue these conversations and a perfect platform for any policy consultations or other product launches.

   Yeah, I think it's something that we look forward to and look forward to building lasting relationships as well with the private sector around human rights, so to speak.

   Thank you so much for the opportunity.

   >> JIM PRENDERGAST: Great, thank you. Fiona.

   >> FIONA ALEXANDER: I think my takeaway from all of this is the importance of talking, with anyone and everyone, about getting stakeholder feedback. I think we talk a lot about the successes of the process, but the fact that sometimes it doesn't work, or that it does work and you don't release a product, we don't talk about that, right? I think it's equally important to talk about why things don't work, or, if the outcome of the stakeholder feedback is not to release the product, making that known.

   Because the more transparent we can all be in all of this, I think the better it will be for everyone.

   >> JIM PRENDERGAST: Thank you. And Richard, you want to finish it off for us?

   >> RICHARD WINGFIELD: Yeah. I just want to say as well that, you know, although there is still so much more to do, and it's right that expectations increase and that demands on companies continue to call on them to do better, we are a lot further advanced than we were ten or 20 years ago, both in terms of this issue being on the radar of companies and in the sophistication of existing efforts.

   There's huge variation, there's a lot more to be done, and I think some of the criticisms have been rightly called out today. But I do think it's something that companies are aware of and thinking about in a way that they weren't ten plus years ago.

   And there are opportunities there to kind of use that and to use other tools to increase what we do. So I hope that things will continue to improve, but there is, as you say, still a lot more to be done.

   >> JIM PRENDERGAST: Great. Thank you very much. And I'd like to thank everybody who found our workshop room tucked over here in the corner, everyone who joined online, and our speakers. It's unfortunate you couldn't all be here in person, but you were here in spirit and in sight. Once again, thanks, everybody, for joining us, and enjoy the rest of your week.