
IGF 2023 WS #64 Decolonise Digital Rights: For a Globally Inclusive Future

    Time
    Wednesday, 11th October, 2023 (06:45 UTC) - Wednesday, 11th October, 2023 (08:15 UTC)
    Room
    WS 3 – Annex Hall 2
    Subtheme

    Human Rights & Freedoms
    Non-discrimination in the Digital Space
    Rights to Access and Information
    Technology in International Human Rights Law

    Organizer 1: Ananya Singh, USAID Digital Youth Council
    Organizer 2: Vallarie Wendy Yiega
    Organizer 3: Man Hei Connie Siu, International Telecommunication Union
    Organizer 4: Keolebogile Rantsetse
    Organizer 5: Neli Odishvili, CEO of Internet Development Initiative

    Speaker 1: Ananya Singh, Government, Asia-Pacific Group
    Speaker 2: Shalini Joshi, Civil Society, Asia-Pacific Group
    Speaker 3: Pedro de Perdigão Lana, Technical Community, Latin American and Caribbean Group (GRULAC)
    Speaker 4: Tevin Gitongo, Civil Society, African Group
    Speaker 5: Mark Graham, Private Sector, Western European and Others Group (WEOG)

    Additional Speakers

    Dr. Jonas Valente*, University of Oxford

    (*Dr. Jonas Valente stood in for his colleague Mark Graham of the University of Oxford, who was unable to attend owing to an urgent professional commitment.)

    Moderator

    Man Hei Connie Siu, Civil Society, Asia-Pacific Group

    Online Moderator

    Neli Odishvili, Civil Society, Eastern European Group

    Rapporteur

    Keolebogile Rantsetse, Technical Community, African Group

    Format

    Round Table - 90 Min

    Policy Question(s)

    1. What are some colonial manifestations of technology (for example, in terms of language, gender, media, AI, etc.) emerging on the internet?

    2. How do we address the colonial legacies that shape what the internet has become and the ongoing colonialism that determines what it will become, and what should decolonising digital rights look like?

    3. What role should different stakeholders play in the process of decolonising the internet, technology, and digital arena, and how can we better include marginalised communities in such discussions?

    What will participants gain from attending this session? Attendees will be introduced to the context of decolonisation with reference to the internet, that is, how the internet is being built on the expropriation of labour, resources, and culture from colonised spaces and subjects. Participants will learn to spot the ways that bias is built into our technology and to understand that it is not truly neutral. Attendees will discover how algorithms are merely opinions written in code, and how contemporary technological designs draw data from actors, beliefs, and systems that are ableist, sexist, racist, and classist, and hence perpetuate stereotypes and historical prejudices. They will discover how the internet, as a medium of communication and social interconnection, can be improved to prevent the reproduction of historical biases. They will come to understand how technology could be made ‘truly’ neutral and an equalising, enabling force.

    Description:

    The internet has long been seen as a forum where everyone is equal and traditional hierarchies are non-existent. But as privileged groups continue to dominate the creation of technology, technology perpetuates historical bias. Tech design reflects unbalanced power dynamics, and the internet promotes digital colonialism. The Global South continues to consume digital content and platforms produced in the Global North. As a result, non-English content removal is slower, regardless of the magnitude of hate or harm. Facebook’s ‘Safety Check’ feature was swiftly activated after the attacks in France, yet bombings in Lebanon did not even trigger a response from Facebook. Twitter and Facebook introduce fact-checking for US elections while disinformation remains a problem in Myanmar. Fewer than 1,000 of 70,000 Wikipedia authors are from Africa. UNESCO has highlighted how voice assistants such as Alexa and Siri reinforce gender biases, normalise sexual harassment and conventional behaviour patterns imposed on women and girls, and place women at the forefront of errors. Hate speech targeting marginalised communities continues to rage online. People from the Global South and/or marginalised communities have the right to feel and be safe online, and to have the same autonomy that users in the Global North do. This workshop will contextualise decolonisation in reference to the internet, technology, and human rights and freedoms online. The panelists will unpack evidence of gender stereotypes, linguistic bias, and racial injustice coded into technology and will explain how apps reflect their creators’ opinions of ‘what the average user should or should not prefer’. The panelists will recommend how online knowledge can be decentralised and what ideological influences need to be delinked from the digital arena. They will suggest practices and processes that can help to decolonise the internet and make it truly one global, interoperable space.

    Expected Outcomes

    Outcomes:

    1. As the internet is the default distributor of ideas and opinions, and the default gatekeeper of knowledge in modern society, it is important that it does not replicate historical patterns of oppression or retraumatise marginalised communities.

    2. This workshop will debunk the idea that technology is neutral and impartial, and acknowledge that colonial ideologies, ideas, and impressions continue to influence technology and reinforce stereotypes.

    3. Enable people to recognise patterns of coloniality on the internet and in technology products.

    4. Start a conversation on how technology and digital space can be decolonised.

    Outputs:

    1. The session’s highlights and key takeaways will be published in a blog series on the NetMission.Asia website.

    2. The session discussions will inspire the creation of a learning document that may be shared with wider audiences including but not limited to international donors, media, civil society, private sector, government, and end users.

    Hybrid Format: As the session begins, both onsite and remote participants will be encouraged to scan the Mentimeter QR code to share their expectations for the session. Equal time will be given to online and onsite speakers. In case of a slow internet connection, the online moderator will first turn off the speaker's video; if the issue persists, the speaker may join via Zoom's dial-in option. The rapporteur will track the flow of the online chat and include relevant points in the session report. Once the audience Q&A begins, online participants will be encouraged to use Zoom's Q&A feature, and onsite attendees will be given microphones to ask questions. The onsite and online moderators will coordinate to ensure an alternating pattern of Q&A between onsite and remote participants/speakers. At the session’s end, the audience may provide feedback on the session by scanning the Mentimeter QR code.

    Key Takeaways

    Colonial legacies continue to shape the internet and technology, perpetuating biases and reinforcing historical prejudices. Technology is not neutral, and the internet can replicate patterns of oppression, creating a greater need for decolonization.

    To truly decolonize the internet and digital space, it is essential to empower marginalized voices and communities. All stakeholders should actively seek out and amplify the voices of marginalized communities in discussions about technology and digital rights.

    Call to Action

    Tech companies must actively recruit from marginalized communities. Governments should establish policies and regulations that require tech companies to prioritize diversity and inclusion in their technology development teams. Civil society organizations must raise awareness about the benefits of diversity. Marginalized communities should actively engage in training and education programs that provide the necessary skills for technology development.

    Tech companies, governments, and civil society organizations should collaborate to create platforms and spaces that value underrepresented voices. Governments should allocate funding for training and education programs that empower marginalized communities to navigate the digital space safely and effectively. Marginalized communities should actively share their experiences and collaborate with stakeholders to promote their digital rights.

    Session Report

    Digital/Platform-based Labor: Precarious Conditions and Unequal Opportunities

    The session brought to the forefront a critical issue: the pivotal role of human labor in the development of AI and digital technologies. It revealed that digital development relies heavily on human labor located predominantly in countries of the Global South, where thousands of workers are engaged in activities such as data collection, curation, annotation, and validation. Companies benefit from these platforms while often bypassing labor rights and protections such as minimum wage and freedom of association. Further, the precarious labor conditions that plague these workers include low pay, excessive overwork, short-term contracts, unfair management practices, and a lack of collective bargaining power, leading to social, economic, and physical and mental health issues among these workers in the long run. One notable initiative in this regard is the Fairwork project, which seeks to address these labor conditions in nearly 40 countries. The project assesses digital labor platforms against a set of fair work principles, covering pay, conditions, contracts, management, and representation. The Fairwork project aims to drive positive change within the digital labor market and create a fairer and more equitable working environment. Speakers asserted that technology production should not merely be a process undertaken by technologists and coders but should include local and marginalized communities who possess a deep understanding of cultural nuances and the specific issues they face.

    Data Colonialism: Protecting ‘Self’ Online 

    The speaker discussed the exploitation of personal data without consent, arguing that personal data is often used for profit without individuals' knowledge or permission and highlighting the need for more transparency and accountability in handling personal data. The speakers emphasized how the terms of service on online platforms are often unclear and full of jargon, leading to misunderstandings and uninformed consent. One of the main concerns raised was the concept of data colonialism, which, the speaker explained, aims to capture and control human life and behavior for profit. She urged individuals to question data-intensive corporate ideologies that incentivise the collection of personal data, which perpetuate existing inequalities, lead to biases in algorithms, and result in unfair targeting, exclusion, and discrimination. The speaker then suggested that individuals take steps to minimise the amount of personal data they share online or with technology platforms. They emphasised the importance of thinking twice before agreeing to terms and conditions that may require sharing personal data. They also proposed the idea of digital minimalism, which involves limiting one's social media presence as a way to minimise data. She highlighted a silver lining: confronting data colonialism provides an opportunity to create systems rooted in ethics. The speaker went on to advocate for the concept of ownership by design, which includes the minimisation and anonymisation of personal data. However, she cautioned against an entitled attitude towards data use, arguing that data use and reuse should be based on permissions rather than entitlements or rights. She called for more transparency, accountability, and individual action in minimising data sharing, and also emphasised the need for critical digital literacy programmes.

    Digital Sovereignty and Intellectual Property

    The speakers explored the intricate web of internet regulation and its impact on digital sovereignty and decolonisation. Multinational companies were found to subtly impose their home country's laws on a global scale, disregarding national legal systems. Intellectual property legislation, such as the Digital Millennium Copyright Act (DMCA), was cited as an example of this behavior.

    Gender-Based Disinformation

    Addressing the challenge of gender-based disinformation in South Asia was a central concern highlighted in the session. Women, trans individuals, and non-binary people are often targeted by disinformation campaigns that aim to silence and marginalize them. The session emphasized the importance of documenting and combating gender disinformation and advocated for collaborative approaches that involve diverse stakeholders.

    Digital Literacy and Bridging the Digital Divide

    One key aspect illuminated by the session is the need for digital literacy programs and skills training to empower marginalized communities. The speakers advocated for democratizing access to digital education and ensuring that training is contextualized and relevant. This inclusive approach recognizes the diverse needs and cultural specificities of different communities, enabling them to harness the power of digital tools effectively. 

    Decolonizing the Internet and Digital Technology

    The concept of decolonizing the internet and digital technology production was framed as a process that involves not only the use of digital technologies but also the transformation of the production process itself. By incorporating diverse perspectives and local context into technology creation, the aim is to avoid biases and discrimination. The speakers advocated for adapting platform policies to respect cultural differences and acknowledge human rights, rather than solely adhering to external legislation.


    Conclusion:

The journey towards a decolonized internet and technology landscape is ongoing. It requires continuous reflection, dialogue, and action. We can all strive for a digital space that respects and empowers all individuals, regardless of their background or geographic location. By working together, we can create a future where the internet truly becomes a force for equality, justice, and liberation.