The annual Mozilla Festival #MozFest has been defined as “part art, tech and society convening, part maker festival, and the premiere gathering for activists in diverse global movements fighting for a more humane digital world.”

Mozilla Foundation hosted its first ever regional MozFest House #MozFest2023 in Kenya from 21st to 22nd September this year under the global theme “The Collective Power of the People”, bringing together artists, activists, technologists, designers, students, researchers, policy makers and journalists from diverse communities and movements across Eastern and Southern Africa. Through a range of panel discussions, the hybrid event interrogated emerging issues and local approaches related to Artificial Intelligence (AI) and other technological advancements, data governance, how movements are core to catalyzing change online, and existing people-driven technologies at the intersection of tech and society in Africa.

Our Executive Director at HER Internet, Sandra Kwikiriza, joined a panel conversation with Harvey Binamo (Tech Officer at Magambo Network, Zimbabwe), Lawrence Mute (Human Rights Lawyer and Practitioner, Kenya) and Weam Shawgi Hassan (Feminist and Gender Defender, Sudan), moderated by Roselyn Odoyo (Mozilla Foundation). In this discourse, dubbed “Confronting the Margins”, the speakers explored the role of Artificial Intelligence (AI) and technology in deepening and shaping the lived realities of diverse structurally silenced communities, existing opportunities for addressing tech marginalization, and solutions for creating a better digital experience within the African context.

As the African continent advances and adapts to new technological innovations at breakneck speed, it is evident that structurally silenced communities are being left further behind, intensifying technological marginalization. Recurrent state-sponsored internet shutdowns during political tensions, digital illiteracy, limited or no access to digital infrastructure and technologies, unwarranted censorship and surveillance, disproportionate access to social services such as education and fair employment, and invalidated lived experiences of online harm, issues that long predate the rise of AI, remain onerous barriers to accessing information and interacting meaningfully with technology today.

“Marginalization is pervasive. It exists in a continuum of things such as freedom of expression, association, mobilization and building networks. Tech has made it easy for us to organize and mobilize in closed groups as we build networks to strategize on how to make our lives better. But if there is a divide between access to tech and access to information, these things are difficult to do. And if we are unable to organize offline, it’s also getting increasingly hard to organize online because of infiltration from people into these spaces…” Sandra explained, speaking on how censorship, surveillance and other existing forms of online violence contribute to the technological marginalization of already deeply marginalized communities such as the queer community, resulting in the deprivation of their rights to privacy, expression and association both online and offline.

Weam highlighted the uncontrolled rise of state-sponsored misinformation, disinformation, hate speech and propaganda, especially on social media, targeting womxn and feminists in Sudan who are stereotyped for speaking truth to power amid the constant wars, leading to unwarranted arrests, detentions and threats of violence.

On the right of access to information and the accessibility of digital-friendly tools, Mute described AI's limitations as an impediment that neither acknowledges nor addresses peculiar accessibility needs, especially those of persons living with disabilities. “The problem with AI is that it will sort of try to go to the center. And, the danger with going to the center is that it then assimilates and forces all of us to become the same which becomes extremely a challenge for people with disabilities,” Mute expressed. He further cautioned about the influence of information sources and algorithms on content that undermines difference: “Where is this data and algorithms being prepared from? They are getting the information from society. So, if society, in respect of gender is sexist, homophobic or ableist in respect of disability, if you’re not careful, what then you will have is content which undermines a difference …”

To address tech marginalization, all key players in the digital and tech sector should prioritize content moderation, language inclusion and review of the quality of information online, in order to limit misinformation and the spread of harmful narratives propagated by algorithms that facilitate the marginalization of structurally silenced communities. Algorithms, especially on social media, impede access to factual information and amplify misinformation and disinformation, as sensationalized posts or content attract the most attention and clicks while little progress is made on content moderation and the quality of information within posts. Intersectional collaboration among big tech companies, policy makers and structurally silenced groups should equally be pursued to address the adverse challenges of the widening tech marginalization posed by AI.

Palatable social technologies and innovations ought to be considered and adjusted based on diverse contexts. In addition, open and free public engagement between decision-makers and citizens will foster collaboration among multiple stakeholders to improve internet infrastructure and encourage access and accessibility across the continent's digital landscape. Lest we forget, legal and policy frameworks remain extremely critical as a means of bridging the widening technological gap.

To get a glimpse of this conversation, watch here:

Part 1:

Part 2: