HER INTERNET LAUNCHES RESEARCH REPORT ON NAVIGATING ALGORITHMS IN RELATION TO STRUCTURALLY SILENCED COMMUNITIES IN UGANDA.

In a digitally interconnected world, social media algorithms play a pivotal role in shaping our online experiences, from the content we see to the communities we engage with. Understanding and navigating these algorithms has become essential to fostering visibility, representation, allyship, advocacy, organizing and community building in this information era. While we celebrate these opportunities and advantages, algorithms have also had significant detrimental impacts, including targeted online harassment, censorship, stoking of fear, mental health challenges, limited distribution of content, exclusion, and misinformation and disinformation that has heavily manipulated public opinion, in turn influencing laws and policies that especially target structurally silenced communities such as LGBTQ+ persons, according to HER Internet’s latest research report.

Titled “Navigating algorithms: the case of structurally silenced communities in Uganda”, the report was launched by HER Internet, with support from the Mozilla Foundation, at an event held in Kampala on Thursday, 4th April 2024, bringing together 25 stakeholders from civil society organizations, activists and researchers. The purpose of the event was to unpack the findings of the research report, offer insight into what is informing organizing and community building for structurally silenced communities in Uganda, and interrogate the extent to which algorithms influence these actions.

In her welcome remarks, Mulungi Sanyu, the Communications and Advocacy Lead at HER Internet, stated that while the internet is no longer a luxury but a necessity in the modern world, not everyone can reap its benefits or gain equal access to it, especially womxn in certain regions and communities who face significant barriers to utilizing its full potential. Sanyu outlined the scope of HER Internet’s work, guided by its mission to create opportunities and to equip womxn with digital literacy, cybersecurity information and safety skills that foster a safer online environment as well as representation. She also called for reaffirmation of, and commitment to, HER Internet’s vision of a world where every womxn has the knowledge, opportunities and resources to thrive in the digital era.

A presentation of the key findings of the research report highlighted that the study participants were drawn, in collaboration with partner organizations, from four regions across the country: Central, Northern, Eastern and Western Uganda. A total of 8 Key Informant Interviews (KIIs) and 65 participants in 4 Focus Group Discussions (FGDs) voluntarily contributed to the body of the research. According to the Project Research Consultant, Juliet Nanfuka, social media platforms have demanded and collected data and information in various forms from internet users, through consent given by ticking the small boxes attached to terms of service without thorough knowledge and understanding. This data feeds into the tools known as “algorithms”, which formed the basis of this study.

Uganda has presented a very interesting space for such studies because of its undefined stance on social media usage. While social media platforms and their importance are appreciated by users online, there have been blockages of various sites, internet shutdowns and restrictions on platforms like Facebook. Juliet explained in her presentation that even though Uganda currently has a population of over 40 million people, only about 2.6 million, equating to 5.3% of the total population, use social media, yet the decisions made there impact the broader society by shaping the narratives, news and information that drive us today. “A lot of decisions being made are a direct outcome of narratives fueled on social media by a very small percentage of people but boosted by the performance of algorithms. So, we need to understand the interplay between what happens in the digital society and its impact beyond. As a country, we are caught in a very interesting space where we are letting external influences dictate how we engage with each other as citizens,” Juliet noted.

The research report, based on extensive data collection and analysis, delves into the experiences, perceptions and challenges faced by structurally silenced communities in interacting with digital platforms, particularly social media. Additionally, it brings to light the dual nature of how social media algorithms impact the digital rights of privacy, safety and expression, stemming from biases and inequalities. It also suggests actionable recommendations and strategies that can be implemented by social media platforms, funders and community-based organizations to influence the workings of algorithms in favor of structurally silenced communities as they advance their respective advocacy efforts in diverse fields. Some of these recommendations include: improving content moderation practices, increasing algorithmic transparency, building evidence-based information consistent with concerns emerging from algorithms, and conducting security assessments that recognize the influence of algorithms as a safety gap. Continuous funding will also contribute to efforts to examine how platforms are reshaping the lives and practices of internet users in restrictive countries like Uganda.

A challenge underlined in this report was the language barrier, as it was difficult to explain the term “algorithms” to the study participants in local languages. This complication also affects others outside the digital and technological landscape who use English as a language of operation, which can distort communication. “The word ‘algorithms’ in itself doesn’t exist in any local language here in Uganda. Language is in a constant state of evolution, but what this research shows is that even in the deeper layers of the internet, we need to find language that speaks more to us as individuals, outside of the language of Silicon Valley, to better understand and appreciate what is happening in these spaces. Right now, there is a vast disconnect in our understanding of just the word algorithms. And as a consequence, there is a vast disconnect in how we engage with those algorithms or understand what they are doing for and against us, ultimately impacting our community building and organizing in Uganda,” she expressed.

In a session on regional experience sharing and feedback on the research findings, representatives of the study participants and other attendees shared their experiences of the impacts of social media algorithms and their hopes for how this report can be impactful.

A representative from Mbale in the Eastern Region shared, “We had a bit of a language barrier, which was settled afterwards, and the participants were able to express what they were going through with social media and the way forward for the project. The storytelling was good because people shared personal stories about what they have seen, what they are going through and the impact even on instant messaging apps like WhatsApp groups and other online groups. The research study was quite good and left an impact. We hope that the way forward is that HER Internet will continue to partner with different organizations and continue to share knowledge with our community members on the workings of social media platforms and how to communicate.”

A participant from Mbarara in the Western Region shared, “This research was really important because we got to realize that our community members had fallen victim when it comes to use of social media… Many of the participants had questions about algorithms, why they were being followed by people they did not know and how they contribute to online violence like blackmail, which made up most of the testimonies in the room. We also discussed how most community members were outed because of social media and not because of the physical environment they live in. Our recommendation will be that we still need more engagements around social media because it is part of our stress relief, as we heard from the participants. They use social media to relieve themselves from stress and other mental health challenges but unfortunately, they end up being victims of circumstances.”

A mobilizer from Gulu in Northern Uganda said, “Actually, in my region, the experiences are similar to those of other regions. We had some victims who were actually arrested for their use of TikTok, in retaliation for abuse and attacks triggered by other people on the platform. It is through this research that we got to know that these algorithms and policies exist, raising the question of how we display our work there without compromising ourselves and others. Our recommendation is that we continue sharing this information, especially with those deep down in the villages, and bring them on board so that we can move together.”

A participant who took part in the FGD in Kampala said, “Algorithms have been very proactive in spreading misinformation and disinformation. We are in the era of TikTok where someone is comfortable putting up a list of different identities with no worries about the consequences of their actions. Stigma and discrimination that we had already witnessed or experienced were also part of the discussion. If they (algorithms) were really in support of us by spreading the correct information, we would definitely be on the other side of where we are right now, because algorithms do not necessarily fact-check. We would not be having these issues. With the level of misinformation and silencing of voices, this is indeed a most timely project. I think that we have been so focused on the physical violence that we don’t notice the extremes of the digital violence that is ongoing. For recommendations, this project is the first step. I also think that silence is not an option, as we see where we have ended up. It is possible that we are going to have a trail of collateral damage along the way, but in these steps that we are taking, we need to expose the misinformation and disinformation out there so we can create an equal and just internet for us all.”

A key informant who also contributed to this study said, “One of the things that we can look at and commend as our way forward is to figure out how we link algorithms into the existing loopholes that have been provided in the ever-changing policy and legal climate. How do we thrive off the current technicalities and make them work in our favor, for us as a community to shift narratives and conversations online? It is through collaboration among organizations and campaigns. We should use our enemy’s arsenal for our own good.”

The Executive Director at HER Internet, Sandra Kwikiriza, gave a speech in which she expressed her pride in and gratitude towards donors, partners and staff for their dedication and support to a cause that fuels HER Internet’s work daily, despite current challenges like the existing oppressive laws and policies. “Whether it’s providing digital literacy training or advocating for the rights of minority groups in the digital space, our efforts are making a tangible difference. Through stories of empowerment and resilience, we can see firsthand the transformative power of digital rights advocacy.” She reiterated HER Internet’s unwavering commitment to empowering structurally silenced communities in advocating for their digital rights and urged more stakeholders to get on board with HER Internet’s mission. “Whether it’s raising awareness, sharing knowledge, or collaborating with us, your contribution can help us amplify our impact and create a more just and inclusive digital world for all,” Sandra stated in closing, stressing HER Internet’s commitment to the advancement of digital rights for structurally silenced communities in the country and across borders.

A keynote speech was delivered by Isabella Akiteng, an independent consultant in governance, gender enthusiast and process facilitator. Guided by the quote “use the enemy’s arsenal for our own good”, Isabella reiterated the urgency for structurally silenced communities to embrace and utilize the same tools, like algorithms, that have been harnessed by anti-rights groups, in order to counter their opposition. “The conversation around algorithms can be unique in the sense that they can be beautiful for all the reasons that they are bad; if we remove the negativity around those algorithms, they make utter sense. They make perfect sense for organizing.” Isabella also drew attention to the role that algorithms play in influencing the minds and opinions of the general public, the justice system and policy makers, raising safety concerns in both virtual and physical spaces, as they can create life-and-death situations for individuals and communities, especially where narratives are negative. “If the content around a particular theme is not deliberately built, then the narrative is negative. And therefore, there is a line of threat based on the algorithm. The conversations around algorithms are matters of life and death, as they go beyond online platforms within the context of Uganda. For all the conversations that we can have about their positives and advantages, they can become a death sentence for communities in Uganda,” she stressed. Isabella saluted HER Internet for investing in research that makes all the difference within the current context because it provides an alternative narrative to the one that is out there, backed by the algorithm. “Now more than ever, theorizing makes sense because this content provides a counter narrative to any other narrative on the spectrum of social media, and that’s one of the ways that we push back: by providing and serving the platforms with counter narratives over what the algorithm may provide, and instead beginning to safe keep.”

Key takeaways from this event included: a demand for transparency from platforms regarding algorithmic processes and data usage; and awareness and sensitization campaigns to empower users with knowledge about how algorithms function and strategies for navigating them, so as to shift narratives and content related to structurally silenced communities by sharing the right information far and wide through the power of partnerships and collaborations. There should also be intensified advocacy efforts for algorithmic fairness, diversity and accountability, prompting increased community engagement in discussions about algorithms and their impact.

Lastly, the existence of clear policies and regulations with regard to algorithms will help safeguard the rights and well-being of marginalized communities, including the LGBTQ+ community. This involves pushing for laws that hold platforms accountable for their algorithms and content moderation practices, ensuring that they don’t perpetuate harmful narratives or discriminate against certain groups. To read and download the research report, visit: https://shorturl.at/hjY14