HER INTERNET LAUNCHES RESEARCH REPORT ON NAVIGATING ALGORITHMS IN RELATION TO STRUCTURALLY SILENCED COMMUNITIES IN UGANDA.


In a digitally interconnected world, social media algorithms play a pivotal role in shaping our online experiences, from the content we see to the communities we engage with. Understanding and navigating these algorithms has become essential to fostering visibility, representation, allyship, advocacy, organizing and community building in this information era. While we celebrate these opportunities and advantages, algorithms have also had significant detrimental impacts, such as targeted online harassment, censorship, stoking of fear, mental health challenges, limited distribution of content, exclusion, and misinformation and disinformation that has heavily manipulated public opinion, further influencing laws and policies that target structurally silenced communities such as LGBTQ+ persons, according to HER Internet’s latest research report.

Titled “Navigating algorithms: the case of structurally silenced communities in Uganda”, the research report was launched by HER Internet, with support from the Mozilla Foundation, at an event in Kampala on Thursday, 4th April 2024. The launch brought together 25 stakeholders from civil society organizations, activists and researchers to unpack the findings of the report, which offers insight into what is informing organizing and community building for structurally silenced communities in Uganda, and to interrogate the extent to which algorithms influence these actions.

In her welcome remarks, Mulungi Sanyu, the Communications and Advocacy Lead at HER Internet, said that while the internet is no longer a luxury but a necessity in the modern world, not everyone can reap its benefits or gain equal access to it, especially womxn in certain regions and communities who face significant barriers to utilizing its full potential. Sanyu outlined the scope of HER Internet’s work, guided by its mission to create opportunities and to equip womxn with digital literacy, cybersecurity information and safety skills that foster a safer online environment and greater representation, and called for renewed commitment to HER Internet’s vision of a world where every womxn has the knowledge, opportunities and resources to thrive in the digital era.

A presentation of the key findings highlighted that the study participants were drawn from four regions across the country: Central, Northern, Eastern and Western Uganda, in collaboration with partner organizations. A total of 8 Key Informant Interviewees and 65 participants in 4 Focus Group Discussions (FGDs) voluntarily contributed to the body of the research. According to the Project Research Consultant, Juliet Nanfuka, social media platforms have demanded and collected data and information in various forms from internet users, through consents given without prior knowledge by ticking the small boxes attached to terms and conditions. This data feeds the tools known as “algorithms”, which formed the basis of this study.

Uganda presents a very interesting space for such studies because of its undefined stance on social media usage. While the population appreciates social media, there have been blockages of various sites, internet shutdowns and restrictions on platforms like Facebook. Juliet further explained in her presentation that out of a population of over 40 million people, only about 2.6 million, equating to 5.3% of the total population, use social media, yet the decisions made on these platforms impact the broader society by shaping the narratives, news and information that drive public discourse today. “A lot of decisions being made are a direct outcome of narratives fueled on social media by a very small percentage of people but boosted by the performance of algorithms. So, we need to understand the interplay between what happens in the digital society and its impact beyond. As a country, we are caught in a very interesting space where we are letting external influences dictate how we engage with each other as citizens,” Juliet noted.

The report, which is based on extensive data collection and analysis, delves into the experiences, perceptions and challenges faced by structurally silenced communities in interacting with digital platforms, particularly social media. It also brings to light the dual nature of how social media algorithms impact the digital rights of privacy, safety and expression, stemming from biases and inequalities, and it suggests actionable recommendations and strategies that social media platforms, funders and community-based organizations can implement to influence the workings of algorithms in favor of structurally silenced communities as they advance their respective advocacy efforts in diverse fields. These recommendations include improved content moderation practices, increased algorithmic transparency, building evidence-based information consistent with concerns emerging from algorithms, and conducting security assessments that recognize the influence of algorithms as a safety gap. Continuous funding will also contribute to efforts to examine how platforms are reshaping the lives and practices of internet users in restrictive countries like Uganda.

A key challenge underlined in this research report was the language barrier, as it was difficult to explain the term “algorithms” to study participants in the local languages. This complication also affects others outside the digital and technological landscape who use English as a language of operation, which distorts communication. “The word ‘algorithms’ in itself doesn’t exist in any local languages here in Uganda. Language is in a constant state of evolution, but what this research shows is that even in the deeper layers of the internet, we need to find language that speaks more to us as individuals outside of the language of Silicon Valley to better understand and appreciate what is happening in these spaces. Right now, there is a vast disconnect in our understanding of just the word algorithms. And as a consequence, there is a vast disconnect in how we engage with those algorithms or understand what they are doing for and against us, ultimately impacting our community building and organizing in Uganda,” she expressed.

During a session on regional experience sharing and feedback on the research project, representatives and other attendees shared their experiences with the research study, the impacts of social media algorithms and their hopes for how this project can make a difference.

A representative from Mbale in the Eastern Region shared, “We had a bit of a language barrier, which was settled, and the participants were able to express what they were going through with social media and the way forward for the project. The storytelling was good because people shared personal stories about what they have seen, what they are going through and the impact even on instant messaging apps like WhatsApp groups and other online groups. The research study was quite good and left an impact. We hope that the way forward is that HER Internet will continue to partner with different organizations and continue to share knowledge with our community members on the workings of social media platforms and how to communicate.”

A participant from Mbarara in the Western Region shared, “This research was really important because we got to realize that our community members had fallen victim when it comes to use of social media… Many of the participants had questions about algorithms, why they were being followed by people they did not know and how they contribute to online violence like blackmail, which made up most of the testimonies in the room. We also discussed how most community members were outed because of social media and not because of the physical environment that they are living in. Our recommendation is that we still need more engagements around social media because it is part of our stress relief, as we heard from the participants. They use social media to relieve themselves from stress and other mental health challenges but unfortunately, they end up being victims of circumstances.”

A mobilizer from Gulu in Northern Uganda said, “Actually, in my region, the experiences are similar to those of other regions. We had some victims who were actually arrested for the use of TikTok in retaliation for abuse and attacks triggered by other people on the platform. It is through this research that we got to know that these algorithms and policies exist, raising the question of how we display our work there without compromising ourselves and others. Our recommendation is that we continue sharing this information, especially with those deep down in the villages, and bring them on board so that we can move together.”

A participant who took part in the FGD in Kampala said, “Algorithms have been very proactive in spreading misinformation and disinformation. We are in the era of TikTok where someone is comfortable putting up a list of different identities with no worries about the consequences of their actions. The stigma and discrimination that we had already witnessed or experienced were also part of the discussion. If they (algorithms) were really in support of us by spreading the correct information, definitely, we would be on the other side of where we are right now because algorithms do not necessarily fact check. We would not be having these issues. With the level of misinformation and silencing of voices, this project is indeed timely. I think that we have been so focused on the physical violence that we don’t notice the extremes of the digital violence that is ongoing. For recommendations, this project is the first step. I also think that silence is not an option as we see where we have ended up. It is possible that we are going to have a trail of collateral damage along the way but in these steps that we are taking, we need to expose the misinformation and disinformation out there so we can create an equal and just internet for us all.”

A key informant interviewee who also contributed to this study said, “One of the things that we can look at and commend as a way forward is to figure out how we link algorithms to the existing loopholes that have been provided in the ever-changing policy and legal climate. How do we thrive off the current technicalities and make them work in our favor, for us as a community to shift narratives and conversations online? It is through collaboration among organizations and campaigns. We should use our enemy’s arsenal for our own good.”

The Executive Director at HER Internet, Sandra Kwikiriza, gave a speech in which she expressed her pride in and gratitude towards donors, partners and staff for their dedication and support to a cause that fuels HER Internet’s work daily, despite current challenges like the existing oppressive laws and policies. “Whether it’s providing digital literacy training or advocating for the rights of minority groups in the digital space, our efforts are making a tangible difference. Through stories of empowerment and resilience, we can see firsthand the transformative power of digital rights advocacy.” She reiterated HER Internet’s unwavering commitment to empowering structurally silenced communities to advocate for their digital rights and urged more stakeholders to get on board with HER Internet’s mission. “Whether it’s raising awareness, sharing knowledge, or collaborating with us, your contribution can help us amplify our impact and create a more just and inclusive digital world for all,” Sandra stated in closing, stressing HER Internet’s commitment to the advancement of digital rights for structurally silenced communities in the country and beyond its borders.

A keynote speech was delivered by Isabella Akiteng, an independent consultant in governance, gender enthusiast and process facilitator. Guided by the quote “use the enemy’s arsenal for our own good”, Isabella reiterated that there is an urgency for structurally silenced communities to embrace and utilize the same tools, like algorithms, which have been harnessed by anti-rights groups, in order to counter their opposition. “The conversation around algorithms can be unique in a sense that they can be beautiful, for all the reasons that they are bad; if we remove the negativity around those algorithms, they make utter sense. They make perfect sense for organizing.” Isabella also drew attention to the role that algorithms play in influencing the minds and opinions of the general public, the justice system and policy makers, raising safety concerns in both virtual and physical spaces, as they can create life-and-death situations for individuals and communities, especially in cases where narratives are negative. “If the content around a particular theme is not deliberately built, then the narrative is negative. And therefore, there is a line of threat based on the algorithm. The conversations around algorithms are matters of life and death as they go beyond online platforms within the context of Uganda. For all the conversations that we can have about their positives and advantages, they can become a death sentence for communities in Uganda,” she stressed. Isabella saluted HER Internet for investing in research that makes all the difference within the current context because it provides an alternative to the narrative that is out there, backed by the algorithm. “Now more than ever, theorizing makes sense because this content provides a counter narrative to any other narrative on the spectrum of social media, and that’s one of the ways that we push back: by providing and serving the platforms with counter narratives over what the algorithm may provide, and instead begin to safe keep.”

Key takeaways from this event included: a demand for transparency from platforms regarding algorithmic processes and data usage; awareness and sensitization campaigns to empower users with knowledge about how algorithms function and strategies for navigating them, in order to shift narratives and content related to structurally silenced communities by sharing the right information far and wide through the power of partnerships and collaborations; and intensified advocacy for algorithmic fairness, diversity and accountability, prompting increased community engagement in discussions about algorithms and their impact.

Lastly, clear policies and regulations on algorithms will help safeguard the rights and well-being of marginalized communities, including the LGBTQ+ community. This involves pushing for laws that hold platforms accountable for their algorithms and content moderation practices, ensuring that they do not perpetuate harmful narratives or discriminate against certain groups. To read and download the research report: https://shorturl.at/hjY14

Navigating Algorithms: The Case for Structurally Silenced Communities - Research Report 2024.

Executive Summary.

At a time when social media has become the backbone of a large part of digital society, it is important to understand the glue that holds it together. While the different social media platforms offer a range of spaces for networking, research, knowledge generation, kinship and entertainment, each also offers a distinct value that meets the needs of a diversity of users.

However, many users report a shift in social media from a fun and carefree space to one that is “polluted” and “restrictive”, alongside concerns about increasing censorship, harassment, stalking and discrimination. Often, the violators hide behind keyboards and, in some cases, vague policies that shield perpetrators more than they protect victims.

While social media has become an open house for many, it remains exclusive for some. In Uganda, the cost of data continues to serve an exclusionary function for a population that is yet to meet the affordability target, under which 5GB of mobile broadband data is priced at 2 percent or less of average monthly income, as envisioned by the Alliance for Affordable Internet (A4AI) and endorsed by the United Nations Broadband Commission.
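As a purely illustrative check of that affordability threshold, the short sketch below applies the 2 percent rule to hypothetical figures; the income and bundle price are invented for demonstration and are not drawn from the report.

```python
# Illustrative affordability check based on the 2% target described above.
# The income and bundle price below are hypothetical placeholders, not report data.

avg_monthly_income_ugx = 500_000   # hypothetical average monthly income (UGX)
price_5gb_bundle_ugx = 25_000      # hypothetical price of a 5GB mobile data bundle (UGX)

share_of_income = price_5gb_bundle_ugx / avg_monthly_income_ugx
meets_target = share_of_income <= 0.02  # affordable if 2% or less of monthly income

print(f"5GB costs {share_of_income:.1%} of monthly income; meets target: {meets_target}")
```

With these placeholder numbers the bundle takes 5 percent of monthly income, well above the 2 percent threshold, which is the kind of gap that functions as exclusionary.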

Meanwhile, the weaponisation of social media by the state, including through shutdowns (Ugandan users currently do not have access to Facebook following its shutdown in February 2021) and the use of restrictive policies, still hinders the full utilisation of online spaces. However, the scrapping of Section 25 of the Computer Misuse Act, which defined offensive communication as the “willful and repeated use of electronic communication to disturb or attempt to disturb the peace, quiet or right of privacy of any person with no purpose of legitimate communication”, offered users in the country some relief and allowed them to regain some level of trust in the use of platforms.

However, this trust is not enjoyed by all in the country. For the Lesbian, Gay, Bisexual, Trans and Queer (LGBTQ+) community, concerns remain rife due to continued and rising levels of homophobic rhetoric and bias online, most especially in the wake of the resurfaced Anti-Homosexuality Act (2023).

Public discourse online is characterised by misinformation and disinformation, virality tactics and click-baiting, which have detrimental consequences that further subjugate the LGBTQ+ community. Meanwhile, concern that social media algorithms reinforce these narratives is also high. Algorithms, fueled by user behaviours and interactions with content, create ever deeper channels for narratives to sink into popular culture both online and offline.
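To make that dynamic concrete, the minimal sketch below shows how engagement-weighted ranking, of the general kind described here, can surface provocative content ahead of quieter posts. The weights, example posts and scoring function are invented for illustration and do not represent any platform’s actual algorithm.

```python
# A minimal, hypothetical sketch of engagement-based feed ranking.
# Weights and example posts are invented; no real platform's system is shown here.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Hypothetical weighting: shares and comments count more than likes,
    # so content that provokes strong reactions rises fastest.
    return post.likes * 1.0 + post.comments * 2.0 + post.shares * 3.0

feed = [
    Post("Community health update", likes=40, comments=5, shares=2),
    Post("Inflammatory rumour about a minority group", likes=30, comments=40, shares=25),
]

# Ranking purely by engagement surfaces the rumour first, regardless of accuracy --
# the reinforcement dynamic raised as a concern above.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>6.1f}  {post.text}")
```

Because interactions feed back into what gets shown next, a post that wins this kind of ranking attracts still more interactions, deepening the channel through which its narrative travels.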

As HER Internet, it is in recognising these concerns that we consider the investigation of these interactions (of platforms, laws and users) a necessity, especially as platforms reduce levels of access to data and as civic spaces online and offline shrink for marginalised communities.

The goal of this report is to offer insight into what is informing LGBTQ+ organising and community building in Uganda and the extent to which algorithms influence these actions. The report gives a background on the general social media landscape in the country and reviews global trends in algorithmic studies. It will serve as an entry point for further studies in this arena at a time when social media companies are tightening their grip on data that would otherwise help address the concerns held by marginalised communities. Concurrently, growing concerns about content moderation practices, the increased pace at which online communication travels, and the absence of adequate safeguards, both online and offline, all further reinforce the need to build an evidence base upon which progressive policy interventions can be established and pursued by platforms and policy makers.

We appreciate the support of the Mozilla Africa Innovation Mradi: In Real Life (IRL) Fund, through which we have been able to tackle these questions across Uganda. In doing so, we have developed a set of recommendations that we hope will influence change and an appreciation of the role that lived human experiences play in informing how platforms can work better, especially for marginalised and vulnerable communities often relegated to the sidelines both offline and online.

To download and read a copy: https://www.herinternet.org/wp-content/uploads/2024/04/HER-INTERNET-REPORT-April-4.pdf