As a law student and external expert for one of the leading non-profit organizations based in Kampala that leverages technology for an inclusive, rights-based society under the Our Voices, Our Futures (OVOF) project, I recently contributed to a policy brief titled Safeguarding Digital Rights: Policy Recommendations to Protect Structurally Silenced Women in Uganda from Surveillance, Privacy Violations, and Data Misuse. The document, developed through legal analysis and focus group discussions (FGDs) with 15 structurally silenced women—Women Human Rights Defenders (WHRDs), women with disabilities, sex workers, and LGBQTI individuals—revealed a stark reality: Uganda’s digital landscape is a battleground where marginalized women face pervasive surveillance, privacy breaches, and data misuse.
This reflection explores the legal issues, FGD findings, and proposed solutions, weaving together my legal training and commitment to intersectional advocacy to underscore the urgent need for reform. The policy brief begins by unravelling the legal landscape: it is anchored in Uganda’s 1995 Constitution, which guarantees equality under Article 21 and privacy under Article 27 and prohibits interference with personal communications without lawful justification.
Yet these protections are undermined by a web of deficient laws and weak enforcement. The Regulation of Interception of Communications Act (RICA), 2010, permits state interception of communications in the name of national security but lacks clear criteria for court warrants, proportionality safeguards, post-surveillance notification, and independent oversight.
This violates the principle of legality, a cornerstone of the rule of law that demands clear, transparent legal frameworks. The Data Protection and Privacy Act, 2019, outlines principles like fair processing and data minimization but falls short in implementation, particularly in addressing gender-specific abuses such as phishing, revenge porn, and gendered disinformation.
The Anti-Homosexuality Act (AHA), 2023, further exacerbates harm, with Sections 2(1), 2(3), and 11 criminalizing LGBQTI identities and enabling discriminatory state overreach. I recognized that these legal gaps license arbitrary state power, which disproportionately affects structurally silenced women. The FGDs with 15 women brought these legal shortcomings to life. A WHRD’s haunting words, “I can’t trust my phone and my shadow; every call feels like a trap,” captured the chilling effect of RICA’s unchecked surveillance.
The absence of oversight mechanisms, as highlighted by Tomiwa Ilori (2024), falls short of international standards requiring legality, legitimacy, proportionality, and necessity. Similarly, enforcement of the AHA, linked to nine LGBQTI evictions in May 2025 as documented by the human rights body HRAPF, underscores how discriminatory laws fuel digital coercion.
The heart of the policy brief lies in the FGDs, which revealed a hostile digital environment for structurally silenced women. Women reported self-censoring out of fear of election-related surveillance, with one participant stating, “I stopped posting about my identity; the state might use it against me.” This fear, intensified by the approach of the 2026 elections, reflects a broader erosion of freedom of expression.
A case study of a WHRD targeted by a fake social media account, which led to blackmail, crystallized these findings. Women also pointed to the complicity of telecommunications companies, stating, “Telecoms help the state spy on our calls and messages,” a concern that aligns with my analysis of corporate accountability under the Advocates Act, which requires professionals to uphold the public interest.
The policy brief proposes a multi-faceted approach—legal reform, enforcement, and community empowerment—clustered by stakeholder group to address these challenges. For the government, amending RICA to mandate judicial oversight and revising the Data Protection Act to provide gender-specific protections are critical first steps. Repealing the AHA’s discriminatory provisions, as demanded by FGD participants (“Laws like AHA make us targets”), would curb digital coercion.
Training law enforcement officers in gender-sensitive practices would address the humiliation of password demands reported in the FGDs, a solution informed by my study of ethical advocacy. Civil society, led by WOUGNET and HRAPF, should launch digital literacy programs for women focused on phishing prevention and encryption. Making these programs accessible to women with disabilities, through sign language and braille, would address their exclusion. Advocacy campaigns that amplify FGD voices can build pressure for reform.
The private sector, particularly telecommunications companies, must adopt transparent data policies and provide encryption tools to counter the distrust voiced in the FGDs. Agencies such as the Uganda Communications Commission (UCC) and the National Information Technology Authority (NITA-U) should develop ethical guidelines for AI and biometrics, responding to participants’ fears of “AI tracking.” These recommendations, grounded in the FGDs and legal principles, aim to empower marginalized women and align with constitutional and international standards.
This project was a profound intersection of my legal education and advocacy. Analyzing RICA and AHA through the lens of constitutional law deepened my understanding of judicial checks on state power. The FGDs taught me the human cost of legal gaps, reinforcing the importance of intersectional advocacy, a principle I explored in my work on feminist technology for the Centre for Multilateral Affairs. As a consultant, I learned to balance legal rigor with grassroots voices, ensuring recommendations are both actionable and inclusive.
The policy brief is a call to action to safeguard the digital rights of structurally silenced women in Uganda. The FGDs illuminated a crisis of fear and distrust, exacerbated by legal deficiencies in RICA, the Data Protection Act, and the AHA. As a law student, I see this as a mandate to advocate for reforms that uphold the rule of law and equality. By amplifying the voices of 15 women, the brief charts a path toward a safer, more inclusive digital future, urging stakeholders to act before the 2026 elections escalate these violations.