Techsocietal

855 posts


@TechSocietal

A social enterprise working to reduce digital inequalities and digital harm, promoting safe digital inclusion and online freedoms for women and children.

Joined March 2021
39 Following · 267 Followers
Techsocietal @TechSocietal
Africa’s digital space is growing rapidly — and so is the need for stronger, context-aware content moderation systems.

Recent developments following Meta’s decision to end its content moderation contract with Sama in Kenya have raised important questions about how harmful content in African contexts will be identified, reviewed, and addressed at scale. This is about more than technology; it is about user safety, trust, and effective platform accountability.

The ACHPR’s Resolution 630, adopted in March 2025, also raised concerns about declining human moderation and the increasing reliance on automated systems that may struggle to understand African contexts. Our publication, “We Found No Violation: When Harm Speaks, and Platforms Don’t Understand,” highlights how gaps in moderation can weaken trust in reporting systems and leave harmful content unaddressed.

When harmful content is consistently overlooked:
📍 Victims may lose confidence in reporting systems
📍 Harmful behaviour can become normalized
📍 Trust in digital platforms and safety mechanisms weakens

As Africa’s digital ecosystem continues to grow, there is an opportunity for all stakeholders to strengthen collaborative approaches to online safety. Key areas for action include:
✔️ Invest in African-language moderation systems
✔️ Build local, culturally aware moderation teams
✔️ Ensure complex cases are reviewed by human experts — not just AI
✔️ Build responsive reporting and accountability mechanisms

A safer digital future for Africa requires systems that are not only scalable but also context-aware, inclusive, and responsive to local realities.

#OnlineSafety #ContentModeration #DigitalRights #TrustAndSafety #AfricaTech #PlatformGovernance #InformationIntegrity #TechPolicy #DigitalSafety #Techsocietal
Techsocietal @TechSocietal
A recent CNN investigation revealed a platform where non-consensual intimate images and videos of spouses were viewed over 62 million times in a single month. This is not an isolated US case. It is a global warning — including for Nigeria.

When digital platforms operate without enforceable guardrails:
⚠️ Harmful content spreads at scale
⚠️ Victims are re-traumatized without recourse
⚠️ Abuse becomes both normalized and monetized

As digital adoption accelerates, so does the risk of platform-facilitated harm. Without proactive measures, the same patterns of abuse could scale quickly in Nigeria and other emerging tech ecosystems.

What needs to change:
🔹 Government: Enforce stronger digital protection laws and adopt safety-by-design principles for platforms
🔹 Platforms: Build detection tools that prevent harm proactively, not just react after reporting
🔹 Clear reporting and takedown systems with enforceable deadlines
🔹 Real consequences for platforms that fail to act

The internet should not be a sanctuary for abuse. Stronger accountability and safety-by-design principles are not optional — they are the foundation of safe digital spaces.

#TechGovernance #OnlineSafety #DigitalRights #EndIBSA #ProtectWomen #PolicyReform #TechSocietal
Techsocietal @TechSocietal
According to the Nigerian Communications Commission, 11–16-year-olds are highly engaged with digital devices, with 93% using phones or devices regularly. A large majority (85%) browse and chat on social networking platforms daily, while 48% watch videos online every day.

While this reflects increasing access to digital opportunities, it also highlights the importance of guided and balanced use of technology during childhood. Research has associated excessive screen exposure in early childhood with developmental challenges, including reduced attention span, weaker memory and learning outcomes, and difficulties with social interaction. During early development, children benefit most from real-world interaction, play, and communication — not prolonged screen exposure.

What Parents Can Do:
🔹 Set daily screen limits based on age
🔹 Avoid screens before bedtime to protect sleep and brain function
🔹 Encourage offline activities like reading, play, and social interaction
🔹 Monitor content and use parental controls where appropriate

Technology should support your child’s growth — not replace the experiences they need to thrive.

#ChildDevelopment #DigitalSafety #Parenting #ScreenTime #OnlineSafety #TechSocietal #HealthyKids
Techsocietal @TechSocietal
We are asking whether the platforms, regulators, and policymakers responsible for protecting Nigeria's digital consumers will respond with the urgency this moment demands — or wait until the harm scales further before acting. This commentary is informed by @TechpointAfrica's investigation into the vendor onboarding processes of Glovo and Chowdeck, published in 2026. Techsocietal works on digital rights and platform accountability through advocacy, research, and capacity building. Read the full commentary at the link in our bio.
Techsocietal @TechSocietal
A recent investigation by @TechpointAfrica exposed how easily Nigeria's food delivery platforms can be infiltrated by fraudulent vendors. But there is a dimension of those findings that has not received the attention it deserves — one that sits at the heart of what it means to participate safely in Nigeria's growing digital economy. It is the question of digital trust, and the accountability gap quietly eroding it. Read the full commentary at the link in our bio.
Techsocietal @TechSocietal
In 2025, the Online Safety Community of Practice grew into a leading space for advancing policy-focused conversations and collective action on safer digital ecosystems across Africa. Throughout the year, the community convened a series of expert-led sessions exploring some of the most pressing issues shaping today’s digital landscape.

Key discussions focused on:
➡️ Advancing safer digital spaces through accountability, responsible technology, and cross-sector collaboration
➡️ Centering Afro-feminist perspectives to make gender data visible, protected, and actionable
➡️ Strengthening data justice through improved access to platform information and transparency in Africa’s digital landscape

As the landscape continues to evolve, the Online Safety Community of Practice is moving beyond dialogue toward measurable action and long-term impact. Through its Member-Led Clinics, community members are creating collaborative spaces to examine emerging challenges, exchange practical experiences, and develop context-responsive solutions informed by local realities and professional expertise.

Why this matters:
✅ Transforms operational and lived challenges into practical learning and action
✅ Strengthens the capacity of stakeholders across the ecosystem
✅ Encourages trusted, solution-oriented collaboration among diverse actors
✅ Supports the development of more effective and context-aware digital safety interventions and policies

By bringing together diverse stakeholders, the community is helping to shape stronger digital governance conversations, drive accountability, and support the development of safer, more inclusive online environments across Africa.

#OnlineSafetyAfrica #CommunityOfPractice #DigitalJustice #TechSocietal
Techsocietal @TechSocietal
The cultural dimension of image-based sexual abuse.

"I don't want my family to know. Please, they will crucify me."

This statement highlights a painful reality: many survivors of image-based sexual abuse fear judgment from their own families more than anything else. Even when help is available, cultural stigma and shame create barriers that keep survivors silent. They choose to suffer alone rather than risk being rejected by their families.

As organizations, community leaders, and individuals, we must ask ourselves:
- Are we creating spaces where survivors feel safe to speak up?
- How can we shift cultural narratives that blame victims instead of perpetrators?
- What role can we play in fostering family environments built on support, not shame?

This isn't just about legal justice — it's about social justice too. We're building a community committed to changing this narrative: MyLawbrella.

#OnlineSafety #CulturalChange #IBSA #SurvivorCenteredApproach #CommunitySupport #VolunteerWithUs
Techsocietal @TechSocietal
Posting students' photos online may seem harmless — but it can create digital trails that expose them to real-world tracking and targeting. Every image shared carries hidden details that can be pieced together.

How This Happens:
🚩 Real-time posting: Photos shared instantly can reveal a student's current location
🆔 Visible identifiers: School uniforms, crests, or landmarks can help outsiders identify the exact school
📱 Unmonitored device use: Students or staff may share content that exposes routines, schedules, or access points

When combined, these details make it easier for bad actors to track or target students.

How Schools Can Protect Students:
➡️ Avoid posting content in real time — share after events
➡️ Limit visible identifiers like school logos and surroundings
➡️ Set clear digital safety rules for staff and students
➡️ Train students on safe online behavior

Student safety doesn't stop in the classroom or at the school gate — it extends online.

#DigitalSafety #ChildProtection #OnlineSafety #DataPrivacy #Safeguarding #ProtectStudents #TechSocietal
Techsocietal @TechSocietal
Meta recently revealed that optional end-to-end encryption on Instagram will be removed, with changes taking effect from May 8, 2026. That means even chats that were previously encrypted will revert to standard, platform-readable messages.

The decision is linked to increasing pressure on platforms to better detect and respond to harmful content, particularly abuse and illegal activity. But this move comes with serious trade-offs.

Without end-to-end encryption:
- Private messages may no longer be fully secure
- Messages can be read by Meta, shared with third parties, or exposed in breaches
- User data becomes more exposed to monitoring and potential misuse
- Vulnerable users, especially women and young people, face increased risks of surveillance and exploitation

Here’s the trade-off they’re not talking about: in trying to improve safety, is privacy being weakened?

What Needs to Happen:
1. Meta must be transparent about how user data will be accessed, stored, and protected
2. Strong safeguards must be put in place to prevent abuse of user information
3. Policymakers must enforce privacy-first standards in platform design
4. Civil society must continue to hold platforms accountable

Through our Online Safety Community of Practice (CoP), we are building a space where tensions between safety and privacy are not just discussed, but examined with evidence and translated into practical responses. This moment highlights why the CoP exists: to bring together regulators, civil society, researchers, and industry actors to assess real platform decisions, understand their impact — especially on vulnerable users — and push for rights-respecting alternatives.

Interested in joining? Be part of a growing network of practitioners shaping stronger, locally grounded online safety solutions across Africa. onlinesafetycop.org
Techsocietal @TechSocietal
This Labour Day, we celebrate the people who power our mission—those working behind the scenes and on the frontlines to build a safer, more inclusive digital world. Thank you for the unwavering dedication you bring to this work every single day. Your commitment is creating a future where everyone can connect, learn, and thrive online safely and with dignity. Happy International Labour Day! #happyworkersday🇳🇬
Techsocietal @TechSocietal
The digital generation needs digital solutions 💙

Young people overwhelmingly prefer reporting cases of image-based sexual abuse ONLINE rather than in person. Why? Because digital tools provide:
- A sense of safety and control
- Anonymity when they need it
- Freedom to express themselves without judgment
- Immediate access regardless of location

This doesn't diminish the importance of offline support — it complements it. We need both. Digital platforms break down barriers of distance, transportation, time, and social pressure. They meet survivors where they are, literally and emotionally.

#DigitalJustice #OnlineSafety #IBSA #YouthEngagement #TechForGood #SurvivorSupport #VolunteerOpportunity
Techsocietal @TechSocietal
Your image belongs to YOU. Period. 📸🛡️

But in a world of deepfakes, leaks, and online manipulation, that ownership can be challenged in seconds. Sextortion and other forms of image-based abuse are real — and the “post-regret” moment can be overwhelming. But you don't have to face it alone. It's time to move from feeling vulnerable to being equipped with the knowledge to defend yourself.

Our course, "Picture Perfect: How to Defend Against Online Image Troubles!", empowers you to:
🚩 Spot manipulation tactics before it’s too late
🔒 Set digital boundaries that protect you
🛡️ Respond effectively if your images are misused
🆘 Access the right support when it matters most

Your digital safety isn’t luck — it’s knowledge.

🔗 Enroll in Techsocietal Academy today and take the course. Link in bio.

#TechsocietalAcademy #DigitalSafety #StopSextortion #OnlineSafety #PrivacyRights #DigitalWellbeing #ProtectYourselfOnline
Techsocietal @TechSocietal
She can't shape AI from the offline side of the digital divide.

Today marks International Girls in ICT Day. Yes, we celebrate girls. But celebration without action is noise.

23% of Nigerian women still have no internet access. Not slow. Not expensive. None.

Our report breaks down why: gender-blind policies that talk about "digital inclusion" but forget to ask whose inclusion.

Let this day be more than a hashtag. Let it be the day we decide to build a digital Nigeria that actually works for every girl.

🔗 Read the full publication here: tinyurl.com/5wa8x2xs
Techsocietal @TechSocietal
Today is International Girls in ICT Day — a day to recognize what girls can build when given the tools. But here is the reality:
📉 23% of Nigerian women are offline.
📉 Our ICT policies? Mostly gender-neutral.

Our report, "Bridging the Gender Digital Divide," breaks down what's missing — and how community-centered connectivity can address it.

Girls can't shape the digital future if they can't log on. Let's build policies that include everyone.

🔗 Read the full publication here: tinyurl.com/5wa8x2xs
Techsocietal @TechSocietal
Content moderation isn’t just broken — it’s biased.

At #DRIF26, Techsocietal hosted a panel titled “We Found No Violation: When Harm Speaks and Platforms Don’t Understand.” The conversation highlighted a critical gap in today’s digital ecosystem: harms are occurring and being reported, yet still go unacknowledged by platforms.

Key takeaways from the session:
💡 Harmful content often slips through moderation systems, especially when there’s no text, or when local languages and coded expressions are used.
💡 Reporting mechanisms are ineffective, slow, or unclear, leaving victims without real remedies.
💡 Platforms prioritize engagement and profit over safety, amplifying harmful content instead of curbing it.
💡 Women and children are disproportionately affected, with online harms often escalating into offline consequences.

So, the question is:
👉 How do we promote digital inclusion — especially for women and marginalized groups — while addressing the very real harms pushing them offline?

The way forward is clear:
✔️ Contextual and localized moderation systems
✔️ Stronger policy frameworks and enforcement
✔️ User-centered safety design and accessible reporting tools
✔️ Investment in African datasets, languages, and research
✔️ Collective action across civil society, governments, and regional blocs

Because when platforms say, “we found no violation,” what they are really saying is: “we don’t understand the harm.”

#DigitalRights #OnlineSafety #ContentModeration #TechAccountability #PlatformAccountability #DigitalInclusion #InternetSafety #TrustAndSafety