You open your phone.
You scroll the feed.
You think you’re choosing what you see.
But behind every swipe is a system making choices for you: what’s visible, what’s hidden, what’s trending, and who disappears.
The truth? The feed is not neutral.
It’s coded with bias.
And for many African creators, that bias feels like digital erasure.
How the Algorithm Actually Works
Social media platforms run on algorithms: mathematical systems that decide what content you see based on what they think you’ll like, click, or engage with.
But what we’re rarely told is this:
Algorithms reflect the values of the people and systems that built them.
They’re not objective. They’re trained on:
• Biased data
• Popular behaviours (often Western or white-centric)
• Culturally dominant aesthetics
• Commercial priorities over community realities
So even though the feed feels personalised, it’s actually predictive, and that prediction is rooted in power.
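To make that concrete, here is a minimal, hypothetical sketch of engagement-based ranking. The function names and scoring logic are illustrative only, not any platform’s actual code; real ranking systems are far more complex. The core idea is the same, though: posts are sorted by a predicted engagement score, so whatever the model learned to reward rises to the top, and everything else sinks.

```python
# Minimal sketch of engagement-prediction ranking (illustrative only).
# predict_engagement() stands in for a model trained on past clicks/likes;
# a real model inherits whatever its training data over- or under-represents.

from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str


def predict_engagement(user_history: list[str], post: Post) -> float:
    """Hypothetical scorer: rewards posts that resemble what the user
    (and users like them) engaged with before."""
    post_words = set(post.text.lower().split())
    history_words = set(" ".join(user_history).lower().split())
    return len(post_words & history_words) / max(len(post_words), 1)


def rank_feed(user_history: list[str], posts: list[Post]) -> list[Post]:
    # The "personalised" feed is just posts sorted by predicted engagement.
    return sorted(
        posts,
        key=lambda p: predict_engagement(user_history, p),
        reverse=True,
    )
```

Notice that nothing in this sketch measures quality, truth, or importance. It measures similarity to past engagement, which is exactly how existing patterns of attention get reinforced.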
What Gets Erased
When the algorithm is designed to prioritise certain content, it automatically suppresses others—especially:
• Non-Western languages and captions
• Black and African political content
• Dark-skinned creators in beauty and fashion
• Posts about protest, injustice, or activism
• Rural or non-metropolitan content from African regions
Your content might be powerful.
Your voice might be bold.
But if it doesn’t “fit” the model, it won’t be seen.
This isn’t a tech glitch. It’s digital inequality.
Why It Matters for African Creators
Because we’re already battling structural erasure offline. Now it’s happening digitally too.
• Our cultures get misrepresented
• Our voices get silenced
• The platforms we rely on deprioritise our communities
• Our stories are labelled “less engaging” because they’re “less relatable” to global audiences
The algorithm becomes a border. A gate. A filter that decides if you deserve to be seen.
And if we don’t question it, we end up performing for the algorithm instead of creating from truth.
How to Reclaim Digital Visibility
- Understand Algorithmic Bias
Educate yourself. Algorithms are built by people. And people have bias. Know what gets rewarded and what gets suppressed.
- Diversify Your Platforms
Don’t rely on just one app to tell your story. Build outside the feed: newsletters, communities, and direct-to-audience tools give you more control.
- Post With Power
Use bold captions. Educate your audience about suppression. Tag and support other creators affected by bias.
- Demand Transparency
Push for platform accountability. Join campaigns. Ask: Who gets recommended? Who gets paid? Who gets erased?
- Build Afrocentric Tech
It’s time to design systems that reflect our languages, aesthetics, and rhythms, not just copy others.
The Feed Reflects the System
We were taught the internet was free and open. But the truth is: it’s curated. Structured. Filtered.
The algorithm is not neutral.
It’s political.
It’s racial.
It’s cultural.
And if we don’t challenge it, it will continue to control the narrative and delete those who don’t conform.
Want to Learn More?
• Read our [Platform Power Glossary]
• Sign up for Claim the Code workshops on digital sovereignty
• Follow @ClaimTheCode to unlearn the algorithm and reclaim your narrative
We create. We resist. We recode.
Because being seen should never depend on a machine that doesn’t see you.


