Online safety and awareness
- Equal Lives

- Jan 14

I recently came across a video clip from a 90s talk show, where Bill Gates talked about the internet. The host joked that we didn’t really need this new invention because we already had radio, TV, and the ability to record shows.
Watching that now feels surreal. It’s easy to forget there was a time when going online felt shiny, exciting, and full of possibility.
Today, the internet is woven into almost every part of our lives. It keeps us connected (pun intended), helps us stay independent, and opens doors to communities and opportunities we may not have offline. But as much as the internet has become essential, it also comes with risks, especially on social media, where algorithms quietly shape what we see.
Algorithms are designed for engagement, not wellbeing. And for Disabled people, who are already more likely to face digital exclusion, online abuse, and algorithmic bias, staying safe online requires both awareness and practical steps.
Understanding the Social Media Algorithm Trap
Spending time online can feel comforting: familiar pages, familiar feeds, familiar routines. But it can also be overwhelming. When you’re tired, lonely, overstimulated, bored, or simply curious, it’s easy to click on something without thinking much of it.
And that’s exactly how algorithms work their way in.
If you interact with harmful or upsetting content - even to challenge it - the algorithm doesn’t distinguish between “this is great” and “this is awful.” It only sees engagement.
For example: You might come across a Facebook post where someone uses the English flag aggressively to intimidate migrants or ethnic minorities. You comment to say you find it unsettling or unacceptable.
To the algorithm, this simply means: “You spend time on this type of content - here’s more.”
The emotional impact of that can creep up slowly. You might start noticing:
- more divisive posts
- more extreme opinions
- more anger in your feed
- more content that pushes you toward fear, hopelessness or outrage.
Over time, this can distort how the world feels, even changing what you believe is “normal” or “common” simply because the feed makes it look that way.
For Disabled people looking for community or support, this becomes even riskier. Instead of empowering content, you might be shown:
- pity-driven or inspiration porn narratives
- misinformation about disability
- harmful stereotypes
- influencers selling fake “cures”
- groups with extreme or damaging views.
No one chooses to fall into these spaces; the algorithm simply nudges you deeper every time you scroll.
Digital Barriers and Safety Risks
Disabled people often face additional obstacles online, such as:
- websites without alt text
- flashing or overstimulating content
- unreadable layouts
- unstable Wi-Fi
- lack of access to devices
- lower digital literacy due to exclusion, not ability.
These barriers make it harder to:
- spot scams
- navigate privacy settings
- report abuse
- filter harmful content
- recognise algorithmic manipulation.
When the starting point is already unequal, the risks increase.
Types of Online Harm
Unfortunately, these risks are not hypothetical:
Cyberbullying and harassment
People who openly identify as Disabled online often face targeted abuse, mockery, or invasive questioning.
Scams and fraud
Scammers prey on vulnerability: fake job offers, phishing messages, or “tech support” scams designed to steal money, identity, or access.
Data misuse
With AI tools collecting huge amounts of sensitive information, data privacy is becoming more complicated than ever.
Practical Safety Tips
We all know reducing screen time helps, but realistically, life is online now.
What matters is giving yourself the tools to stay in control.
Here are some ways to protect yourself:
Use strong, unique passwords and enable two-factor authentication.
Be cautious with links or messages from unfamiliar sources.
Adjust your privacy settings to limit who can see or contact you.
Use accessibility tools like screen readers, Easy Read formats, contrast settings, and browser plugins.
Avoid engaging with harmful content, even to argue; it teaches the algorithm to show you more.
Take regular breaks, especially if you feel emotionally drained, overstimulated, or “stuck” in a cycle of upsetting content.
It’s not about being perfect; it’s about being aware.
If You’ve Been Affected
If you’ve experienced online bullying, scams, or harassment, support is available.
Written by
Kimberly Myhill, Business Manager, Equal Lives


