Everyone Has a Story - But the Algorithm Locks Out Listening
The fact that you’re reading this suggests you care about people and what they have to say; you’re curious about the voices of real humans.
Right now, you’re losing that. Spend any time online and you’ll find the voices of real people have all but vanished.
They’ve been forced out by ‘the anger algorithm’: the set of formulae that optimise platforms for engagement, because engagement makes money. Content factories worldwide are making shedloads by using AI to create ‘engaging’ posts that make people angry enough to type a reply.
It’s horrible. Destructive. Poisonous.
And it’s all making me angry enough to do some serious thinking. I wonder: can we find a way back to hearing real people again?
In the 1980s, as a teenage trainee journalist, I was told: ‘Everyone has a good story to tell. It’s up to you to help them to tell it.’
That’s what we did in local newspapers. We listened to people and helped to tell their stories in print. Our loyal readers loved it, right down to the details of the wedding bouquets (I can still spell gypsophila). The stories we shared prompted conversations... which led to more stories...
When early social media emerged - CiX and Ecademy in the UK, then Facebook, LinkedIn and Twitter - they helped people tell their stories in their own voices, without a journalist as intermediary. But they stopped.
That degeneration is well documented elsewhere. What’s less understood is what we lost - and whether we can build it back.
What We Lost
Here’s what I mean by ‘what we lost.’
In Richmond, the Yorkshire market town where I now live, there’s an information gap. We don’t hear about the hyperlocal stuff in between ‘lost parcel’ and ‘tragic loss.’
The everyday texture - what happened when the football ground was flooded on match day - stays invisible. When people don’t know what’s happening, they imagine cliques. They feel excluded. So they don’t make the effort to join in... which leads to organisations struggling to find new members... which eventually leads to groups fizzling out.
This isn’t just a Richmond problem. It’s what happens everywhere when the algorithm locks out listening.
The Pattern
Here’s what I’ve realised after forty years: this isn’t new.
The principle my journalism lecturer taught me - ‘everyone has a story, help them tell it’ - is the same principle that powered the Enlightenment coffeehouses, where ideas spread through conversations. It’s what Clean Language does when it takes people’s exact words seriously instead of translating them. It’s what early social media promised before the algorithm took over.
The principle is simple: create conditions where people can be genuinely curious about each other.
Social psychologist Jonathan Haidt calls this ‘Discover mode’ - when you feel safe enough to be open, to learn, to be surprised. The opposite is ‘Defend mode’ - scanning for threats, preparing to attack or protect.
Discover mode requires specific conditions: physical or virtual spaces, time, safety, permission to be curious.
Defend mode just needs a trigger, and then outrage spreads contagiously.
The algorithm didn’t choose Defend mode because it’s evil. It chose it because curiosity doesn’t scale as profitably as outrage.
Building Small Spaces
So what can we do? We can’t fix the algorithm. But we can build small spaces with different rules.
For years, I’ve been seeking to create the conditions for ‘Discover mode’ conversations. Metaphorum, for example: the Clean Language community’s annual online unconference, running since 2016. Next, I’m building a hyperlocal way to use Bluesky for Richmond.
These are experiments. But then, so were the coffeehouses.
Everyone has a story. Whether we hear them depends on the conditions we create. For you, what needs to happen next?
