
AI & Society

We're starting to sound like our algorithms

I watched a TED talk by linguist Adam Aleksic that got me thinking. His premise is interesting: we’re beginning to sound and think like the AI tools and algorithms we use every day. We’re caught in a feedback loop where we unconsciously adopt AI language patterns.

One example is the word “delve.” It’s now showing up more in everyday speech simply because AI overuses it. If you’ve noticed yourself or colleagues using words like “delve,” “crucial,” “landscape” or “foster” more often than you used to, that’s not a coincidence. Those are patterns absorbed from AI-generated text that has saturated professional communication.

Beyond vocabulary

This goes deeper than word choice. Aleksic points to how Spotify essentially willed the “hyperpop” genre into existence by labelling a cluster of listener behaviour. The algorithm didn’t discover a genre. It created one, and artists responded by making music to fit a category that didn’t organically exist before.

The same dynamic plays out in professional contexts. LinkedIn’s algorithm rewards certain patterns of writing. AI-generated content follows certain structural templates. People read that content, absorb the patterns and reproduce them. The feedback loop tightens.

The survivorship bias problem

Aleksic’s biggest takeaway is that these platforms aren’t neutral. They prioritise what’s provocative and profitable, not necessarily what’s real. If we’re not asking ourselves why we’re seeing or saying certain things, we risk viewing the world through a lens shaped entirely by algorithmic survivorship bias: we only ever encounter the content the algorithm let survive.

For anyone in a leadership position making decisions about AI adoption, this matters. The AI tools your teams use don’t just produce outputs. They shape how your people think about problems, communicate solutions and frame decisions. The influence is subtle enough that most people don’t notice it happening.

It’s a compelling reminder to stay intentional about the information and language we consume and reproduce, especially when the tools producing that language are becoming embedded in every part of how we work.