Rate of Change

The last few months have brought what feels like an unprecedented rate of change. I’m not entirely sure whether change itself has accelerated, or whether my awareness of it has.

Changes to X seem to have burst the closed-content bubble. I’m seeing far more things I would not have chosen to look for, or was not previously paying attention to, and the volume of that exposure is off the charts.

What’s strange is how effective that is. Even if the content on X is different from what it used to be, the brains of long-term users have already been trained to absorb information in a very specific format. When the content changes but the format does not, there is almost no context-switching cost.

A fancier writer would probably cite neuroplasticity, cognitive schemas, and learning acceleration. The simpler version is that standardization increases the rate at which we can learn. Changing formats is expensive. The brain has to reconfigure itself before it can absorb anything.

Computers are not so different. LLMs are not so different either. Deviate far enough from a shared schema or standard and you might as well be on an island, computationally, socially, financially, or otherwise.

I was playing around with isitagentready.com and orank.ai this weekend, and that pushed this into focus for me. Introducing new language or a new schema to non-curious people, or to people with very little time, is high risk. The same is true for LLMs, which ruthlessly optimize for speed and pattern recognition.

The language we use to educate computers about what is on a website, or inside a product, feels like it is changing meaningfully. For a long time, we have mostly been limited to SEO, Open Graph, and relatively minor schema updates. That no longer feels sufficient.
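To make that concrete, here is a minimal sketch of one of those existing layers: pulling the Open Graph tags a page exposes, using only the Python standard library. The URL is a placeholder, and real pages often need more forgiving parsing than this.

```python
# A minimal sketch: collect the Open Graph metadata a page exposes.
# Open Graph tags are real; the URL below is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen


class OpenGraphParser(HTMLParser):
    """Collects <meta property="og:*" content="..."> tags."""

    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("og:") and "content" in attrs:
            self.tags[prop] = attrs["content"]


html = urlopen("https://example.com").read().decode("utf-8", errors="replace")
parser = OpenGraphParser()
parser.feed(html)
print(parser.tags)  # e.g. {"og:title": "...", "og:description": "..."}
```

The point isn’t the parsing. It’s that a handful of standardized keys is still most of what a machine gets to know about a page, and that ceiling is what feels too low now.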

New ideas spread faster when they arrive in familiar formats.
