Is it time for post-digital thinking again?
I had loads of great conversations in Melbourne a couple of weeks ago while evokeAG was on. One of them was about how I have been noticing changes in my own online habits and how I’m approaching information online these days.
The question I was asking is essentially this: in a sea of slop, how do you find the signal in the noise, and perhaps more importantly, how do you trust it?
The web feels different.
Search results are harder to trust than they used to be; PageRank is cooked. Articles often feel interchangeable because they were assembled from the same source material. The stochastic parrot is real. Social feeds are full of horrific content, and so on.
The phrase “AI slop”, popularised by writers such as Simon Willison, has emerged to describe the growing volume of low-quality content generated and distributed by AI systems across the internet. It has become common enough that Macquarie Dictionary highlighted “slop” in its 2024 word of the year selections, pointing directly to the flood of synthetic material appearing online.
The term captures something people increasingly feel but struggle to articulate: the sense that the information environment has become so noisy and algorithmically manipulated that it is approaching unusable.
This is not just an algorithmic problem. It also reflects a structural shift in the economics of information.
For most of the web’s history publishing involved a level of friction. Writing an article required time and effort. Even mediocre content carried a cost because a human being had to sit down and produce it. Generative AI removes that constraint almost entirely. The cost of producing text has collapsed and the volume of material that can now be generated is effectively unlimited. The cost of publishing to the web is effectively zero.
For the first time the internet is absorbing what happens when information production becomes close to infinite.
Interestingly, I think people are subtly changing how they discover and trust information. I have noticed it in my own habits and in conversations with others. There is slightly less reliance on search engines and platforms, a growing aversion to algorithms, and slightly more reliance on people. Newsletters written and curated by individuals with recognised expertise. Podcasts hosted by practitioners rather than publishers. Private Slack groups and WhatsApp chats where people share links and interpretations. RSS feeds, conversations at conferences and so on.
The open web is now frequently described as “background noise” and people are moving towards groups of trusted voices to help interpret what matters.
All this reminded me of something I was interested in years ago - the post-digital movement (https://en.wikipedia.org/wiki/Postdigital).
The idea of the “post-digital” began circulating in the early 2000s as digital technologies became ordinary infrastructure rather than exciting novelties. Once computers, the internet and mobile devices were everywhere, the interesting questions were no longer about the technology itself but about how people behaved around it.
Thinkers such as Nicholas Negroponte captured this shift with the suggestion that the digital age would truly arrive when being digital was noticed only in its absence. When technology stops being the centre of attention, the focus naturally moves toward the social systems forming around it.
Looking back, that observation feels particularly relevant now.
For the past two decades the dominant model for navigating information online has been algorithmic filtering. Search engines ranked pages, social feeds sorted content and recommendation systems suggested what to watch or read next. The underlying assumption was that algorithms could help people manage abundance.
But abundance has now tipped into something closer to overproduction. When the supply of content grows dramatically faster than the supply of attention, the problem changes. The challenge is no longer finding information but deciding what deserves trust.
Researchers have already begun examining how large volumes of synthetic content might affect the information ecosystem itself. Some have warned about scenarios where AI systems increasingly train on material generated by other AI systems, a phenomenon sometimes referred to as model collapse (https://arxiv.org/abs/2305.17493), where errors and distortions accumulate as models learn from synthetic data.
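The feedback loop behind model collapse is easy to caricature with a toy experiment that is much simpler than anything in the linked paper: fit a Gaussian to samples drawn from the previous fit, over and over. The distribution, sample size and generation count below are arbitrary illustrative choices, not the paper's setup; only the general mechanism (estimation error compounding when each generation trains on the last one's output) is the point.

```python
import random
import statistics

rng = random.Random(7)

def next_generation(mu, sigma, n=10):
    """Sample n points from the current model, then refit a Gaussian to them.
    The small sample size exaggerates the estimation error that compounds."""
    samples = [rng.gauss(mu, sigma) for _ in range(n)]
    return statistics.fmean(samples), statistics.stdev(samples)

mu, sigma = 0.0, 1.0   # generation 0: the "real" distribution
history = [sigma]
for _ in range(300):   # each generation learns only from the previous one's output
    mu, sigma = next_generation(mu, sigma)
    history.append(sigma)

print(f"sigma at generation 0:   {history[0]:.3f}")
print(f"sigma at generation 300: {history[-1]:.6f}")
```

Over the generations the fitted variance drifts towards zero, so later "models" capture less and less of the original distribution's diversity, which is exactly the accumulation of distortion the term describes.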
Even without that feedback loop, the lived experience for many people is straightforward. The web feels noisier.
And when environments become noisy, humans tend to fall back on something very old.
They turn to trusted intermediaries.
One way to think about the present moment is that the web may be reorganising itself around small knowledge networks. These are tightly connected communities where trust, reputation and shared context act as filters for information. The concept overlaps with ideas from network science such as the small-world network (https://en.wikipedia.org/wiki/Small-world_network), where clusters of closely connected individuals form highly efficient pathways for information flow.
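The small-world property is easy to see in a toy simulation. The Watts-Strogatz construction starts from a tightly clustered ring lattice and randomly rewires a small fraction of the links; those few shortcuts are enough to collapse the average number of hops between any two nodes. This is a rough stdlib-only sketch of that classic construction, not anything from the article above, and the parameters (200 nodes, 6 neighbours each, 10% rewiring) are arbitrary.

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Ring of n nodes, each linked to its k nearest neighbours (k even)."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for step in range(1, k // 2 + 1):
            j = (i + step) % n
            adj[i].add(j)
            adj[j].add(i)
    return adj

def rewire(adj, p, seed=1):
    """Watts-Strogatz style rewiring: move each edge with probability p."""
    rng = random.Random(seed)
    n = len(adj)
    for i in range(n):
        for j in list(adj[i]):       # snapshot, since we mutate adj[i]
            if j > i and rng.random() < p:
                choices = [x for x in range(n) if x != i and x not in adj[i]]
                if choices:
                    new = rng.choice(choices)
                    adj[i].remove(j)
                    adj[j].remove(i)
                    adj[i].add(new)
                    adj[new].add(i)
    return adj

def avg_path_length(adj):
    """Mean shortest-path length over all reachable node pairs (BFS per node)."""
    total = pairs = 0
    for start in adj:
        dist = {start: 0}
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

n, k = 200, 6
regular = ring_lattice(n, k)                     # high clustering, long paths
small_world = rewire(ring_lattice(n, k), p=0.1)  # same clusters, a few shortcuts

print(f"ring lattice: {avg_path_length(regular):.1f} hops on average")
print(f"small world:  {avg_path_length(small_world):.1f} hops on average")
```

The clusters stay mostly intact after rewiring, yet the average distance between nodes drops sharply, which is the intuition for why tight trust networks can still move information across an entire industry quickly.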
Instead of relying entirely on a giant global information layer, people increasingly rely on clusters of trust. AI does not disappear in this model. It still plays a role in summarising large volumes of material and synthesising ideas. But the authority increasingly sits with people rather than platforms.
In some ways this resembles how knowledge systems worked for most of history. Scholars relied on correspondence networks and institutions. Scientists organised themselves through journals and peer communities. Merchants depended on reputation and relationships.
The early internet briefly replaced many of these social structures with algorithmic discovery.
Now we may be seeing a partial return to something more human.
Agriculture provides an interesting lens through which to observe this dynamic because it has always operated through these kinds of knowledge networks. Farm knowledge rarely moves online first. It moves through grower groups, agronomists, consultants, field days and regional conversations.
These are essentially high-trust information environments.
Which may explain why industry podcasts, conferences and in-person discussions continue to matter so much. They remain places where signal can still cut through noise.
There is a paradox here. AI dramatically increases the amount of information available to us, but that expansion may actually make human judgement more valuable rather than less. When content becomes effectively infinite, the scarce resource becomes trust.
Which brings me back to the question I asked at evokeAG.
In a sea of slop, how does anything meaningful break through?
Perhaps the answer is not producing more content at all.
It may simply be understanding how to build the dirt-world personal networks that people trust enough to listen to. That, perhaps, is the culmination of a post-digital world. Lots to think about.