Fifteen years ago, a new phrase entered the digital lexicon: Web 2.0. At the time it sounded futuristic and progressive, ushering in a series of developments that would shape the modern internet. Social media, podcasting, blogging and user-generated content emerged from the primordial information soup of the early web, which had been little more than a collection of globally accessible but ultimately one-way information portals.
Phrases like Web 2.0 will get you thrown out of a chatroom (if such a thing still exists) in 2019. But if it sounds old hat in the age of cloud computing, crowdfunding, virtual reality and 'artificial intelligence', don’t be fooled. We are still living in the same information era.
That's not to say that nothing has changed. How quickly did the excitable fanfare hailing the industrial possibilities of 'big data' evolve into the war cry of privacy protesters? The social web – touted as the last word in democratisation – is being blamed for generational mental health issues and divisive social patterns. And the plucky, 'disruptive' start-ups once celebrated for dismantling monopolies have metamorphosed into a set of seemingly unassailable industrial titans – the FAANGs (Facebook, Amazon, Apple, Netflix and Google).
It's a grim summary.
Predicting the future of technology is a fool's errand. But there is hope. The realisation that people do not change when the technology does should guide our progress. Is the ability to publish akin to knowing what will interest a reader? No. Can we trust everyone to have an equal understanding of balance when reporting facts? Clearly not. Is publishing about to have its 'vinyl moment' after years of weathering the digital storm? Very possibly.