The Singularity

Big Data, the Brave New World After Tim Berners-Lee

February 3, 2014

Karl Marx viewed the 1849 Gold Rush, which caused the largest land migration in America’s history, as being as important as the discovery of America itself. He was right, but in ways he could not have imagined. The vast majority of goldbugs found nothing except California, but they gave the world Hollywood and Silicon Valley. Motion pictures and TV gave seeming substance to the American Way, rewrote history and sold the American Dream as both aspiration and fact, to the US and to the rest of the world. Later, Silicon Valley gave substance to the digital economy, which now gives its masters and commanders the very real possibility of controlling the way we think, behave, work and consume.

Aldous Huxley, an Englishman who spent his latter years in California, predicted totalitarian consumerism and much else in Brave New World, set in the year 2540 (632 AF, ‘After Ford’). But his timeline was pessimistic. We are currently in or around 25 ATB-L (After Tim Berners-Lee), another Englishman. TB-L, then working at CERN in Switzerland – itself an incubator for Big Data – launched the World Wide Web on Christmas Day 1990. His intentions were, and are, honourable, uncommercial and altruistic. The browser concept afforded a window through which anyone with a computer could access appropriately tagged and marked-up information stored on other networked computers, anywhere. Yet within ten years we saw a new California Gold Rush, the Dotcom Bubble.

Most digital panhandlers lost their grubstakes in the early attempts to ‘monetise’ the web, but a few – notably Jeff Bezos of Amazon, Pierre Omidyar of eBay, and Sergey Brin and Larry Page of Google – showed that the real power and money resided in internet commerce and ‘intelligent’ search engines. Jimmy Wales, meanwhile, truer to the spirit of TB-L, created Wikipedia, an information exchange of immense power but with no intrinsic commercial value, except to those who increasingly use it for self-promotion. Internet service providers such as AOL made money, of course, and were joined by giants such as IBM and Microsoft’s Bill Gates, initially a web sceptic. Steve Jobs at Apple worked out how to charge money for music downloads, then made multimedia smartphones sexy, virtually invented the app industry and thus put networked computers into people’s pockets.

Then came social media. Facebook and Twitter, two of the ten ‘most valued’ internet stocks, were born in 2004 and 2006 respectively. Both enable ‘global communities’ to gossip, pretend they are friends with celebrities, share pictures of cats and be persuaded by increasingly pervasive, often subliminal advertising messages. Facebook’s revenues, for example, climbed to $2 billion in 2013.

Brave new media enables us to book tickets, navigate by GPS, read and watch the news, download movies, watch and broadcast programmes (including pornography) when and where we like, reference scholarly – or otherwise – sources, buy and sell stuff and transfer money. But it also creates a digital footprint which tells interested parties what we like, what we are thinking and what we have done, where and when. Leaving aside the argument that this enables a surveillance culture – it clearly does, but law-abiding people seem to consider it a price worth paying for the convenience afforded – the bigger question is who will benefit most from mining the yottabytes of coded data generated by human activity in this digital parallel universe?

In the 1849 Gold Rush, the big money was made by those who financed the dream and by those who sold food, liquor, shovels and clothing. Levi Strauss – the tailor, not the anthropologist – founded a clothing empire on indigo trousers for forty-niners, for example. The digital gold rush is powered by those casting the best algorithmic spells – principally Google and Amazon at this time, joined by behemoths such as China’s Alibaba and Weibo; Japan’s most popular sites, by contrast, are largely Western. Merchants able to pay the price can position their products and services on the first page of search rankings, which are in turn aggregated on comparison sites to distil further competitive advantage. The proposition is democratic in principle and Darwinian in practice: the more interest shown by the greatest number of searchers, the higher the ranking. However, the algorithms can create a self-fulfilling prophecy. Bigger may not necessarily be better, but in a world measured by volume of hits, it is.

Just as the World Wide Web was not created to serve commerce, and social media started out, allegedly, as a bit of fun, Big Data developed out of high-minded principles: to observe and analyse patterns in enormous waves of data, in fields such as particle physics at CERN, the human genome project, astronomy, meteorology and climate study. As with most great discoveries, its possibilities soon percolated down to commercial activities such as financial modelling and revealing patterns in what people buy at supermarket checkouts and online. But digital anthropology – mapping future behaviour according to patterns generated by ranking and trumpeting mass behaviour trends – carries the risk of an inherent pathetic fallacy. The medium becomes the message, and the voice of the crowd sings from an increasingly small repertoire on the same hymn sheet. This is already true in analogue commerce. Despite the illusion of democratic choice, a few large supermarket chains, financial providers, electronics, auto and entertainment groups control the global marketplace. In the digital world, real choice becomes less, not more.

Machine Learning, the endgame of Big Data, seeks to subordinate human decision-making to infallible logic, not emotion. This Brave New World is as close as 2040 (50 ATB-L). Who will benefit? The usual suspects, joined by the digital sorcerers of Big Data and their apprentices, the snake oil salesmen. That is, if the computers let them.

By John J Kelly and Richard Cross