There’s a well-known verb declension in the online world.
I was reminded of it by Emily Bell’s excellent piece in Sunday’s Guardserver (what else can we call it on Sundays when the masthead says ‘Observer’ and the URL says ‘guardian’?).
It’s well worth reading in full, especially when she points out that:
Facebook, Instagram, Twitter, WhatsApp and what’s next are and will continue to be making editorial decisions on our behalf.
But I don’t necessarily agree with it all. Emily notes:
Accountability is not part of Silicon Valley’s culture. But surely as news moves beyond paper and publisher, it must become so. For a decade or more, news organisations have been obeisant to the power of corporate technology, nodding and genuflecting at the latest improbably impressive magic. But their editorial processes have something to offer technologists too.
But I think she’s wrong here – Silicon Valley is absolutely accountable, just not to the people media organisations generally consider themselves accountable to. Facebook cares about its shareholders and its advertisers, not about whether its users have an informed and balanced view of world events.
It’s also worth noting that Google, Facebook and Twitter all believe that the algorithm *is* exercising the judgment that was previously reserved for human editors. While they accept that it is not yet as nuanced, their goal is to build a tool so convincing that we cannot distinguish between Paul Dacre and Facebook – each will perfectly serve the interests of its owners by displaying a feed of material designed to reinforce prejudice, promote a particular political agenda and keep people coming back for more.
It’s Evgeny Morozov’s ‘solutionism’ again, the belief that processing power and good software can properly solve human-scale problems, or at least do a better job than the clearly imperfect wetware that runs the show at the moment.
It’s also – and here I show my age – the end point of the expert systems debate that has been running since at least the mid-’80s: trying to extract ‘knowledge’ from skilled humans in the form of data and rulesets that can be processed to deliver improved results at lower cost and without any messy human-to-human engagement.
In the end, the flesh will triumph because we are imperfect and emotional and driven by systems that we neither understand nor control, but we may be seeing the swing towards a new authoritarianism driven not by evil people but by the more successful adherents of Richard Barbrook’s ‘Californian Ideology’, captured in the algorithms that shape our screen-based lives.
I can certainly tell you that I, for one, am not ready to love Big Data – whatever horrors await me in Room 1100101.