I Declare a Data Emergency. #DataEmergency

[Image: Cables on the Underground]

Because we need usable data to help us address our global crises.

We’ve had the climate emergency and the biodiversity emergency. We know that the planet is changing as a result of human activity over the past two centuries and that the resulting conditions are unlikely to be amenable to the continuation of human civilisation or – in extremis – our species, along with several million others.

The earth will be fine, of course. We’re not going to topple into the sun or kill the magnetic field or do anything that will stop our planet circling the sun without us for a million or billion years. It’s just the fragile biosphere that seems likely to change rather significantly for the worse, at least from our perspective.

In the space where I spend most of my time, the space in which we tend to assume a future in a hand-waving, automagical sort of way, where we speculate about the impact of low-latency ultrafast 5G networks and ML systems sophisticated enough to emulate certain features of biological intelligence, we have our own emergency, but so far we’ve been reticent about naming it or calling for unified action. Perhaps it’s time.

So I’ve decided to declare a data emergency, alerting us to a state of affairs that could end up doing significant damage to our chances of retaining a viable biosphere with enough species to sustain the food web we rely on, as well as turning our dreams of online utopia into a surveillance nightmare of social control, societal breakdown and individual misery.

Just as biodiversity is multi-faceted, so there are many elements to the data emergency.

I want to highlight two: the danger of not having the data we need to limit the impact of climate change and species loss; and the danger of allowing the abuse of data to distort the economy, politics, and our lived experience so that we end up not having the capacity to do anything about climate change and species loss.

The second gets a lot of attention already, and over the last five years we’ve seen the warnings about data monopolies, surveillance capitalism, poor data management, and personal data leakage turn into news stories, official inquiries, and apology after apology from companies whose entire business model relies on extractive data capitalism. Shoshana Zuboff (https://shoshanazuboff.com), Evgeny Morozov (https://www.evgenymorozov.com/), Lilian Edwards (https://www.lilianedwards.co.uk), Aral Balkan (https://www.ar.al) and many others have written eloquently about the reality and the future risks.

We know how data fuels the advertising-driven online economy. But as machine learning systems become ever more integral to the operation of late-stage capitalism, the data needed to fuel the neural networks – first training the models, then acting as the object of their singular attention – comes from each of us and the devices we caress, from the systems we interact with, from the sensors in our homes, streets and offices, and of course from the collections of historic data that have accumulated in databases across the world. We are increasingly dependent on access to data, but those who exploit it resist attempts to control or regulate that access.

It is clear that we need to do something about this. But just as the fossil economy cannot accept that its reserves are worthless – because if they were extracted and consumed they would tip the environment beyond the point at which the very businesses they are supposed to sustain would have neither customers nor profit – so the brokers of the data economy resist the reality that their way of working cannot be sustained.


The abuse of data in search of both profit and social control leads inexorably to refusal, regulation, or revolution, and neither the Californian ideology identified by Richard Barbrook and Andy Cameron (https://medium.com/@bruces/the-californian-ideology-by-richard-barbrook-and-andy-cameron-1995-c50014fcdbce) nor the Chinese model of state surveillance seems likely to endure.

The question is not whether but when, and how. Data reform is coming.

However, it is not enough to ensure that data cannot be used for evil, because there is good to be done with those same bits, and it needs to be delivered as quickly and effectively as possible. We need to understand the world well enough to slow, halt, or even undo the effects of the industrial revolution on earth’s climate, and the impact of our species – its rapid growth and its appetite for resources – on the biosphere. Unfortunately the information needed to guide future action in the right direction is too often locked in corporate databases (if it is held in a structured way at all), behind walls of contract and copyright and secrecy, so that it can be exclusively deployed in the pursuit of private benefit.

We need to unlock it. Not necessarily as open data but as usable data, under licences that make sense for all parties but have the wider global interest at their core. Not freely to all, but in ways that acknowledge the risk of sensitive data about critical infrastructure being abused by bad actors while providing a way for good actors to be accredited and authorised. Not as a static data dump but as a dynamic image of a changing and adapting world, a point cloud that can be used to image reality in ways that might allow us to take collective action to avert the coming disaster.

We know this needs to be done. From innovations like the environmental information broker AMEE (https://en.wikipedia.org/wiki/Avoiding_Mass_Extinctions_Engine) back in the early years of the century, through the work of Professor Andy Hopper at the Cambridge Computer Laboratory, whose research project Computing for the Future of the Planet (https://www.cl.cam.ac.uk/~acr31/pubs/hopper-engineeringchange.pdf) looked at how energy management, sensor networks and open source software could support work to reduce the impact of global warming, there have been many attempts to use data to inform our response to climate change.

Now Gavin Starks (www.dgen.net), founder of both AMEE and the Open Data Institute, has a new project called Icebreaker One, which seeks to bridge the data gaps, operate at a much larger scale, and “collaboratively create ways to publish both public and private information online that will enable people to find and use it for both public and private good.”

Their call to action says:

We declare a Data Emergency
We need usable data to help us address our global crises.
Icebreaker One is part of our Emergency Response.
We believe there is a global convergence of desires whereby the collective intelligence of our people and machines can be brought to bear to solve these converging crises of our time.

http://IcebreakerOne.org

and see https://icebreakerone.org/2019/05/13/discussion-paper/

We need initiatives like this to succeed if we’re to have any chance of avoiding a catastrophe. And we need to take all aspects of the data emergency seriously if we’re not to throw away the many benefits that could come from the data economy we have constructed over the last fifty years of networked computing.

Perhaps we can make some progress by the ARPANET’s half-century in October 2019?

[Image: ARPANET 50 badge]