From smart machines transforming the nature of work, to an exploration of the failings of distributed capitalism, to a manifesto on how to preserve the human in an information civilization, Shoshana Zuboff's work is a thorough review of the social consequences of information technology.
Her book Master or Slave? The Fight for Freedom and Power in the Age of Surveillance Capitalism is to be released in November 2017.
Here's the second part of our conversation about the rise of big data and the personal information we relinquish under the new digital regime.
To read the first part of our chat, look here.
Your recent work focuses on the concept of surveillance capitalism. What is this?
Surveillance capitalism renders human experience as behavior that can be parsed into observable, measurable units. Once it is rendered as behavior, it is turned into data, and that is the data I call 'behavioral surplus.'
These data are subjected to sophisticated analyses to manufacture 'prediction products,' and are then sold into what I call new 'markets in future behavior.' Huge revenues flow from these new markets.
The logic of this new capitalism is that we are raw material resources. We are not the 'ends' of the process but rather the 'means' of the revenue process. It is 'about' us, but not 'for' us.
We’re creating a world of no escape, and this world right now has very little law in it. If you live in Europe, there is more law but not enough. In the US we have very little law. The exploitation of lawless space has been critical to surveillance capitalism’s success.
Where did surveillance capitalism originate?
Surveillance capitalism was discovered at Google in 2001 and quietly, stealthily elaborated there over the following years, until the company went public in 2004.
From the time of the discovery of behavioral surplus and the development of its foundational mechanisms in 2001-2002, to the time of the IPO in 2004, Google’s revenue grew by nearly 4000%.
Everybody was looking for that key formula that was going to translate investment into capital, and Google found it first.
Its mechanisms are designed to be unilateral and to operate outside the awareness of the data subject: surreptitious, surrounded by deceit, obfuscation, and secrecy, lest anyone grasp what was being done.
By the time people had begun to grasp it, it had become so successful that anybody who really wanted to make money felt they had to go that way.
This is how the earlier dream of digital capitalism died.
Designers once imagined an 'aware home' with a simple closed loop between its systems and the homeowner, who would use the data for their own benefit and at their own discretion.
The NEST contract says 'if you don’t agree to our surveillance terms, then your thermostat can stop working at any time and we will stop upgrading the device and we have no responsibility for making it work, so you have to sign it.' Then your data goes to dozens or hundreds of third parties, and NEST says it is your problem to read THEIR privacy policies, and so on in an infinite regress.
And that’s one thermostat — multiply that by all the devices they want to put in your home.
What’s the independent variable? It’s not technology. It’s capitalism.
You’re talking about a system that’s crying out for reform — how can you remain optimistic?
I like to think that I'm optimistic because I read history. So, I do take the longer view.
When you go back to the great theorists of capitalism, every one of them acknowledges that capitalism can be hijacked by elite, self-interested forces and driven off a sane path, away from the unity of supply and demand that once guaranteed its fundamental reciprocity with its populations.
I think that’s where we are now.
But as a society we have experience in successfully taming a rapacious capitalism and tethering it to the needs and values of our society.
It took several decades, but we did develop robust ways of integrating industrial capitalism into our society in a productive way, so that it was relatively reciprocal with the requirements of democracy and social well-being. (By this I mean the legitimation and protection of collective bargaining, and legislation on working conditions, hours, child labor, and so on.)
It is my belief that because we've done it before, we can do it again. The mountain we have to climb is public opinion and public awareness.
Capitalism has been driven away from itself and away from its healthy evolutionary line, into a kind of toothed-bird dead end. It's up to us to pull it back to serving our genuine needs.
We’ve done it before and we can do it again.
What one policy initiative, one regulation, one piece of legislation would you prescribe to remedy this exploitation?
We have a lot of work to do. We have public opinion and public education to undertake. We have to build up precedents in the courts with various legal cases — and then finally we’ll get to legislation.
But behavioral surplus — their ability to render our action as behavior, turn it into surplus purely for this process of creating prediction products that are sold into these parallel markets — right there is the key, that’s the crunch point.
Making it illegal, making it onerous to exercise that specific mechanism would be an existential threat to surveillance capitalism.
We need to create the conditions in which information capitalism is forced back to the consumer as the one locus for the investment of data in the improvement of the experience or the product. Bring those loops back to the ownership of the humans from whom the data is derived.
The surveillance capitalists have claimed for themselves the right to translate our actions into behavior and then convert them to data to create predictions that they then sell to make their money.
That is a self-authorizing right. They gave that right to themselves. They claimed it because there was nobody to stop them. Nobody could see what they were doing, nobody knew what they were doing, and there was no law to stop them.
So that’s where we have to intervene. They do not have the right to claim our reality and our actions for their own revenue streams.