The new feudalism

#Evgeny Morozov, author of "The Net Delusion"

Evgeny Morozov was a guest at the ORF-DialogForum on March 3, 2017. The following text is a slightly abridged transcript of his keynote.

It’s a pleasure to be here and to be speaking to all of you in these interesting and somewhat challenging times, especially on a topic such as this one. Many of us are starting to rethink, especially now that Donald Trump is in charge of the White House. Suddenly even we in Europe are realizing that delegating so much power over data to American companies may not be such a good idea now that Obama is no longer in charge. We now have a very different president with a very different agenda.

I think ultimately the challenge many of us have had over the last decade in thinking about technology has been our inability to think about it historically and politically. Mostly we have been thinking about it philosophically. Every time you start a conversation about Google, Silicon Valley or digital platforms, the conversation quickly turns into a very shallow round of accusations. Whether you are a technophobe or a technophile, whether you hate technology and science or love them: ultimately this is not the correct path to take. The reason digital technologies and the big digital firms in Silicon Valley have become such controversial issues has very little to do with these abstract, philosophical discussions about the nature of our own attitudes towards technology, and everything to do with the nature of our current political and economic situation.

What do I mean by that? I’d like to argue that there is a very particular tendency in our society, especially among those who are in charge—political elites, business elites, cultural elites—, to think about digital technology, especially big technology firms coming from America, as being able to solve three big crises, three big contradictions that our societies have run into. Those crises are well known to every one of you: a crisis of democracy, a crisis of capitalism and a crisis of the welfare state.

Since we only have a limited amount of time today, there is no way for me to speak at length about all those crises, but if you have read the newspapers recently or listened to politicians you will have got an idea of what the nature of these promises is. There are very few politicians in Europe, even fewer in America, who do not celebrate the power of technology to create new jobs and give us more flexibility, the power of UBER and AirBnB to give us new ways of earning money, the power of many other services like Facebook to keep us in touch and thus make us part of a new global, cosmopolitan village. But it reaches far deeper than that. If you listen to the promises made in specific domains like education or health care, you hear expectations that, thanks to artificial intelligence for example, it will be possible to cure cancer, to come up with preventive schemes that tell patients what they should and should not be eating. Through constant monitoring it will be possible to fix many of those problems; it will be possible to finally do something about climate change, because sensors will finally allow us to monitor our resource use and thus optimize all of our energy consumption.

You can run through a list of every political and social problem facing us today, from the crisis of democracy to the crisis of the welfare state, and you will see there is a ready-made solution made by some big technology firm in Silicon Valley, able to resolve it very cheaply, and all that is required of us is to acquiesce to constant, permanent, ubiquitous collection of our data.

Why politicians love the deal
That’s the bargain that we strike. And the reason I emphasize things like the crisis of the welfare state so much is that those platforms, technologies and services are so attractive to us because they come for free. We do not have to pay for many of these services. Every time you use something very basic like e-mail (Google’s Gmail, Yahoo or Microsoft) or a search engine, all of this comes for free. Of course, once you start looking at what actually powers those platforms, you realize that they are not offered for free; they are offered heavily subsidized. And the subsidy comes from advertising.

What the big firms have understood is that there is quite a lot of money to be made by collecting our data, finding people who are eager to reach us with some advertising message, arranging that exchange and more or less offering the services that allow them to monitor what we do at a very large scale. So in the past it was just limited to search and e-mail, now it can be extended to virtually everything that we do, as long as it is mediated through a screen and some sensor (...).

There is also a very easily understandable reason why such a model appeals to policymakers who find themselves under conditions of utmost austerity: they have no money to spend on services, so naturally they reach out to these firms. It makes those policymakers look innovative and progressive, as if they are in touch with reality and doing everything to accelerate innovation.

The hidden side of the process
I think we need to understand that there is a hidden side to this process. It has to do with the very simplistic model I have just described, where we basically enter a bargain with these firms: we give them our data, they show us an ad and they offer us a service. That transaction is not as simple as it seems; it has a politics of its own behind it. Every time we use such a service and hand our data over to these firms, they don’t just discard that data, and they don’t just send it to the national security services in the United States (although many of them do just that); they do much more with it. They take this data and use it to train their own systems of artificial intelligence. It is actually us, the users of these platforms, who are teaching the likes of Google, Amazon and Facebook how to do things automatically. We are training Google’s cars how to navigate the city and become autonomous. It is us who are training Google’s models how to analyze patients’ health records in order to detect cancer. It is us who are training all of those systems of artificial intelligence through the consumption of the services that those companies offer.

You might say that this is not a big deal: it’s good, we are doing something good for the world, and these companies are acquiring this knowledge; ultimately, what is there to fear? I would argue that we have a very poor analysis of the power relations behind this exchange, because what is happening is that just five American firms and one Chinese firm, Baidu, have managed to create a system where, thanks to artificial intelligence, they end up being the only layer of society with access to the most precious resource, one that is going to reshape how society operates.

To me it seems quite obvious that within ten years’ time we will not have any truck drivers driving trucks. There is no reason to have actual truck drivers once you can have autonomous trucks, and we already have autonomous trucks. The reason they are not being released has mostly to do with the political repercussions of doing so right now, when there is so much discontent about inequality and automation. If you look at the number of people in those industries who stand to lose from automation, that number is immense. In the United States alone, truck driver happens to be the most common occupation in the country: there are something like 3.5 million truck drivers who might lose their jobs. And worse, they might be the ones contributing to the disappearance of those jobs, because they are the ones training the systems that will automate their jobs out of existence.

A bunch of firms will dictate our lives
That is an aspect of contemporary digital capitalism that we fail to understand. Once society reshapes and reengineers itself along that model, what we will end up with is a bunch of firms, five or six, who can dictate their terms to society and who are no longer tied to advertising as their primary model of making money. Ultimately the advertising bubble is going to crash one day or another. The moment a tiny part of the Chinese population stops buying German cars, a huge chunk of the online advertising market just disappears; it is that simple. The online advertising market is not very sustainable and it is very easy to disrupt, as many of you have seen if you have followed the debate about ad-blocking. All sorts of threats can come up and destroy the advertising market, or at least make it smaller or less reliable.

Big firms like Facebook have nothing to fear in this fight because they have developed this resource which they can use as they wish. They can charge us what they wish. So when our own health services, health ministries, insurance schemes run out of money—which is a very likely scenario given the aging population and the overall environment of austerity as well as the inability or unwillingness of the rich members of society to actually pay taxes because they can use tax havens—we end up in a situation where our own bureaucrats, our own civil servants have little choice but to turn to these firms, invite them in, sign a deal and have them process for example all of our health records.

This is not science fiction, this is already happening in the United Kingdom where the National Health Service, facing a giant fiscal crisis almost on a monthly basis, has actually invited Google to analyze the patient records of more than four million people in order to identify and detect the signs of early kidney disease. That is done exactly with this system of artificial intelligence Google has developed.

Of course, Google might be doing this particular project for free, to generate publicity and free client work, but there is absolutely no guarantee that it will be free five years from now, when all these services are converted into rent-extracting businesses that seek to set up monthly or yearly contracts with public institutions. The reason I believe we have to start thinking about all of these developments from the perspective of feudalism is very simple: we are ending up in a society where an important new resource, data, is emerging.

It’s about the data
Before, we had land, labor and money: those were the three important factors of production. If you wanted to build something you had to strike a balance of some sort between land, money and labor. You needed to hire somebody, you needed some capital, and you probably needed some land to grow or produce something. Nowadays, if you want to be competitive, you need access to a fourth factor of production: data. The entire transition from feudalism to capitalism was about finding mechanisms and ways to dismantle the institutionalized barriers that stood in the way of freer and easier access to the core factors of production. This is why we did not want feudal lords who controlled all the land and stopped any productive use of it outside their control. This is why we wanted workers to have strong rights, so that they could actually bargain with whoever employed them for a decent lifestyle. This is why we created all sorts of institutions that allow entrepreneurs to borrow money, develop some kind of creative relationship and invest that money into creating an enterprise, etc. (...)

We need to be able to understand how we can build a society where the most important, most precious resource that we have, in this case the data that gives rise to artificial intelligence, remains publicly accessible, so that all of us, whether we are entrepreneurs, cities or municipal communities, can actually come and do something with that data.

If we cannot reform how the system currently operates, we will end up in a situation where our lives are extremely precarious. We would not know whether Google would like to revoke access to certain data it holds about us or not. Most of that is not regulated by ordinary laws; it is regulated by private law, by terms and conditions most of us accept while scrolling through that window without ever reading them. We are moving into an environment where very little is permanent, and that is why it becomes very hard to plan. Just think about Amazon, which has now become the most powerful reseller of books: one day you might wake up, scroll through your e-reader and discover that one or two books are gone, because Amazon has decided that something is not right with how you have been reading them. This actually happened, ironically, with George Orwell’s 1984, which Amazon removed from the Kindles of many users because of an error.

This should make us realize immediately that when we transact with those firms we are not entering a simple purchase relationship; we are entering a service relationship, a contract. It is a contract that the other party can revoke whenever they want, given how those agreements currently operate. We are entering an environment where everything is offered as a service. What does it mean to access privacy as a service? It is very easy: you pay a monthly fee, 5 euros, 10 euros or 50 euros (if you want to be really private), and your information disappears from all of those databases that are currently haunting you. Maybe nasty Google results are no longer on the first page but on the tenth page, because there is a company that charges you a fee and knows how to clean up Google results. Maybe your data is no longer leaking from your phone, because that fee guarantees you access to an app that will protect you.

Services instead of rights
A lot of firms actually do offer privacy as a service, and a lot of privacy activists who have forgotten the language of politics and economics endorse it, because they think that the market combined with smart technologies can solve the problem. That is why I think many of our problems lie in the fact that we are no longer capable of distinguishing between privacy offered as a service and privacy as a right. When it is offered as a right, there is no pressure on us to enter a commercial service to get it. Ultimately the game of those firms is to offer everything as a service: private security as a service, insurance as a service. Everything that used to be offered through a framework of rights inscribed in the constitution, protected and guaranteed through the institutions of the welfare state and legal institutions, is now slowly being unraveled and offered in a highly privatized manner. Of course this will not be visible or evident to many of us, because it will not be called privatization; it will be called digitization, which invokes much more positive attitudes.

That requires us to rediscover the language of politics and economics, and to understand that whenever we speak about issues like privacy it is no longer enough to focus only on the legalistic or the technological side. We have to be able to draw the conclusion that the moment we surrender our data (even if that data is relatively well protected) and it ends up creating an artificial intelligence system that ultimately takes our job, and on top of that offers us as services things we used to take for granted as constitutionally guaranteed rights, it is no longer just a question of privacy; it is a problem of the economy. It is a problem of the very crisis that many of us are experiencing in other dimensions of our lives, a crisis that has to do with the conditions that democracy, the welfare state and capitalism are in. (...)

Right now that agenda is not anywhere on the table. No one is even thinking about it, while the most precious resource we have, one that will actually power how we live our lives in a world struggling to cope with drastic climate change, is being surrendered. I often use the example that we currently find ourselves in the position of Native Americans who discover they can make a very appealing deal, selling the valuable land they hold to European colonizers who come and offer some diamonds or pieces of gold, while retreating further and further into the heartland, until they discover they are surrounded by these corporate giants, there is nothing left to trade and all of their land is gone. I think we find ourselves in a similar situation with data, and in a similar situation with infrastructure. Unless we manage to articulate an agenda that goes beyond empty rhetoric like ‘they are taking our data’ or ‘they are taking our infrastructure’, an agenda actually linked to positive, radical actions to remake how society functions, how we pay for work, how we guarantee rights, how we organize to defend ourselves not just against feudalism in the digital sphere but against the encroaching feudalism elsewhere, I think we do not even have a chance to succeed.

I think the struggle is not lost yet, a lot can be done, but we need to be able to repoliticize the current debate and try to think a little bit harder about the economics and the politics and try to see through the rather positivistic, utopian language that those firms have surrounded themselves with. Unless we do that I think we will find ourselves in a very profound crisis, not just of politics and economics but also a very profound spiritual crisis.
