Transcript for:
Understanding Surveillance Capitalism and Its Impacts

Harvard professor Shoshana Zuboff is sometimes called the Karl Marx of our time. Her monumental book, The Age of Surveillance Capitalism, exposed the dubious mechanisms of our digital economy. According to Zuboff, our personal and private experiences have been hijacked by Silicon Valley and used as the raw material for extremely profitable prediction products.

The term surveillance capitalism is not an arbitrary term. Why surveillance? Because these must be operations that are engineered to be undetectable, indecipherable, cloaked in rhetoric that aims to misdirect, obfuscate and just downright bamboozle all of us, all the time.

Connecting them. Don't be evil. Connecting them. Humanity. Don't be evil. Humanity, humanity. It's about empowerment of the individual. The future is private.

What really happens with your Facebook photos? Why are there hidden microphones in Google Nest? And what is the hidden game behind Pokémon Go? In this episode of Backlight, Shoshana Zuboff reveals just how well Silicon Valley deceives us.

All this comes into play when I buy a pair of shoes on the internet.

Well, you know, some people will say to me, but Professor Zuboff, I like those targeted ads, they're so useful. Or, I enjoy personalized services. Sometimes people will say, you know, I have nothing to hide, so I don't care what they take. Each one of these statements is a profound misconception of what's really going on. We think that the only personal information they have about us is what we've given them. And we think we can exert some control over what we give them. And therefore we think that our calculation, our trade-off here, is something that is somehow under our control, that we understand. What's really happening is that we provide personal information, but the information that we provide is the least important part of the information that they collect about us.

Thanks to their navigation and search engine, Google knows where we are all the time and what we think. Facebook knows our hobbies, preferences and friends, because they retrieve a lot of information from the digital traces we leave behind unwittingly. Spelling errors in your search terms, which color buttons you prefer, how fast you type, how fast you drive. Residual data.

Way back at the beginning, back in the year 2000, 2001, 2002, back in those days, these data were considered just extra data. They were considered waste material, and people called them things like digital exhaust or data exhaust. Eventually it was understood that these so-called waste materials harbored rich predictive data.

The search information we retain, we retain for quality purposes. So, for example, the Google spell checker, our "did you mean" feature that appears on Google, has been built using long periods of data around someone issuing a query and then issuing another, corrected query right after that, and us learning those corrections. And it actually takes more than 30 days' worth of data to build the world-class spell corrector that we have.

The companies like to say, we collect data so that we can improve our service. And that's true: they collect data, and some of it is used to improve the service to you. But even more of it is analyzed to train what they call models, patterns of human behavior. So once I have big training models, I can see how people with these characteristics typically behave over time, and that allows me to fit your data right into that arc and to predict what you're likely to do, not only now but soon and later. This is what I call behavioral surplus.
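[Editor's note: to make the spell-corrector idea described above concrete, here is a minimal illustrative sketch in Python of how consecutive query pairs from a search log might be mined into a simple "did you mean" table. The log format, the 60-second window, and the count and similarity thresholds are assumptions made for illustration only, not a description of Google's actual system.]

```python
# Illustrative sketch only: a toy "did you mean" builder based on the idea of
# learning spelling corrections from pairs of consecutive queries in a search
# log. Log format, time window, and thresholds are assumptions for illustration.
from collections import Counter, defaultdict
from difflib import SequenceMatcher


def mine_corrections(log, window_seconds=60, min_count=3, min_similarity=0.8):
    """log: iterable of (user_id, timestamp, query) tuples, time-ordered per user."""
    pair_counts = defaultdict(Counter)
    by_user = defaultdict(list)
    for user, ts, query in log:
        by_user[user].append((ts, query.lower().strip()))

    for events in by_user.values():
        for (t1, q1), (t2, q2) in zip(events, events[1:]):
            # A query quickly followed by a similar-but-different query is
            # treated as the user correcting their own spelling.
            if 0 < t2 - t1 <= window_seconds and q1 != q2:
                if SequenceMatcher(None, q1, q2).ratio() >= min_similarity:
                    pair_counts[q1][q2] += 1

    # Keep only corrections observed often enough to be trusted.
    return {
        query: counts.most_common(1)[0][0]
        for query, counts in pair_counts.items()
        if counts.most_common(1)[0][1] >= min_count
    }


# Example use: suggestions = mine_corrections(search_log); suggestions.get("recieve")
```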
These data streams are filled with these rich predictive data. Why surplus? Because right from the start these were more data than was required to improve products and services. Once you have the behavioral surplus, the comprehensive behavioral data of hundreds of millions of people, you can start predicting the preferences of specific groups. Think popular shoes for male managers, or the preferred restaurant of a group of people sharing the same zip code.

Maybe they make a prediction about where I'll have dinner tonight. How can I imagine what they will predict about me?

Well, you know, at the simplest level, they may predict the kind of food you're in the mood for right now and then sell that prediction, auction that prediction to their business customers in the restaurant business, who will then send you a very quick ad. We know you're in the mood for a delicious pasta dish tonight. We can invite you to our restaurant, and here's a discount coupon.

There are some people who are saying, well, it's very improbable that those targeted ads really can achieve something, because people have their own will and they will not buy some shoes just because there is an ad in front of them.

You know, I think one of the misconceptions that's really important for us to move away from is that surveillance capitalism is something that is only manifest in our lives when we're online, or somehow it's only restricted to online targeted advertising. It's easy for us to say, oh, these things don't affect me. The fact is this is being conducted at a layer that is not accessible to us.

We have no idea what today's algorithms can predict about us, or what behavioral data they use to do it. A simple thing like buying a certain kind of shampoo can divulge essential information about us. For example, the New York Times reported a case of a supermarket chain that knew a girl was pregnant even before she did, or before she was prepared to share the news. The supermarket's algorithms discovered that the girl had switched from fragrant shampoos to more neutral-smelling products. Since pregnant women's sense of smell becomes stronger, the algorithm assumed this girl must be pregnant. Her father didn't know, until he was repeatedly sent special offers for baby products. Thanks to the analysis of trillions of terabytes of behavioral data that we unwittingly leave around the digital domain, big tech sometimes knows us better than we know ourselves. They can predict things like our personality, our emotions, our sexual orientation, our political orientation, a whole range of things that we never ever intended to disclose. The predictive value that big tech can glean from residual data is huge. The family photos we post on our Facebook pages contain residual data from which vast amounts of valuable knowledge can be distilled.

Let's say I put my children's birthday party photo album on the web, on my Facebook pages. What we don't understand is that the most important thing there is not the photos per se. It's the predictive signals that these companies can lift from the photos. It's not just my face, but it's allowing them to have the face so that they can analyze the hundreds of muscles in my face.

Uploading innocent snapshots on your Facebook page can have unforeseen consequences. Our faces, for example, are used to train algorithms to recognize facial features. And we have absolutely no idea what that facial recognition software is used for.
These data streams with these rich predictive signals are fed into the new factories, the computational factories, analyzed for predictions of human behavior, and these predictions are then sold. Who are they sold to? They're not sold to us. We are not the customers. They're sold to businesses, to business customers, who want to maximize our value to their business, whatever it may be. They use information from our faces. We've given billions and billions of photos to Facebook to train models for facial recognition. Those models are then sold to military operations, some of them in China. And those Chinese operations do many things, including imprisoning the Uyghurs, a Muslim minority population in China, in what is rightly regarded as an open-air prison, where they actually don't have to have people behind bars, because they track and follow them constantly through facial recognition.

The knowledge obtained from our residual data can be sold to anyone. Facial recognition software, for instance, might be sold to a Chinese company that supports the oppression of the Uyghurs in China, or that helps track down advocates of democracy in Hong Kong. That way our precious family photos might be used by Facebook to facilitate authoritarian regimes. And our privacy is guaranteed, because it's not our faces that are sold, but the residual data scraped off them.

It's very difficult to have a concept of this, for a very good reason. It's not because we're stupid. It's because these processes have been disguised. They operate in stealth. They have been engineered to be indecipherable, to be undetectable, to create ignorance in a vast group of all of us that they call users. Our ignorance is their bliss. There are some things that have broken through into public view that we do know about. So let's talk about Facebook's massive-scale contagion experiments. This is where Facebook experimented with subliminal cues planted in its Facebook pages that would actually influence offline, real-world behavior and emotions, to see if they could make people feel happier or sadder using subliminal cues, with language manipulation and word manipulation and so on. Well, when the experimenters wrote up this work in the very prestigious scholarly journals that published the results of these experiments, they emphasized two key findings. Number one, we now know that we can manipulate subliminal cues in the online context to change real-world behavior or real-world emotion. We know that we can be successful at doing that. Number two, we can exercise this power and these methods while bypassing user awareness.

In Leeuwarden, a large group of Pokémon Go players has arranged to meet. Together they walk through the city to catch as many Pokémon as possible. In the Netherlands alone, more than 1.3 million people play the game.

A very interesting experiment occurs in the guise of an augmented reality game called Pokémon Go. In the game you walk around the real world. Pokémon are hidden in all kinds of places, so you have to go there to collect them. What are we not seeing when Pokémon Go is introduced in our country?

It's important to understand that Pokémon Go is an augmented reality game that was incubated and developed inside Google for many years, Google being the first pioneer of surveillance capitalism, the inventor of surveillance capitalism. Pokémon Go was invented in Google, incubated and developed there for many years, led by a man named John Hanke.
And John Hanke had founded an operation called Keyhole, which was invested in by the CIA and later purchased by Google and renamed Google Earth.

Google Earth was a CIA startup initially?

Yes, Google Earth was something called Keyhole, and it was a CIA-invested startup. So it's important to understand that Pokémon Go was not some happy little game that just got launched into the world by a toy company or something. When they decided to bring Pokémon Go to the public, they didn't want to bring it out as a Google game. They brought it to market as Niantic Labs, which no one had ever heard of. Just a cool startup with this cool game. So now we have this Google augmented reality game, and it turns out that the big game that sits on top of the little game that the children are playing is a game that precisely emulates the logic of surveillance capitalism. So in surveillance capitalism, in the original version online, we predict the click-through rate, and we sell the click-through rate to the advertiser, who pays to get clicks on their website, clicking through ultimately to the buy button. That's what they're hoping for. Now, in the real world, business customers paid Niantic Labs, the Pokémon Go company, not for click-through but for the real-world equivalent of click-through, which is called footfall: to actually get real bodies with their real feet into real business establishments, so that their feet would go fall, fall, fall, fall, tap, tap, tap across the store or across the restaurant or across the bar in order to buy something.

So I could order a Pokémon from Niantic Labs for my ice cream parlor, for instance?

All of these establishments, they're buying what are called, here's the term, lure modules, features like a gym, that lure people to you. Not so that they will come and be happy, but so they will come and spend money in your shop or your restaurant. Starbucks, McDonald's, everybody was making money. Everybody was making money. Niantic Labs was making money, and all these businesses were making money. And the people playing the game had no idea. So they used the rewards and punishments of the game to herd you through the city to the places that were paying for your body. This was the game of Pokémon Go. This was the real game. The shadow game. Getting you into a place where we have predicted that you will be, so that our predictions are worth more. If I can guarantee you're actually going to be there, my prediction that you're going to be there is worth a lot more. Economies of action are how I guarantee that. And Pokémon Go was a large-scale experiment, a global-scale experiment, in economies of action: using remote-control means to automate behavior, to engineer behavior, to fulfill others' commercial ends, while you are having a great time. You are intended to be immersed in the feeling of being served, to be saturated with convenience, so that you will not notice and you will not complain. And all of this shadow operation will remain hidden, because you will not ask questions, because you're so busy being entertained.

So it's no longer enough to just have what you're doing online, your browsing, your messages, your emails. We want to know what you're doing. We want to know about your walk in the park. We want to know what you're doing in your car. We want to know about your home and what you're doing in your home.

What if home security was different? What if it looked different?

Well, it's a beautiful system.
Some bright engineering-oriented person discovered that the Nest security system has a microphone built into it. That microphone does not figure in any schematic. When you buy the security system and it has the piece of paper that you unfold with the schematic, or you go online to learn about it and there's a schematic, it does not show a microphone. It does not discuss a microphone. Now, why would you have a microphone there? Well, remember, what is our business? Our business depends upon the extraction of behavioral surplus: scale, scope, and action. So what better device to extract behavioral surplus, especially new forms: voices, conversations, what you're watching on television, what music you're listening to, who's coming in and out of your house, whether or not you're shouting at each other over the breakfast table. All of this has tremendous predictive value. Voices are what everybody's after, just like they're after faces.

So now this becomes public. There's a, hey, Google, what's up with this microphone in your security system? What does Google say? Oops, so sorry. Oh, we didn't even know there was a microphone there. Oh, sorry, sorry. Why, somebody put that microphone there, but we never intended to use it. This is their business: to obfuscate, to misdirect, to engineer our ignorance with mechanisms and methods that are undetectable and indecipherable. And if they confront you, deny it. Deny it for as long as possible, until they habituate. If there's some element that fails to get habituated, then create an adaptation. All right, we'll make sure all those microphones are somehow shut off so that they can't be used. And then, wait a minute, when no one is looking, redirect it, so there will be a microphone in something else. There will be a microphone in the home device or in the music player or whatever it might be. It will be back.

What if the measure of it working was that you never had to think about it? Alright guys, here we go. Daddy!

There are two legal scholars at the University of London, and they analyzed the privacy policies, how we give our consent to these devices, right? So they analyzed these documents for one Nest thermostat. What happens is the thermostat collects data. It sends those data to third parties, and those third parties send data to third parties, ad infinitum, an infinite regress, where your data is going goodness knows where. And no company takes responsibility for what the third parties that it's sending your data to may do with your data. Nest will say, if you don't want us to take your data, and you don't want us to send it on to third parties, that's okay. But be aware that without your data, we will stop supporting the functionality of your thermostat. We will stop upgrading the software. Be aware that the smoke detector may no longer work. Be aware that the pipes in your home may freeze. So now the functionality of the device is held hostage to your agreeing to the privacy contract. And they say, by the way, even if you agree and we maintain the functionality, we're sending your data to these other third parties, and they're going to use it the way they choose, and we take no responsibility for what they do with it. All right. So now, these two scholars do an analysis of one Nest thermostat, and what they conclude is that given these arrangements, any self-respecting consumer, anyone who's even a little bit vigilant about their consumption habits, should review a minimum of 1,000 privacy contracts in order to install just one single Nest thermostat in your home.
You have three minutes to exit. What if it gave you time? And what you really need from home security? A sense of security.

So they start extracting data at scale. First it's everything you do online, and then pretty soon they need more data, but then they also discover that they need varieties of data. It's not just volume, it's different qualities of data.

The problem in mobility is exactly the same as the problem we experience on the internet with your smartphone: you pay with your privacy without knowing it. Because the car knows exactly what you do and where you go. There are 15 cameras on a modern car. If you have access to 1% of all cars, you know everything that happens in the world. If the data is worth more than the cost of driving a car, then who knows, you'll get a free car. And you pay with your privacy. Just like with all the free services you get from Google, it might also turn out to be worth it to give you the car.

Well, surveillance capitalism broke through in Silicon Valley in 2002, 2003, and 2004. It changed the bar for investment. Now you had Google making money on the basis of what I call surveillance capitalism. Now, what venture capitalist wants to invest in a firm that's, you know, just making an app, when it can invest in a firm that's making an app plus a surveillance dividend? So what happened right away was that the investment started flowing to the people who were making more money, and that more came from the surveillance dividend. And now that is the structure that is flowing across the whole economy. Why bother to put all that effort into the engineers and the factories and the science to make a car that runs without carbon, when all we need to do is sell the data from the darn vehicles, and we've got the surveillance dividend, and all those investors are going to come to us and give us the kind of market capitalisation that the leading surveillance capitalists have. That's our new lease on life, that's our path forward in the 21st century.

Everyone is after that stream of data. That's why Google wants automotive. There's not that much to be earned with the cars themselves.

Instead of figuring out how to make a car that runs carbon-free, what we're going to do is repurpose our automobiles as surveillance vehicles, and we're going to stream data from the 100,000 people who are driving around in Ford cars. We're going to combine that with the data we have from Ford Credit, where, he says, we already know everything about you. And these data sets are going to put us on a par with the likes of the great surveillance capitalists. Who wouldn't want to invest in Ford Motor under those circumstances?

The primary economy, what we perceive as the primary economy, building a nice car, growing some nice food, building a house, means zilch in this surveillance economy.

So this began with Android. Once Google acquired the ability to make the Android phone, there were a lot of people in Google who said, oh, this is fantastic. Now we have a phone like Apple's. And we can sell a phone and we can make a great margin on that phone. And that margin is going to fund our profits, and this is going to make us really rich. But the wiser heads in Google prevailed. The people who already understood surveillance capitalism prevailed. And they said just the opposite. We want the phone and everything associated with the phone to be as cheap as possible.
In fact, if we can get the price to zero, that's even better. We want everybody to have a phone. Because the more they have the phone, and the more time they spend on the phone, the more data we get from them, and the more behavioral surplus streams into our supply chains. So if we can give it away, we're going to give it away.

Google's free mobile operating system, Android, means Google holds the key to almost 90% of the world's smartphones. In order to obtain as much data as possible from all these cell phones, Google experimented with network balloons in those parts of the world where mobile internet is not available. Facebook, not to be outdone, flew network drones over growing markets and offered free internet in combination with the Facebook app.

And connecting them represents one of the greatest opportunities available to humanity today.

Hi, Mark. Why are you showing so much interest in India? Answer honestly.

Our mission is to give everyone in the world the power to share what's important to them and to connect every person in the world.

Aware of the potential dangers of American data robbery, India politely declined the offer and made do without Zuckerberg's generosity. In other parts of the world, Facebook's true intentions are becoming clearer every day. But we need whistleblowers for that. Facebook is happy to take our data, but not prepared to share information on how the company works.

So this is a document written by Facebook executives in Australia, and what they told their business customers there was that we have so much data about some six million Australian young adults and teenagers. As a result of that, we can predict mood shifts. We can predict when they feel stressed, fatigued, anxious, inferior, frightened, all of these kinds of very personal feelings. And we can alert you at the exact moment when they are most likely to need a confidence boost. Let's say there's a young person who's contemplating a date over the weekend, and it's now Thursday night and their anxiety is peaking, and they need a confidence boost. If you send them an ad for a sexy black leather jacket, send it right now. Offer free delivery. Tell them you'll have it at their door by the time they wake up in the morning. Give them a discount coupon, right? You're going to sell that black leather jacket. Do it now, and we can tell you the exact moment when they are at peak vulnerability. That is real. That's happening. And if you don't believe me, all you have to do is just pivot a few degrees, to something that happened in 2018. And what is that? It came from yet another whistleblower. His name was Christopher Wylie, and he told us about Cambridge Analytica.

Christopher Wylie, a former employee of the British company Cambridge Analytica, sounded the alarm about the methods used by this political marketing business. Cambridge Analytica used the Facebook data of more than 80 million Americans to analyse the best ways of manipulating American voters.

One of the things that Chris Wylie said when he broke this story with The Guardian back in 2018 was: we knew so much about so many individuals that we could understand their inner demons, and we could figure out how to target those demons, how to target their fear, how to target their anger, how to target their paranoia. And with those targets, we could trigger those emotions.
And by triggering those emotions, we could then manipulate them into clicking on a website, joining a group, telling them what kind of things to read, telling them what kind of people to hang out with, even telling them who to vote for. Now, that is absolutely no different than what Facebook aimed to do with these young people, innocent young people in Australia and New Zealand: to target their inner demons. The same mechanisms, the same methods, only pivoted just a few degrees from commercial outcomes to political outcomes. Cambridge Analytica was nothing but a parasite on a huge host. And that host is surveillance capitalism.

Hi Ella, hope you're settling in okay. You know how I always say, everything you need is already within you. The truth is, there's only so much you can do by yourself. Welcome to the office. We all need people to get where we're going. Ask yourself who you're going to do it with.

Welcome to F8. Today, we are going to talk about building a privacy-focused social platform. So that's why I believe that the future is private.

So, we are all good then, Professor Zuboff?

Well, in June 2019, which happened to be just a couple of months after that grand announcement that the future is private, there was a very interesting court case being tried before a California judge. This court case was a class action suit brought by individuals who were demanding from Facebook the data that Facebook had made available to Cambridge Analytica, to see if they had been targeted and manipulated by Cambridge Analytica. And this had come to a head in a California court. The Facebook counsel who argued this case stood in front of the judge and said that any Facebook user gets on the platform and typically is sharing information with 100 or more people in their network. Once you've done that, you have no expectation of privacy that would make you legitimately able to claim any kind of privacy right in a suit like this or any other kind of privacy-oriented suit. And so here we have, again, the public operation and the shadow operation. What's happening backstage is a lawyer standing in front of a judge saying, no Facebook user has any legitimate expectation of privacy.

This is the next chapter for our services.

Do you have some Google products you use, if I may ask?

No.

How do you survive?

Perfectly well. If there is an occasion where I feel the absolute need to use Google, I go through various layers of encryption and location reconfiguration before I go to Google search, just on principle.

There is a way to escape the omnipresent eyes of Google and Facebook: you can use a VPN and send all your data traffic through a secure server in another country. But big tech won't lose any sleep over individuals resisting the surveillance capitalists.

I don't want anyone to get confused that if you forgo Google search you are somehow, you know, fighting the fight. Because this is a collective problem, and it requires collective action.

With us Amish, we are dependent on public transportation for any distance. And local travel, we do it in our slow way. And I think the horse and buggy slows us down and really helps the overall value that we're trying to create and preserve, and not get carried away. A regular computer has your safety walls, you know, where you can't get on porn sites and things like that. Now, with the lockdowns that we have, this one is totally locked down.

And with this computer? No, no, absolutely not.
It will not play movies or music or anything like that. It's not a computer that was stripped down. It was built from the ground up like this.

Especially for the Amish?

Yes.

Would you mind showing me your mobile phone?

Okay, yes. The only thing you can do with it is call. You cannot text or do anything like that. It's just a portable phone. So this is where it stays. But as far as taking it home, no, I don't. I keep it right here. That's for the business.

The technology is progressing so fast it's hard to deal with. We have to teach what the moral impact is of being involved in this, so each individual can make better choices for themselves. We use technology as long as we are the ones using it and it doesn't get to the point where it uses us and controls us. That's the bottom line of it.

It's kind of strange: these people, of whom you could say they are living in the past, maybe they're actually living in the future.

It was just a minute ago that we didn't have many of these tools, and we were fine. We lived rich and full lives. We had close connections, friends and family. So having said that, I want to recognize that there's a lot that the digital brings to our lives. And we deserve to have all of that. But we deserve to have it without paying the price of surveillance capitalism. And right now we are in that kind of classic Faustian bargain. 21st-century citizens should not have to make the choice of either essentially, you know, going analog, or living in a world where our self-determination and our privacy are destroyed for the sake of this market logic. That is unacceptable. And let's also not be naive. You get the wrong people in charge of our government at any moment, and they look over their shoulders at the rich control possibilities offered by these new systems. And there will come a time when, even in the West, even in our democratic societies, our governments will be tempted to annex these capabilities and use them over us and against us. Let's not be naive about that. When we decide to resist surveillance capitalism, right now, while it lives in the market dynamic, we are also preserving our democratic future and the kinds of checks and balances that we will need going forward in an information civilization, if we are to preserve freedom and democracy for another generation.

The European Union already has legislation with which it regularly raps American tech businesses over the knuckles. Google was fined several billion dollars because it had abused its monopoly position. In theory, European citizens are protected against data robbery by regulations for fair competition and by the GDPR, the General Data Protection Regulation.

And do you think that the EU data protection law is a step in the right direction, or is it sufficient?

We have privacy laws, and the GDPR is the furthest frontier of our privacy laws. And we have antitrust laws. The thing is that, as important as these laws are, we still need more, because surveillance capitalism is unprecedented. And so we're going to need laws and regulatory regimes that respond directly to these new, unprecedented operations. GDPR talks a lot about data ownership, data accessibility, and data portability. These are very important things, if the only data that we're talking about is the data in the public operation, the data that we've given to the corporations. But as we have seen, most of the data that they use to feed into the factories, to produce the predictions, are data that we haven't given.
Or if we have given them, we don't even know that we've given them, because they came in our exclamation points, they came in our walk in the park, in the cadence of our voice and the timbre of our voice, all rich predictive signals. So no matter how much data ownership we claim, or accessibility, or portability, most of the data is in the shadow operation. And the shadow operation is never coming to us. We will never get those data. They claim them as data that they own. They took it from our lives. They took it from our private experience without our permission. They analyzed the data, they made it into products, they sold the products and they took the profit. Illegitimate profit, because they took it at the beginning without asking, without our knowledge. Recall: bypassing our awareness.

Does the EU have a chance against the behemoth of surveillance capitalism?

These 20 years have been the honeymoon for surveillance capitalism, because they have been 20 years largely unimpeded by law. Unimpeded by law. Why? Well, the most important reason is that they're doing things that have never been done before, so there are no laws against it. Just like factories employing children, you know, in a mass-production factory. There hadn't been anything like that before, and there were no laws against it. And it took our societies a while to sort of wake up. We stood up against the extractive companies and their violence, the companies of what came to be known as the Gilded Age, the companies whose leaders we eventually turned around and called robber barons. That's not what they were called at the time. At the time they were worshipped as these, you know, wealthy gods and geniuses who knew how to wield the might of machines and capital. If we had been trying to stop it, to curtail it, to outlaw it, for every one of these 20 years, and we had utterly failed, then I might, you know, put my chin in my hand and say, gosh, I'm starting to feel like things might be really bad. But that's not the case. We haven't even tried yet to stop it. Look, surveillance capitalism is 20 years old. Democracy is several centuries old. I bet on democracy.

Don't forget to subscribe to our channel and we'll keep you updated on our documentaries.