007: Surveillance Capitalism #Tech Series

 

Companies and governments have gained new powers through a problematic market that collects data in order to predict and influence human behavior. Many people are not even aware of, let alone have consented to, the extent to which they are being tracked and analyzed. The boundaries between privacy, security, and convenience are blurred. Are we sleepwalking into a disaster, or will companies embrace a more ethical use of our information?

Listen & subscribe to the podcast on Anchor | Spotify | Google Podcasts | Apple Podcasts


Introduction

00:00 - The technology era has changed the course of human history in many ways, some of whose long-term effects we’re not aware of yet. With technology come new ways to do old things, and we all know nosy neighbors have been around since the dawn of time. But what happens when peeping Toms turn into tracking cookies? Or when living on the grid turns into living in a lens? Technology has fueled an unprecedented level of surveillance, launching the art of spying into its own form of business. We’re seeing a shift from product to smart product and from service to personalized service. In this episode we’ll break down what surveillance capitalism is, how we got here, the problems it creates, and possible solutions. So cover up your webcam, switch to private browsing mode, and let’s dive into the first episode of the tech series.


Dubbed “surveillance capitalism” (and abbreviated in this podcast as “survcap”), this machine now touches almost every economic sector: everything from security in public transportation and the military down to advertisements in online shopping. It has powerful corporations predicting, and even controlling, our behavior, then massively profiting by buying and selling this information. These corporations and private interest groups, dubbed “Big Other”, have been able to collect vast and concentrated amounts of knowledge about people, all without democratic or legal oversight. Recently the GDPR changes in Europe had marketing managers scrambling to be in compliance. The purpose is for users to have a choice about how their data is used and to be fully informed about its uses, so they can make the right decision for themselves. If that freakout happened for just emails, imagine the impact of regulating larger industries. Sure, we could point the finger at the big bad Facebooks and Googles of the world, but what about the role consumers play? Consider how easily privacy is given up for the sake of convenience, for the illusion of security, or simply because something is free. You save your passwords in the browser so you don’t forget them, stash your fingerprints in your phone, opt in to suggested products based on your shopping history, hand over your email for freebies, give Google Maps your home address to shave a few seconds off typing it in each time, and so much more. Think of all the “smart” or voice-activated devices you have, and how many of them are networked with each other, constantly sharing your information.


The Definition

02:50 - Before we spiral out into post-apocalyptic scenarios for society, let’s clarify what surveillance capitalism is. A great definition is offered up in Shoshana Zuboff’s book “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power”.

Zuboff defines it as a new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales. It basically takes private experiences and turns them into behavioral data. Sometimes that data is used to improve products or services, such as a health-tracking app that provides personalized recommendations. But what happens to the extra information that’s collected? The stuff you didn’t know you shared or didn’t give permission to share? This surplus data isn’t required for the app to function and doesn’t have an immediate use, but when consolidated, it can be just as powerful as the data you willingly submitted. It’s information like where you were when you opened the app, who your contacts are, which ads you skipped, and so on, and it all builds up into a profile. That surplus data is most likely put toward a bigger picture and more profitable ends. Throw in some fancy machine intelligence (like advanced machine learning or artificial intelligence) and out pop predictions of human behavior.
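To make the idea of “surplus” concrete, here’s a minimal sketch of a hypothetical health-tracking app (every name and field below is invented for illustration, not taken from any real product): the step count is the data the feature actually needs, while the location, contact count, and skipped ads are the collateral data that quietly gets consolidated into a behavioral profile.

```python
# Illustrative sketch only: a hypothetical step-counting app, not any real product's code.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class StepLog:
    """Data the user knowingly submits and the feature genuinely needs."""
    user_id: str
    steps: int
    logged_at: datetime


@dataclass
class SurplusEvent:
    """Collateral data the step counter does not need: where the app was opened,
    how many contacts were readable, which ads were skipped."""
    user_id: str
    opened_at: datetime
    location: tuple  # (latitude, longitude)
    contacts_count: int
    ads_skipped: list = field(default_factory=list)


def build_behavioral_profile(surplus: list) -> dict:
    """Consolidate surplus events into the kind of profile a prediction market could trade on."""
    return {
        "user_id": surplus[0].user_id,
        "usual_locations": {e.location for e in surplus},
        "active_hours": sorted({e.opened_at.hour for e in surplus}),
        "ad_fatigue": sum(len(e.ads_skipped) for e in surplus),
        "social_graph_size": max(e.contacts_count for e in surplus),
    }


if __name__ == "__main__":
    events = [
        SurplusEvent("u1", datetime(2020, 5, 1, 8), (43.65, -79.38), 212, ["ad_17"]),
        SurplusEvent("u1", datetime(2020, 5, 2, 22), (43.66, -79.39), 214, ["ad_03", "ad_09"]),
    ]
    # None of this was needed to count steps; that's the point of the surplus.
    print(build_behavioral_profile(events))
```

Notice that nothing in that profile is required to count your steps, which is exactly the point: the commercial value sits in the surplus, not the service.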

So you have behavioral surplus suppliers, but who are the customers buying predicted behavior in this new marketplace? Enterprises. They buy and trade the information, hoping to get a glimpse into how you’ll act in the future so they can profit. Third parties use this data for revenue streams, fueling a survcap ecosystem that is designed to be hidden. It’s happening in a way meant to bypass our awareness and is engineered to keep us ignorant. Surveillance capitalism is a rogue mutation of capitalism involving concentrations of wealth, knowledge, and power.

Now that we know what the game is and how it’s played, let’s dive into the players (and no, it’s not us, the people. We are merely the tokens and pieces).

Google invented and perfected survcap. In her book, Zuboff describes how Google, in its early days, used the keywords that people typed in to improve its search engine but didn’t pay attention to the collateral data, the surplus: information like users’ keyword phrasing, click patterns, and spellings. Pretty soon, however, Google got hip to the value of this data and began harvesting the surplus and combining it with things like your web-browsing activity, all to figure out what your interests are and target you with ads. As a pioneer of surveillance capitalism, Google had the deep pockets for expansive research and development, which led to extensive experimentation and implementation. It wasn’t long before Facebook and Microsoft joined the ranks, and many are currently keeping an eye on the big A’s: Amazon and Apple. These players have been making moves without restriction from law or competitors; Zuboff compares it to “an invasive species in a landscape free of natural predators”. Survcap has expanded beyond the tech industry as more sectors, firms, startups, app developers, and investors get in on the information game. For example, auto insurance companies can know how you’re driving in real time and reward, or punish, you based on your driving performance. Workplace wellness programs charge higher health insurance premiums to employees who decline to wear fitness trackers. Groups mount political influence campaigns on social media platforms.


How Did We Get Here?

06:23 - Knowing what survcap is makes you wonder how we got to the point where every like, comment, and share is tracked. The answer: both willingly and unwillingly.

Willingly

We did get here a bit willingly. The internet has become synonymous with social engagement and commerce. For many of those with access, it is now integrated into daily life, making some of us even dependent on it. Just as constant exposure to something can normalize it, internet users have become accustomed to being tracked and data-mined. Some cynically chalk it up to being part of the internet territory, some pull the “I have nothing to hide” card, and others hope that ignoring it will make it go away. It won’t, no matter how overwhelmed, frustrated, or helpless you feel about Big Data.

Unwillingly

We also got here unwillingly. The mega companies pivoted from serving users to surveilling users. In the push to harvest more data, the companies sometimes bypassed privacy settings or made it difficult for users to opt out of data-sharing.

Surveillance capitalists know everything about us, yet their methods are deliberately designed to be hidden from us. They accumulate massive amounts of information from us, but this information is not available to us. Requesting or retrieving that data is a complicated process, if you’re even allowed access to it, and the lengthy Terms & Conditions are overwhelming and hard to understand at best.


The Consequences of Survcap

  • Competition spawning efficiency that doesn’t work in the consumer’s best interest

  • Blurred lines between public and private

  • Lack of legal guidance or formal legislation

  • Undermining of democracy


The Problem/The Consequences

07:51 - We’re here now and you’re thinking: life is pretty sweet. I can open my bookmarks on any device, and Ticketmaster sends me recommendations for concerts I might like. But there are some problems with survcap.


For one, competition spawns efficiency, and not the good kind

This predictive-behavior business is highly competitive. It drives surveillance capitalists to look for new sources of data, such as our voices, personalities, and even our emotions. Beyond just suggesting things to consumers, they can then coax or herd behavior toward profitable outcomes. The shaping of behavior at such a large scale is currently largely unregulated. How many times have you seen a company apologize for a data breach, send out some generic “we take your privacy seriously” email, and never get held accountable for the leak? Or a company blatantly admit that it had a microphone in a product consumers didn’t know about, then claim it didn’t intentionally keep it a secret? It then falls on the consumer to ensure their bank accounts haven’t been hacked, that someone hasn’t stolen their identity, or at the very least, to change all their passwords.

With access to these new data sources, some say survcap is a slippery slope that will evolve from automating information flows to automating the behavior of people: full-on behavior modification, right under our noses. With that much knowledge, one has to consider what can be done with that power.


A second problem of survcap is the blurred lines between public and private

We’ve also seen how freedom of speech was impeded by the control or restriction of communication on tools such as WeChat or Twitter, or worse, how those platforms were used to bring criminal charges against people. We’ve seen people get fired from their jobs based on what they said on social media or what causes they support in their free time. When does this all turn into a witch-hunt?


A third problem of survcap is the effect on laws and the legal system

Companies like Google and Facebook have moved so fast that the law and public institutions haven’t remotely caught up to them. At best, they are reactive to what happens, such as large data breaches.


The fourth problem survcap can create is the undermining of democracy through “Instrumentarian Power”

The evidence from how these big companies have acted leads us to believe that survcap is driven purely by profit, even at the expense of social norms and individual autonomy. Violating those means undermining democracy itself.

They just want our data. They don’t care what we believe, if we’re happy, if we’re sad, if we’re in pain, or if we’re in love. They only care that whatever we are and whatever we do, we interact with their supply sources. They are indifferent to the content of our behavior. They just want to have the data from our behavior.

Zuboff calls this instrumentarian power and makes it clear it’s not totalitarian. Totalitarian power acts through terror. Instrumentarian power wants to control you, but it doesn’t care about hurting you; it just wants to steer you toward its guaranteed commercial outcomes. Anybody with enough money can buy the skills and the data to use these same methodologies to influence political outcomes, thus undermining democracy.


10:48 - For those of you who like a good dystopian nightmare, you might see where this is going. For those of you who shake your metaphorical fists at “crazy conspiracy theorists”, you might want to step away. A huge concern with surveillance capitalism is that it could eventually lead to population-scale experiments in mass behavior modification. Zuboff suggests that games like Pokemon Go are dry runs for Google City. Pokemon Go, created by Niantic Labs, which began inside Google, shows at a smaller scale how you can herd populations, modify people’s behavior, and even steer them toward commercial outcomes. Apps like these have businesses paying for foot traffic the same way online advertisers pay for click-throughs. Now let’s scale that up. Google proposed building a “smart” city on Toronto’s waterfront. Sidewalk Labs, a sister company of Google, came up with a $50 million design for a dozen acres on the waterfront. The idea is to create “the world’s first neighborhood built from the internet up,” as Sidewalk describes it. The “smart city” would be a sensor-enabled, highly wired metropolis that can run itself. The data from a variety of systems would feed back into the city, which would constantly learn and optimize its own operations over time. In a city that runs on data and algorithms rather than decisions made by humans, questions are immediately raised: Who owns and controls all the data produced? Whose laws apply? What about the businesses that then pay to play in this real-life game? Urban-studies seminars will have a blast with this, I’m sure.


Are there solutions? You can…

  • Accept the status quo

  • Start holding companies accountable

  • Control your own privacy

  • Shift what you value


The Solution/What next

12:30 - All that doom and gloom of the eye of Sauron beaming down on you is enough to make you want to get a tiny house and live in the woods. But don’t worry… we’re not past the point of no return. For example, mass production initially had no laws to constrain it. There were unsafe working conditions, people were paid meager wages, and children worked in factories. It took decades, democratic resources, and people protesting, but eventually law and regulation shaped the entire industry. While survcap has gone pretty far in some sectors already, I do think it’s possible to rein it in and reshape it.

So what can you really do? Quit FB and Google? Demand accountability? Mope in apathy? Perhaps the solution is a mix of these… except the apathy part.


Accept it as-is

Take the stance that it’s too late to change, enjoy the benefits, and hope the Regulation Warriors will step in to keep Big Other in check.


Say “I like the benefits of big data but let’s look at the costs” and hold these companies accountable

With great power comes great responsibility, and the power of survcap is in the information that’s hoarded. So why not hold these mega-corps responsible for how they get our info and what they do with it? We’ve seen how they can completely change how information is distributed and consumed, even affecting political processes such as elections. One way to combat this is for people to hold the corps accountable. Force them to improve their crappy track records on privacy and to rebuild trust with users. Take action to ensure there’s legislation that makes the collection of this ancillary data illegal, or at the very least, requires disclosure of what is being collected. Ensure it’s regulated by clearly defining terminology such as “surplus data” and by setting consequences that can actually be enforced and will actually hurt a company if, or when, it violates the law. Let’s be real, a little $5M fine is nothing to a company that easily makes hundreds of billions of dollars. Fight for apps that let users opt out of granting permissions for non-essential functions, such as letting your Google keyboard also access your camera, contacts, and SD card.


Want another stance you can take? Take your privacy and data into your own hands as a consumer, at least as much as you can

Parts of these huge data-collection pipelines are being built by users themselves: us, the people opting into sharing our data. So control what you do share and learn more about what you don’t.

Simple tactics such as using one browser for personal activity and another for just surfing the web (who uses that phrase anymore?) can put you on the path to hiding in the shadows from Big Other’s gaze.

Look at the social media platforms you use and actually go through all those security settings, such as disabling ad targeting. Every internet-enabled device you have essentially feeds the supply chain, so review your phone, tablet, and laptop. See which apps have access to features such as the camera or microphone. I know they’re lengthy, but read, or at least skim, the Terms and Conditions. Demand that companies make these more transparent, concise, and clear. Some companies have started sending recap emails that explain changes in their Terms without any jargon.

If you work in a field that is a supply source of data, don’t participate in the technological advances needed to keep these companies’ surveillance running. Coders, artificial intelligence architects, and the like: withhold your innovations from the big corps. Consultancies: stop encouraging non-tech sectors to use survcap in nefarious ways.


Possibly the biggest change? Making a fundamental shift in what you value

Alternatives exist for every feature and app offered by these companies, and they are not hard to find. And yet, when consumers start to think about the costs, they gasp. Sometimes there are the costs of the products themselves, but more important are the switching costs that come with using a new product. These costs aren’t necessarily financial: they can be intangible, such as the time spent getting your family and friends onto a new secure messaging platform or learning how to use new apps. Some companies have even made privacy their distinguishing feature, and consumers respond pretty consistently: their actions show they will take free-with-surveillance over paid-with-privacy. People love free stuff, particularly when the harms are difficult to perceive.


Closing

17:05 - Knowledge and power will always be sought, even more so when there is profit to be made. Surveillance capitalism lies at the intersection of all three and is no longer confined to internet companies. After becoming the default model for most internet-based businesses, it’s now expanding into the offline world. The same mechanisms that track your online clicks and likes can be used to track your jog, your chat at breakfast, or how long you looked for a parking space. As technology advances, it continues to blur the lines between public and private, so it’s up to us to define those lines instead of passively paying for our own domination. Safety and freedom aren’t an either-or choice. You can have both. You can even have both with privacy. Let me know what you think: reach out on Twitter or Instagram @sublationstudio or leave a comment on the transcript page. This episode will self-destruct in 3...2...1…

If you’re totally creeped out by this, or maybe you see the benefits, let me know on Twitter or Instagram @sublationstudio. Links and resources are available with the transcript at sublationstudio.com/podcast