Wednesday, February 17, 2021

The Age of Surveillance Capitalism - book review


 

This is the second time I've listened to an audiobook and then gone and purchased a hard copy (the previous one was Data and Goliath), and much like it, I feel this book would be beneficial for just about anyone to read. Just as Weapons of Math Destruction should be read by anyone who works as a data scientist (or, frankly, anyone developing an "AI" solution of any sort), this book should be read by any software engineer, especially one working at a data-hogging company (and yes, most people might not notice how much data their product is collecting until they read this book, so just read it).

One thing to remember before we dive into the book: the first and by far the most important takeaway is that breaching our privacy is not about knowing our secrets; it's about manipulation, coercion, and control.

The book itself is divided into three parts depicting the advancement of surveillance capitalism: from its foundation, mostly attributed to Google, to the ways it has moved to influence the physical world and then the social world, each step providing it with more instrumentation capabilities. The main challenge the book raises is that surveillance capitalism is "unprecedented"; that is, we've yet to see anything like it, and therefore the discourse about it is unfit and lacking, hindering us from grasping the potential cost our society is paying, from understanding the mechanisms behind its operation, and from drawing boundaries in the appropriate places. The fourth part (the author counts four) is a conclusion.

The term "surveillance capitalism" is not coincidental, nor is it neutral - the basic claim in this book is that like the early capitalist "robber barons" who, in the name of making profit have committed atrocities and invoked their "right to private property" to defend their ability to extract  wealth out of unclaimed lands or "freedom of contract" as a way to defend child labor or minimal wages with horrid conditions, so does the surveillance capitalism harm society in their actions while hiding behind verbal jiu-jitsu and "freedom of speech\press" 

But what is surveillance capitalism anyway? While the book is a bit more detailed, the short answer is that it's an industry that harvests personal information and produces predictions of human behavior (namely, which ads we'll click, which products we'll buy, and which message will convince us to vote). In order to improve those predictions, it moves on to shaping said behaviors. No need to reach for your tin-foil hats, but there is reason to be worried.
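To make "predictions of human behavior" a bit more concrete for the engineers in the room: at its core, this is ordinary machine learning over behavioral data. The toy sketch below is my own illustration, not something from the book; the feature names and numbers are invented, and real systems are vastly larger, but the shape of the product being sold is roughly this: a probability that a given person will click, buy, or be swayed.

```python
# Toy illustration (my assumption of the general shape, not the book's or any company's actual system):
# predicting ad clicks from a handful of made-up behavioral signals.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is a "user moment": [pages_viewed_today, seconds_on_site, ads_seen, past_clicks]
X = np.array([
    [3, 40, 2, 0],
    [12, 300, 8, 3],
    [1, 10, 1, 0],
    [8, 220, 5, 2],
    [15, 500, 9, 4],
    [2, 25, 2, 0],
])
y = np.array([0, 1, 0, 1, 1, 0])  # did the user click the ad?

model = LogisticRegression().fit(X, y)

# The "prediction product": how likely is this user to click right now?
new_user = np.array([[10, 260, 6, 1]])
print(f"predicted click probability: {model.predict_proba(new_user)[0, 1]:.2f}")
```

The point of the sketch is only that more behavioral data means better predictions, which is exactly the incentive the book describes: once prediction accuracy is the product, the pressure is to collect more and, eventually, to nudge behavior so it becomes easier to predict.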

Here is where I could summarize each chapter, but such a summary would be much shorter than the book and wouldn't do it justice, so instead I'll share some of the key points that have stuck with me so far.

  • The first is that the entire business is shrouded in euphemisms and intentional misdirection: from terms such as "data exhaust" or "digital breadcrumbs", which hint that the data about our online activity is useless and that turning it into something useful is not only harmless but benign, similar to recycling, to "personalization", which deliberately masks the fact that the profiling is not done for the benefit of the user.
  • We are not the product; we are the resource being mined and cultivated for the raw materials that compose the product. We are not Ford's Model T, or even the metal used to build it; we are the ore being mined for our data. While the book does not go there, it's quite clear to me that this setup is a strong incentive to squeeze us for as much as possible: if the raw materials are second grade, the quality of the product suffers, but the source of those materials is far enough down the chain that it's cheaper to dig deeper and mine more than it is to maintain and cultivate it.
  • "Science finds, Industry applies, Man conforms": this motto from the 1933 Chicago World's Fair is a succinct reminder of how much technology changes the way we behave: we use calculators, we drive with a navigation app, and we prioritize communicating with people far away over the people sitting with us right now, simply because they ping us through our phone. In a similar manner, we've all grown accustomed to the various services provided by the surveillance capitalist firms, and we've learned to depend on them.
  • Breaking the law is a business strategy. The playbook goes roughly like this: invent something never before seen, or at least not explicitly and clearly covered by law, and start doing it. When you get sued, or when attempts to enforce existing regulatory mechanisms begin, delay. Finally, claim that people are relying on the service you provide (and have been providing during the entire delay period). It can be Google Street View dragging out court orders for years while continuing to operate and update, infringing on people's privacy in the meanwhile, and we can see the same pattern with Uber and Lyft trying to fend off their drivers' employee rights by delaying and extending the litigation process as much as they can.
  • Surveillance capitalism is pushing towards omnipresence. Thanks to IoT and "smart" everything, there are multiple sources of information about just about anything: "smart" cities sharing our location data with Uber & co. under the excuse of optimized traffic routing, our sports heart-rate monitor sharing health information with a running app, Roomba-like devices mapping our apartments, and the Nest thermostat communicating with our door lock and keeping track of our daily routine.
  • It's not about knowing; it's about nudging. Facebook ran an experiment on increasing voter turnout, and Pokémon Go herded people to stores that paid for the privilege. While this isn't mind control, imagine the following scenario: next election, Facebook uses its own data to determine each user's expected vote and shows the effective get-out-the-vote promotions only to those voting the "right" way, tipping the scale just a bit. Too far-fetched? What if it were about "encouraging" people to vote against a law that would limit Facebook's ability to make money?
  • Industrial capitalism, while devastating to nature and natural resources, was rather beneficial to people, since it relied on an ever-increasing scale of production, which meant affordable products (also, if people can't buy, it's worth paying them a salary so that they'll have money to spend on luxuries). Surveillance capitalism, on the other hand, is parasitic by nature: it depends on scale of extraction. Surveillance capitalism companies make their profit from the few rather than the many, and while today most of the revenue comes from product advertising, even in the remote case where no product is advertised at all, the market for behavioral predictions and influence will always have buyers: every political actor, be it formal candidates, NGOs, or even dissidents and malicious actors.
  • Not that we didn't know this beforehand, but users are forced to sign unfair and unreasonable "contracts" and submit to a misleadingly named "privacy policy" (or, as the book calls it, a "surveillance policy") which can change at the company's will at any time. Refuse to submit? You are denied service on a disproportionate scale, sometimes being forced to stop using a product you purchased, or to keep it non-functional and even dangerous.
  • Last, and probably the most important point: it does not have to be this way. Search queries are not stored because "that's how search engines work"; they are stored because the company profits from storing and using them. Mining our email to enrich our hidden user profile is not just the price of getting a "free" inbox that syncs to our "free" calendar; it's a dangling bait to make sure we keep providing the company with more data, both about ourselves and as realistic samples for building datasets for various ML training, which in turn enhances its ability to access more data. Since the way things currently are is a choice, it can be changed into a more balanced and fair system.

 

The book, while remarkable, is not free of weak spots. People who take it at face value can get the impression that surveillance capitalism controls every aspect of our lives, to the point where we lose the will to will (there is, in fact, a chapter making this very claim). While I don't think it's wise to dismiss this risk, we are not there yet. Most manipulation tactics used to nudge people's actions are not super-effective, and people devise defense mechanisms against them, so the effect erodes over time (think of how accustomed you've become to glancing over web ads without really noticing them, and how quickly you categorize them as such). In this case, I believe it's best to look at intentions: even if the power is limited, nudging power is what these companies sell, so it's fair to assume they'll get better at it over time, and even if they never reach a truly scary level, the attempt to perfect coercive power is harmful in itself.

Another thing I found less effective is the book's somewhat Marxist feel. The terminology it tries to coin in opposition to surveillance capitalism is very similar to the terminology used to oppose industrial capitalism: "means of production" is scattered throughout the book, a good deal of space is dedicated to the "division of learning", and the author tries to convince the readers (I'm not sure how successfully) that the essential questions we should be asking at all times are "Who knows? Who decides? Who decides who decides?" While reading, I found myself agreeing with the message while at the same time feeling my defense mechanisms raised high by this choice of words. Perhaps it's by design; perhaps it's just an effect of coming from a different milieu, which makes me resist these ideas because of this wording.

 

So, to sum everything up: I found this book important mostly because it is a very good first step. It's a first step towards creating a critical language for evaluating data collection and usage, and a first step towards reclaiming our right to privacy and control. I hope it will foster a deeper, clearer discussion about how we move forward to create the right sort of technology, supported by appropriate regulation, to make a better society.
