Tools:

Digital Online Checker 

Sample profile

How to Check Your Digital Footprint

Or, just enter your name, city, state into a web browser…
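For a quick, repeatable version of that manual self-search, here is a minimal Python sketch (the search engine, the query format, and the footprint_search_url helper are illustrative assumptions, not part of any tool listed here):

```python
from urllib.parse import quote_plus
import webbrowser

def footprint_search_url(name: str, city: str, state: str) -> str:
    """Build a search-engine URL for a quick self-search.

    The engine and the quoted-name query shape are only examples; any
    engine that accepts a ?q= parameter works the same way.
    """
    query = f'"{name}" {city} {state}'
    return "https://duckduckgo.com/?q=" + quote_plus(query)

if __name__ == "__main__":
    url = footprint_search_url("Jane Doe", "Springfield", "IL")
    print(url)            # inspect the query before opening it
    webbrowser.open(url)  # opens the search in your default browser
```

Running it with your own details shows roughly the same first page of results a stranger would see.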

DeleteMe

DeleteMe Review

The Goal is to Automate us…

nobody in Mainz (Gutenberg’s home town) in, say, 1495 could have known that his technology would (among other things): fuel the Reformation and undermine the authority of the mighty Catholic church; enable the rise of what we now recognise as modern science; create unheard-of professions and industries; change the shape of our brains; and even recalibrate our conceptions of childhood. And yet printing did all this and more.

Google, Facebook et al were doing nothing less than spawning a new variant of capitalism… surveillance capitalism.

“Surveillance capitalism,” she writes, “unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”

First of all there was the arrogant appropriation of users’ behavioural data – viewed as a free resource, there for the taking. Then the use of patented methods to extract or infer data even when users had explicitly denied permission, followed by the use of technologies that were opaque by design and fostered user ignorance…the entire project was conducted in what was effectively lawless – or at any rate law-free – territory.

Nearly every product or service that begins with the word “smart” or “personalised”, every internet-enabled device, every “digital assistant”, is simply a supply-chain interface for the unobstructed flow of behavioural data on its way to predicting our futures in a surveillance economy.

capitalism, marked by taking things that live outside the market sphere and declaring their new life as market commodities [like user personal data].

information technology produces new knowledge territories by virtue of its informating capability, always turning the world into information. The result is that these new knowledge territories become the subject of political conflict. The first conflict is over the distribution of knowledge: “Who knows?” The second is about authority: “Who decides who knows?” The third is about power: “Who decides who decides who knows?”

surveillance capitalism depends upon undermining individual self-determination, autonomy and decision rights for the sake of an unobstructed flow of behavioural data to feed markets that are about us but not for us.

choice mechanisms we have traditionally associated with the private realm are eroded or vitiated. There can be no exit from processes that are intentionally designed to bypass individual awareness and produce ignorance, especially when these are the very same processes upon which we must depend for effective daily life. So our participation is best explained in terms of necessity, dependency, the foreclosure of alternatives, and enforced ignorance.

What is Surveillance Capitalism…

Surveillance capitalism describes a market-driven process where the commodity for sale is your personal data, and the capture and production of this data relies on mass surveillance of the internet.

…the global architecture of computer mediation […] [which] produces a distributed and mostly uncontested new expression of power that I christen: “Big Other”.

Currently, the biggest “Big Other” actors are Google, Amazon, Facebook and Apple. Together, they collect and control unparalleled quantities of data about our behaviours, which they turn into products and services.

…wearables, smart home devices, drones, connected toys and automated travel. Sensors such as microphones, cameras, accelerometers, and temperature and motion sensors add to an ever-expanding list of our activities (data) that can be collected and commodified.

Surveillance Capitalism Apathy

surveillance apathy… it’s something that may be particularly significant for marginalised communities, who feel they hold no power to navigate or negotiate fair use of digital technologies.

data surveillance “could be used to hide more explicit forms of discrimination”.

ignorance and cynicism are often behind surveillance apathy. Users are either ignorant of the complex infrastructure of surveillance, or they believe they are simply unable to avoid it.

in contrast to the oppressive panopticon (a circular prison with a central watchtower) as envisioned by philosopher Jeremy Bentham, we have what Siva Vaidhyanathan calls the “cryptopticon”. The cryptopticon is “not supposed to be intrusive or obvious. Its scale, its ubiquity, even its very existence, are supposed to go unnoticed”.

Surveillance apathy can be linked to people’s dependence on “the system”. As one of my media students pointed out, no matter how much awareness users have regarding their social media surveillance, invariably people will continue using these platforms. This is because they are convenient, practical, and “we are creatures of habit”.

At the Frontiers of Surveillance Capitalism

In her new book, The Age of Surveillance Capitalism, Zuboff explains that Silicon Valley firms are looking to wearable technologies and other smart devices to gain an increasingly detailed view of our physical and emotional health.

by refusing to acknowledge the continuities between past modes of exploitation and the latest horrors of surveillance capitalism, she ultimately leads readers away from the most promising paths of resistance.

It is in her discussion of market democracy that the limitations of Zuboff’s analysis come to the fore. For her, the market—before the rise of platform monopolies (and, to a lesser degree, neoliberalism)—was characterized by individual liberty and free choice. Accordingly, she is uninterested in how surveillance might deepen the forms of exploitation and coercion that always structured market capitalism, particularly for marginalized and racialized communities.

The problem with surveillance capitalism is as much the capitalism as it is the surveillance.

Now, the enduring questions of authority and power must be addressed to the widest possible frame… information civilization.

In Google’s early days, she explains, the company linked advertising only to search queries. Meanwhile, the vast quantities of data that it gathered about particular users (including “the number and pattern of search terms…dwell times, click patterns, and location”) were used only to improve users’ experience. The 2003 patent [filed by Google], however, promised to convert that “data exhaust” into “behavioral surplus” that could be used to increase the precision of targeted advertising, a much more lucrative venture. This approach to data collection became so successful, she argues, that it led to a new logic of accumulation: From 2003 on, Google was on a quest to gather and monetize as much user data as possible.
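As a toy illustration of the shift described above (the field names, weights, and the ad_relevance_score function are all assumptions; nothing here reflects Google's actual systems), the same event data that could merely improve search results can instead be scored as "behavioral surplus" for ad targeting:

```python
from dataclasses import dataclass

@dataclass
class SearchEvent:
    # Fields mirror the kinds of signals named in the passage; purely illustrative.
    query: str
    dwell_seconds: float   # how long the user lingered on a result
    clicks: int            # click pattern, reduced here to a count
    location: str          # coarse location label

def ad_relevance_score(event: SearchEvent, ad_keywords: set, ad_region: str) -> float:
    """Toy 'prediction product': estimate how likely this user is to respond
    to an ad, using surplus signals (dwell, clicks, location) on top of the
    query itself. The weights are arbitrary assumptions."""
    score = 0.0
    if any(word in event.query.lower() for word in ad_keywords):
        score += 1.0                                   # the query-only signal
    score += min(event.dwell_seconds, 60) / 60 * 0.5   # behavioral surplus
    score += min(event.clicks, 5) / 5 * 0.3
    if event.location == ad_region:
        score += 0.2
    return score

event = SearchEvent("best hiking boots", dwell_seconds=42, clicks=3, location="Denver")
print(ad_relevance_score(event, {"hiking", "boots"}, "Denver"))
```

The point of the sketch is only the asymmetry Zuboff describes: the extra signals cost the user nothing to emit, yet they are what make the targeting, and the revenue, precise.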

To mine dark data, Google, Facebook, and others are developing smart homes and wearable devices, self-driving cars, drones, and augmented reality. They’re even striving to monitor the body’s inner workings through digestible sensors and to map a person’s inner life through so-called emotion analytics.

Destroy Surveillance Capitalism

What if the trauma of living through real conspiracies all around us — conspiracies among wealthy people, their lobbyists, and lawmakers to bury inconvenient facts and evidence of wrongdoing (these conspiracies are commonly known as “corruption”) — is making people vulnerable to conspiracy theories?

Taming Big Tech is integral to fixing the internet, and for that, we need digital rights activism.

She’s right that capitalism today threatens our species, and she’s right that tech poses unique challenges to our species and civilization, but she’s really wrong about how tech is different and why it threatens our species.

“tech exceptionalism”: Now that tech has infiltrated every corner of our lives and our online lives have been monopolized by a handful of giants, defenders of digital freedoms are accused of carrying water for Big Tech, providing cover for its self-interested negligence (or worse, nefarious plots).

what is absolutely true is that ad-driven Big Tech’s customers are advertisers, and what companies like Google and Facebook sell is their ability to convince you to buy stuff. Big Tech’s product is persuasion. The services — social media, search engines, maps, messaging, and more — are delivery systems for persuasion.

Surveillance capitalism assumes that because advertisers buy a lot of what Big Tech is selling, Big Tech must be selling something real. But Big Tech’s massive sales could just as easily be the result of a popular delusion or something even more pernicious: monopolistic control over our communications and commerce.

The impact of dominance far exceeds the impact of manipulation and should be central to our analysis and any remedies we seek.

Segmenting & Targeting:

This is seriously creepy. But it’s not mind control. It doesn’t deprive you of your free will. It doesn’t trick you.

Because targeting improves the yields on political pitches, it can accelerate the pace of political upheaval by making it possible for everyone who has secretly wished for the toppling of an autocrat — or just an 11-term incumbent politician — to find everyone else who feels the same way at very low cost. This has been critical to the rapid crystallization of recent political movements including Black Lives Matter and Occupy Wall Street as well as less savory players like the far-right white nationalist movements that marched in Charlottesville.

It’s important to differentiate this kind of political organizing from influence campaigns; finding people who secretly agree with you isn’t the same as convincing people to agree with you.

Deception

Surveillance capitalism also abets fraud by making it easy to locate other people who have been similarly deceived, forming a community of people who reinforce one another’s false beliefs.

This is pernicious and difficult — and it’s also the kind of thing the internet can help guard against by making true information available, especially in a form that exposes the underlying deliberations among parties with sharply divergent views, such as Wikipedia. But it’s not brainwashing; it’s fraud.

Domination

Surveillance capitalism is the result of monopoly. Monopoly is the cause, and surveillance capitalism and its negative outcomes are the effects of monopoly. I’ll get into this in depth later, but for now, suffice it to say that the tech industry has grown up with a radical theory of antitrust that has allowed companies to grow by merging with their rivals, buying up their nascent competitors, and expanding to control whole market verticals.

Zuboff calls surveillance capitalism a “rogue capitalism” whose data-hoarding and machine-learning techniques rob us of our free will. But influence campaigns that seek to displace existing, correct beliefs with false ones have an effect that is small and temporary while monopolistic dominance over informational systems has massive, enduring effects. Controlling the results to the world’s search queries means controlling access both to arguments and their rebuttals and, thus, control over much of the world’s beliefs. If our concern is how corporations are foreclosing on our ability to make up our own minds and determine our own futures, the impact of dominance far exceeds the impact of manipulation and should be central to our analysis and any remedies we seek.

Bypassing Our Rational Faculties

This is the good stuff: using machine learning, “dark patterns,” engagement hacking, and other techniques to get us to do things that run counter to our better judgment. This is mind control.

Some of these techniques have proven devastatingly effective (if only in the short term). The use of countdown timers on a purchase completion page can create a sense of urgency that causes you to ignore the nagging internal voice suggesting that you should shop around or sleep on your decision. The use of people from your social graph in ads can provide “social proof” that a purchase is worth making. Even the auction system pioneered by eBay is calculated to play on our cognitive blind spots, letting us feel like we “own” something because we bid on it, thus encouraging us to bid again when we are outbid to ensure that “our” things stay ours.
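To make the countdown-timer example concrete, here is a minimal sketch (plain Python; the FakeCountdown class and its behavior are illustrative assumptions, not taken from any real storefront) of why such timers manufacture urgency rather than report it: the "deadline" is generated per visitor and quietly restarts when it lapses:

```python
import time
from datetime import datetime, timedelta

class FakeCountdown:
    """Per-visitor 'offer expires' timer.

    Nothing about the product or price changes when it lapses; the
    deadline exists only to create urgency, and it quietly restarts,
    which is what makes it a dark pattern.
    """

    def __init__(self, minutes: int = 10):
        self.minutes = minutes
        self.deadline = datetime.now() + timedelta(minutes=minutes)

    def banner(self) -> str:
        remaining = self.deadline - datetime.now()
        if remaining.total_seconds() <= 0:
            # A genuine scarcity signal would end here; this one resets.
            self.deadline = datetime.now() + timedelta(minutes=self.minutes)
            remaining = self.deadline - datetime.now()
        mins, secs = divmod(int(remaining.total_seconds()), 60)
        return f"Offer ends in {mins:02d}:{secs:02d} -- buy now!"

if __name__ == "__main__":
    timer = FakeCountdown(minutes=1)
    for _ in range(3):
        print(timer.banner())
        time.sleep(1)
```

Because nothing about the offer changes when the timer runs out, the only thing it really measures is how long the visitor has been hesitating.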

The vulnerability of small segments of the population to dramatic, efficient corporate manipulation is a real concern that’s worthy of our attention and energy. But it’s not an existential threat to society.