The role of ignorance in corporate surveillance

Corporate spies don’t want to be noticed

Jeremy Bentham’s Panopticon prison is often used as a metaphor for the evils of online surveillance (e.g. Bartlett 2018, 27–28; Kosta et al. 2011; Campbell and Carlson 2002). The prison is a circular building with a central inspection tower surrounded by cells (see fig. 1). In the prison, each inmate is perfectly visible and perfectly individualised (Foucault 1979, 200). It was designed so that a small number of guards could look into any cell at any time without being seen by the prisoner. Knowing that they could be observed at any moment, but never when, the prisoners had to assume the guard was always watching, and behave accordingly. In essence, the prisoners guarded themselves. In addition, each inmate was isolated and confined to their cell. This isolation prevented any collective action against the small number of guards, and meant that there were no disputes between inmates to manage (Bentham 1843, 60–64).

Whilst the panopticon is appealing at first glance, it is a flawed metaphor for corporate surveillance. Corporations do collect data on our behaviour largely unnoticed, and they use this data to give us individualised (or ‘personalised’) experiences. However, Foucault identified ‘the major effect of the panopticon’ to be ‘to induce in the inmate a state of conscious and permanent visibility that assures the automatic functioning of power’ (1979, 201). The Panopticon’s effect relied on the inmates knowing that they could be observed at any time; corporations, by contrast, are not shouting from the rooftops about the data they collect on us. Google, for example, became incredibly evasive when a hidden microphone was found in its Nest home security system (Telford 2019). Corporations don’t want us to know what data they collect, or how valuable it is, because otherwise their business model would collapse.

Ignorance is essential to the surveillance capitalist business model

Simply put, the business model of online corporate surveillance is to collect the waste data from our lives, and to use this data to ‘predict and modify human behaviour as a means to produce revenue and market control’ (Zuboff 2015). This business model is known as surveillance capitalism.

This waste data is known as behavioural surplus. Corporations collect more data than they need to serve us with what we want, for example, results for a search query (Zuboff 2019, 14). They even collect more data than they need to improve their products and services. They might collect common misspellings of a search term, how fast you type, your location, which colour of button you prefer, and much more.
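
To make this concrete, here is a purely illustrative sketch of what a single search query’s behavioural surplus might look like. The event structure and every field name are invented for this example; they do not reflect any real company’s telemetry schema.

```python
# Illustrative only: a hypothetical telemetry event for one search query.
# All field names are invented; no real company's schema is being shown.
search_event = {
    # Data actually needed to answer the query:
    "query": "weathr london",
    # Behavioural surplus -- collected beyond what answering the query requires:
    "raw_keystrokes": ["w", "e", "a", "t", "h", "r"],  # captures the misspelling
    "ms_between_keystrokes": [112, 95, 143, 88, 101],  # how fast you type
    "location": {"lat": 51.5074, "lon": -0.1278},      # where you are
    "ui_variant": "blue_buttons",                      # which design you saw
    "dwell_time_ms": 5320,                             # how long you lingered
}
```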

This behavioural surplus is fed into machine learning models that predict our behaviour. Corporations also run experiments on us, to find out how they can nudge and herd our behaviour towards profitable outcomes. In Zuboff’s words, they no longer merely ‘automate information flows about us; the goal is now to automate us’ (Zuboff 2019, 14, emphasis in original).
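
A minimal sketch of the prediction step, assuming scikit-learn is installed. The features, data and labels here are all invented for illustration; real systems use far richer surplus and far larger models.

```python
# A toy behavioural-prediction model. Everything here is invented for
# illustration; it is not any real company's pipeline.
from sklearn.linear_model import LogisticRegression

# Each row is one user's behavioural surplus:
# [typing speed (chars/sec), misspelling rate, night-time browsing share]
X = [
    [4.1, 0.02, 0.10],
    [2.3, 0.15, 0.60],
    [5.0, 0.01, 0.05],
    [1.8, 0.20, 0.70],
]
# Label: did the user click the promoted product? (the profitable outcome)
y = [0, 1, 0, 1]

model = LogisticRegression().fit(X, y)

# Predict how likely a new user is to click -- and therefore how much it
# is worth spending to nudge them.
new_user = [[2.0, 0.18, 0.65]]
print(model.predict_proba(new_user)[0][1])  # probability of a click
```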

Consumers’ ignorance is essential to this business model. If we knew the value of our behavioural surplus, we might not give it away so easily. If we were aware that the experiments were to change behaviour towards profit-making outcomes, we might choose to behave differently. If we realised that corporations were attempting to ‘automate us’ we might take steps to resist this automation.

The enlightened and able opt out of behavioural advertising

The Digital Advertising Alliance launched the AdChoices program in 2010 to enable consumers to opt out of behaviourally targeted advertising, that is, advertising served to you on the basis of these predictive models and your behavioural surplus. Signing up to AdChoices is rather like adding oneself to the Federal Trade Commission’s ‘Do Not Call’ registry, to opt out of marketing calls wholesale. Each advertisement carries a clickable AdChoices icon, which takes the person to information on behavioural advertising before inviting them to opt out.

Figure: The AdChoices process (TRUSTe 2016)
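
In practice an AdChoices opt-out is recorded as a cookie in the user’s browser, which participating ad companies are then expected to honour. Here is a minimal sketch of what honouring that preference could look like; the cookie name and functions are hypothetical, and real implementations vary by company.

```python
# A minimal sketch of honouring an opt-out preference. The cookie name is
# hypothetical; real ad companies each set and check their own cookies.
OPT_OUT_COOKIE = "ad_optout"

def serve_contextual_ad() -> str:
    return "ad chosen from the current page's content only"

def serve_behavioural_ad() -> str:
    return "ad chosen from the user's behavioural profile"

def choose_ad(cookies: dict) -> str:
    """Serve a contextual ad to opted-out users, a behavioural ad otherwise."""
    if cookies.get(OPT_OUT_COOKIE) == "1":
        # No behavioural profile may be used or updated for this user.
        return serve_contextual_ad()
    return serve_behavioural_ad()

print(choose_ad({"ad_optout": "1"}))  # -> contextual ad only
```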

Johnson, Shriver, and Du (2020) conducted an observational study to determine who opts out of behavioural advertising in the United States. They found that, in contrast to survey data suggesting that approximately 66% of consumers disapprove of behaviourally targeted advertising (Purcell, Brenner, and Rainie 2012; McDonald and Cranor 2010; Turow et al. 2009), only 0.23% of advertisement impressions in the US came from opt-out users. The literature contains much discussion of this privacy paradox, in which people claim to care about privacy but don’t act as if they do. Some research suggests that the paradox exists (Lee, Park, and Kim 2013; Norberg, Horne, and Horne 2007), whilst other work claims it does not (Martin 2020; Tsai et al. 2010).

In the case of AdChoices, ignorance of the scheme plays a pivotal role in the low opt-out rates. Johnson, Shriver, and Du (2020) report that people often seemed to confuse AdChoices with adware¹. Eight of the top nine AdChoices-related search terms asked how to ‘remove AdChoices’, and the top search results explained how to remove malware (Johnson, Shriver, and Du 2020). This reflects how little awareness people had of the scheme’s purpose, and the same study found that opt-out rates fell where awareness of the AdChoices icon was low.

It’s clear, however, that had people understood what AdChoices was, they would have been more likely to opt out. In contrast to the 5.4 million Americans (1.7% of the US population) who have opted out through the AdChoices scheme, the analogous ‘Do Not Call’ registry has 72% of Americans registered (Bush 2009). The top two ad blockers on Chrome each have over 20 million daily users, and the largest on Firefox has 19 million (Johnson, Shriver, and Du 2020). In the United States, ad impressions were being blocked at a rate of 16% (PageFair and Adobe 2015), some 70 times the AdChoices opt-out rate (Johnson, Shriver, and Du 2020).

People do take steps to protect themselves from corporate surveillance, but because of their ignorance these steps are sometimes ineffective. A survey by Lewis (2017) found that 89% of people had taken steps to protect their privacy, including using private browsing. However, two-thirds of private browsing users overestimated the protection it gave them (DuckDuckGo 2017). Approximately 40% of private browsing users thought that private browsing prevented advertisers from tracking their behaviour (DuckDuckGo 2017). Users reported feeling ‘misled’, ‘confused’ and ‘vulnerable’ when they discovered that this was not, in fact, the case (DuckDuckGo 2017).

Whilst people would like to protect their privacy, they often lack the knowledge and money required to do so. Lewis (2017) found that few people used the more privacy-protective measures (such as a VPN), which were less familiar to average people and costlier to adopt. This is consistent with the finding from Johnson, Shriver, and Du (2020) that those opting out of AdChoices tended to be more technologically sophisticated, and to live in older, wealthier areas. Acquisti, Brandimarte, and Loewenstein (2020) explain that each further degree of protection people seek demands far more knowledge and expense: ‘Attempting to hide most of one’s digital footprints from third-party monitoring is nearly incalculably demanding.’

Advertisers would be willing to pay you for the right to track you

Behavioural advertising is sold by ad exchanges, which operate real-time auctions. In the split second while your web page loads, advertisers are bidding for the right to advertise to you. This sales method lets advertisers assess, from previous behavioural data, who is worth advertising to and how much they are willing to pay.²
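
A simplified sketch of such an auction is below. Real exchanges run sealed-bid auctions (historically second-price, though many have since moved to first-price); the bidders, valuations and click probabilities here are invented for illustration.

```python
# A simplified real-time bidding auction. All bidders and numbers are
# invented; real exchanges differ in auction design and scale.

def second_price_auction(bids: dict) -> tuple:
    """The highest bidder wins but pays only the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# While the page loads, each advertiser values this impression from the
# viewer's behavioural profile: P(click) x value of a click, in dollars.
bids = {
    "shoe_retailer": 0.012 * 40.0,   # 0.48
    "travel_site":   0.004 * 80.0,   # 0.32
    "car_insurer":   0.002 * 120.0,  # 0.24
}

winner, price = second_price_auction(bids)
print(f"{winner} wins the impression and pays ${price:.3f}")
```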

Johnson, Shriver, and Du (2020) found that adverts served to people who had opted out through AdChoices were worth 52% less on the ad exchange than adverts served to those who had not. This is consistent with other research finding that people with no behavioural data are worth around two-thirds less to advertisers (Beales and Eisenach 2014; Goldfarb and Tucker 2011). It implies that the advertising industry would be willing to pay each person roughly US$8.58 per year (2015 prices) for permission to track them.
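
As a back-of-envelope reading of these two figures (my own arithmetic, not a calculation reproduced from the paper), the implied annual ad value of a fully tracked person follows directly:

```python
# Back-of-envelope arithmetic from the figures above -- my own reading,
# not a calculation taken from Johnson, Shriver, and Du (2020).
value_of_tracking = 8.58      # US$ per person per year (2015 prices)
behavioural_premium = 0.52    # opted-out impressions fetch 52% less

# If tracking adds 52% of a tracked person's ad value, the implied
# annual ad value of a fully tracked person is:
tracked_value = value_of_tracking / behavioural_premium
print(f"${tracked_value:.2f} per person per year")  # ~ $16.50
```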

Acquisti, Brandimarte, and Loewenstein (2020) point out how little research has been done into how people might benefit or lose out from targeted advertising. People might, for example, face lower search costs, be offered products of different quality, or be offered the same products at different prices. But once a corporation has identified groups of people willing to pay different amounts, it can charge those groups different prices. This is known as price discrimination, and it is already happening (Hannak et al. 2014; Mikians et al. 2013).

Perfect price discrimination, where a corporation charges each person their exact willingness to pay, is bad for people. Consumers no longer gain from trade: the gain they would otherwise have had is instead transferred to the corporation (Zuiderveen Borgesius and Poort 2017). When a consumer is willing to pay more than they are charged, they gain the difference from trading; the corporation likewise gains when the price paid exceeds the lowest price it would have accepted. Behavioural data and machine learning give corporations the means to predict people’s willingness to pay, and thereby to capture more of the gains from trade, at people’s expense. The toy example below makes the transfer concrete.
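
A worked toy example of this surplus transfer, with all numbers invented: four consumers, one good, and a seller who either posts a single price or charges each buyer their exact willingness to pay.

```python
# Toy illustration of the surplus transfer (all numbers invented).
willingness_to_pay = [30.0, 22.0, 15.0, 10.0]  # four consumers, in dollars
cost = 8.0                                      # seller's cost per unit

# Uniform price: everyone valuing the good at $15 or more buys at $15.
uniform_price = 15.0
buyers = [w for w in willingness_to_pay if w >= uniform_price]
consumer_surplus = sum(w - uniform_price for w in buyers)   # 22.0
producer_profit = len(buyers) * (uniform_price - cost)      # 21.0
print(consumer_surplus, producer_profit)

# Perfect price discrimination: each buyer is charged exactly their
# willingness to pay, so consumer surplus collapses to zero and the
# whole gain from trade goes to the corporation.
buyers = [w for w in willingness_to_pay if w >= cost]
prices = list(buyers)                            # price == valuation
consumer_surplus = sum(w - p for w, p in zip(buyers, prices))  # 0.0
producer_profit = sum(p - cost for p in prices)                # 45.0
print(consumer_surplus, producer_profit)
```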

Overcoming ignorance by sharing knowledge and working together

Here, I have considered how privacy-conscious people face a steep hurdle of ignorance in taking control of their behavioural data and asserting their right to privacy. Privacy-enhancing technologies already exist, such as AdChoices, the browser plug-ins Ghostery and Privacy Badger, and VPNs. However, too few people are aware of them or able to take advantage of them. This knowledge, and these opportunities, need to be shared widely across society. We need to work with average users, see things from their point of view, and help create tools and interventions that work for them.

Acquisti, Brandimarte, and Loewenstein (2020) conclude that current regulatory efforts are failing because privacy is not a ‘hot-button issue’. Whilst people are concerned about privacy when they’re asked directly, they’re unlikely to raise it in response to an open question about which issues matter to them. It’s time to share our knowledge of the value of data, of the surveillance capitalism that seeks to automate us, and of what we can do about it.

Bentham’s prison guards kept inmates separate and isolated because they feared collective action. It’s only by sharing knowledge and working together that we can hope to protect everyone’s right to privacy in the age of corporate surveillance.

References

  • Acquisti, Alessandro, Laura Brandimarte, and George Loewenstein. 2020. “Secrets and Likes: The Drive for Privacy and the Difficulty of Achieving It in the Digital Age.” Journal of Consumer Psychology 30 (4): 736–58. https://doi.org/10.1002/jcpy.1191.
  • Ākāśha, Blue. 2019. Plan of Jeremy Bentham’s Panopticon Prison. https://commons.wikimedia.org/wiki/File:Panopticon_prison.jpg.
  • Bartlett, Jamie. 2018. The People Vs Tech: How the Internet Is Killing Democracy (and How We Save It). Random House.
  • Beales, Howard, and Jeffrey A. Eisenach. 2014. “An Empirical Analysis of the Value of Information Sharing in the Market for Online Content.” SSRN Scholarly Paper. Rochester, NY. https://doi.org/10.2139/ssrn.2421405.
  • Bentham, Jeremy. 1843. The Works of Jeremy Bentham. W. Tait.
  • Bush, GW. 2009. “Economic Regulation. Chapter 9. White House Archives.”
  • Campbell, John Edward, and Matt Carlson. 2002. “Panopticon.com: Online Surveillance and the Commodification of Privacy.” Journal of Broadcasting & Electronic Media 46 (4): 586–606. https://doi.org/10.1207/s15506878jobem4604_6.
  • DuckDuckGo. 2017. “A Study on Private Browsing: Consumer Usage, Knowledge, and Thoughts.” Technical report. DuckDuckGo. https://duckduckgo.com/download/Private_Browsing.pdf.
  • Foucault, Michel. 1979. Discipline and Punish: The Birth of the Prison. Translated by Alan Sheridan. Book, Whole. Harmondsworth (etc.): Penguin. https://go.exlibris.link/C2BVds27.
  • Goldfarb, Avi, and Catherine E. Tucker. 2011. “Privacy Regulation and Online Advertising.” Management Science 57 (1): 57–71. https://doi.org/10.1287/mnsc.1100.1246.
  • Hannak, Aniko, Gary Soeller, David Lazer, Alan Mislove, and Christo Wilson. 2014. “Measuring Price Discrimination and Steering on E-Commerce Web Sites.” In Proceedings of the 2014 Conference on Internet Measurement Conference, 305–18. IMC ’14. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/2663716.2663744.
  • Johnson, Garrett A., Scott K. Shriver, and Shaoyin Du. 2020. “Consumer Privacy Choice in Online Advertising: Who Opts Out and at What Cost to Industry?” Marketing Science 39 (1): 33–51. https://doi.org/10.1287/mksc.2019.1198.
  • Kosta, Eleni, Christos Kalloniatis, Lilian Mitrou, and Evangelia Kavakli. 2011. “The ‘Panopticon’ of Search Engines: The Response of the European Data Protection Framework.” Requirements Engineering 16 (1): 47–54. https://doi.org/10.1007/s00766-010-0107-7.
  • Lee, Haein, Hyejin Park, and Jinwoo Kim. 2013. “Why Do People Share Their Context Information on Social Network Services? A Qualitative Study and an Experimental Study on Users’ Behavior of Balancing Perceived Benefit and Risk.” International Journal of Human-Computer Studies, Social Networks and Ubiquitous Interactions, 71 (9): 862–77. https://doi.org/10.1016/j.ijhcs.2013.01.005.
  • Lewis, Brionna. 2017. “Americans Say Data Privacy Is Important, but Few Take Steps to Protect Themselves.” Instamotor (blog). November 7, 2017. https://instamotor.com/blog/online-data-privacy-survey.
  • Martin, Kirsten. 2020. “Breaking the Privacy Paradox: The Value of Privacy and Associated Duty of Firms.” Business Ethics Quarterly 30 (1): 65–96. https://doi.org/10.1017/beq.2019.24.
  • McDonald, Aleecia, and Lorrie Faith Cranor. 2010. “Beliefs and Behaviors: Internet Users’ Understanding of Behavioral Advertising.” SSRN Scholarly Paper. Rochester, NY. https://papers.ssrn.com/abstract=1989092.
  • Mikians, Jakub, László Gyarmati, Vijay Erramilli, and Nikolaos Laoutaris. 2013. “Crowd-Assisted Search for Price Discrimination in e-Commerce: First Results.” In Proceedings of the Ninth ACM Conference on Emerging Networking Experiments and Technologies, 1–6. CoNEXT ’13. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/2535372.2535415.
  • Norberg, Patricia A., Daniel R. Horne, and David A. Horne. 2007. “The Privacy Paradox: Personal Information Disclosure Intentions Versus Behaviors.” Journal of Consumer Affairs 41 (1): 100–126. https://doi.org/10.1111/j.1745-6606.2006.00070.x.
  • PageFair, and Adobe. 2015. “The Cost of Ad Blocking: PageFair and Adobe 2015 Ad Blocking Report.” Technical report. PageFair and Adobe. https://blog.pagefair.com/2015/ad-blocking-report/.
  • Purcell, Kristen, Joanna Brenner, and Lee Rainie. 2012. “Search Engine Use 2012.” Technical report. Pew Research Center, Washington, DC. https://www.pewresearch.org/internet/2012/03/09/search-engine-use-2012/.
  • Telford, Taylor. 2019. “Google Failed to Notify Customers It Put Microphones in Nest Security Systems.” Washington Post, February 21, 2019. https://www.washingtonpost.com/business/2019/02/20/google-forgot-notify-customers-it-put-microphones-nest-security-systems/.
  • TRUSTe. 2016. “Solutions Brief: Ads Privacy Compliance.” Technical report. TRUSTe.
  • Tsai, Janice Y., Serge Egelman, Lorrie Cranor, and Alessandro Acquisti. 2010. “The Effect of Online Privacy Information on Purchasing Behavior: An Experimental Study.” Information Systems Research 22 (2): 254–68. https://doi.org/10.1287/isre.1090.0260.
  • Turow, Joseph, Jennifer King, Chris Jay Hoofnagle, Amy Bleakley, and Michael Hennessy. 2009. “Americans Reject Tailored Advertising and Three Activities That Enable It.” SSRN Scholarly Paper. Rochester, NY. https://doi.org/10.2139/ssrn.1478214.
  • Zuboff, Shoshana. 2015. “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.” Journal of Information Technology 30 (1): 75–89. https://doi.org/10.1057/jit.2015.5.
  • ———. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.
  • Zuiderveen Borgesius, Frederik, and Joost Poort. 2017. “Online Price Discrimination and EU Data Privacy Law.” Journal of Consumer Policy 40 (3): 347–66. https://doi.org/10.1007/s10603-017-9354-z.
  1. Adware is a form of malware that serves advertisements. ↩︎
  2. This is in addition to advertisers being able to contextually target you, based on present browsing behaviour (e.g., the current site you’re visiting, time of day, your rough location from your IP address). ↩︎