By Armelle Skatulski
23 September 2020

Today, it is well established that worker and consumer data are sold as goods[1] in digital value chains controlled by data monopolies and used for the purpose of predictive analytics and automated decisions. Critically, algorithms power the collection and processing of such data while needing to feed on large datasets to train themselves, in a kind of (in)virtuous cycle. This means that privacy is increasingly at risk, sometimes becoming locked into the mechanisms of value capture.[2] While foregrounding that the right to privacy is not an economic good,[3] data regulation debates are increasingly concerned with the risks associated with the use of algorithms in automated decision-making (ADM), such as is applied in the ranking and selection of job seekers through profiling.[4] The EU 2018 General Data Protection Regulation (GDPR) is ground-breaking[5] in addressing both the processing of personal data and ADM (including profiling).[6] As the UK exits the EU, there remains a question as to whether the rights and safeguards inscribed in the GDPR will be preserved. What forms of digital governance, individual and collective, should be retained and advocated for?
The Keeling Schedules
The recent A-level grading algorithm scandal is an alarming example of the discriminatory potential of ADM, while at the same time exemplifying the potential safeguarding power of data regulation.[7] When the GDPR became law in all EU member states in May 2018, the UK Data Protection Act (DPA) was amended to reflect this.[8] The GDPR remains in force in the UK until at least 31 December 2020, the end of the UK’s transition period for exiting the EU. The UK government has published its ‘Keeling schedules’ indicating planned amendments to the DPA and a ‘UK GDPR’ under new regulations (subject to parliamentary approval).[9] According to the Information Commissioner’s Office (ICO), “key principles, rights and obligations will remain unchanged,” while there will be implications for the transfer of personal data between the UK and the European Economic Area (EEA).[10] However, amendments to the DPA will require attentive scrutiny from policy makers and advocacy and research organisations in order to ensure that the current data rights of UK citizens and residents, as enshrined in the GDPR, are preserved.
The need for data regulation is clear
In a data-driven economy,[11] almost every aspect of employment is affected by the collection and processing of data, from recruitment to workplace monitoring to work exiting.[12] Techniques used to profile consumers have infiltrated the world of work to track, profile, and score employees.[13] Beyond digital surveillance,[14] the datafication of employment has fuelled excessive capital gains made from the rent of worker data,[15] especially behavioural insights used for statistical predictions.[16] In a ‘scored society’,[17] algorithms have become powerful tools. The convergence of Big Data analytics with machine learning (ML) has ushered in an age of ‘algorithmic governance.’[18] Besides tracking productivity, algorithms can now interview, filter and ‘select’ job seekers, predict attrition rates and even ‘signal’ possible dismissal.[19]
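To make this concrete, the sketch below illustrates the kind of ‘if then (apply Y)’ rule that Todolí-Signes speculates could automate dismissal (see note 19), modelled on the platform rating threshold cited in note 37. It is a minimal, hypothetical sketch: the Worker class, field names and sample accounts are invented for illustration, and only the 4.6 cut-off comes from the source.

```python
from dataclasses import dataclass

@dataclass
class Worker:
    worker_id: str
    average_rating: float  # mean customer rating, out of 5

# Threshold reported for Lyft in note 37; everything else here is invented.
DEACTIVATION_THRESHOLD = 4.6

def review_account(worker: Worker) -> str:
    """A hypothetical 'if then (apply Y)' rule with no human in the loop."""
    if worker.average_rating < DEACTIVATION_THRESHOLD:
        return "deactivate"  # the worker automatically loses access to work
    return "retain"

print(review_account(Worker("w-101", average_rating=4.5)))  # -> deactivate
print(review_account(Worker("w-102", average_rating=4.8)))  # -> retain
```

Trivial as the rule looks, a decision of this sort is fully automated and produces exactly the kind of significant effect, loss of access to work, that Article 22 of the GDPR is concerned with.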
While individuals are now given the opportunity to choose how their data is handled, for instance online via cookie preferences, they may not be informed about how private companies will create digital profiles[20] from their data footprints. Profiles can then be used to make future inferences about them (such as political affiliation, gender, race) or to make ‘decisions’ about access to services in both the private and public sectors (such as bank loans, insurance policies, or state services).[21]
Algorithm-led processes and decisions potentially expose job seekers, workers and consumers to discriminatory effects. These are often caused by historical data (i.e. the data on which algorithms are trained) reflecting existing societal bias,[22] or by algorithms inferring data through correlations (such as deducing race from postal codes).[23] Hence, even if it is prohibited to collect information about trade union membership, religion, gender, sexual orientation or disability, these can be inferred through pattern recognition.[24]
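As a minimal illustration of how such inference arises, the Python sketch below ‘trains’ a toy hiring model on fictional historical records that contain only a postcode. All data, postcodes and names are invented; the point, following note 23, is that a feature correlated with a protected attribute lets a model reproduce historical disparity even though the attribute itself is never collected.

```python
from collections import defaultdict

# Fictional historical hiring records: (postcode, hired). The protected
# attribute is never collected, but in this toy town postcode "N1"
# happens to correlate with one demographic group and "S2" with another.
history = [
    ("N1", True), ("N1", True), ("N1", True), ("N1", False),
    ("S2", False), ("S2", False), ("S2", True), ("S2", False),
]

# 'Training' reduced to its essence: learn the historical hire rate
# for each postcode.
totals, hires = defaultdict(int), defaultdict(int)
for postcode, hired in history:
    totals[postcode] += 1
    hires[postcode] += hired  # True counts as 1, False as 0

def score(postcode: str) -> float:
    """Predicted probability of hiring, learned purely from postcode."""
    return hires[postcode] / totals[postcode]

# The model never saw race, gender or religion, yet it reproduces the
# historical disparity between the two areas.
print(score("N1"))  # 0.75
print(score("S2"))  # 0.25
```

Real systems use far more elaborate models, but the mechanism is the same: the proxy feature carries the bias of the historical data into every new ‘decision.’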
GDPR: untapped potential?
In this environment of extreme data extraction and – arguably – overdependency, how can existing GDPR legislation help? There is a common assumption that the GDPR is relevant predominantly when interacting with websites’ privacy notices or mailing list subscriptions; but its real value lies in the protections it can offer individuals from the influence of data-driven decision-making. The latter has been identified as the most pressing issue at the heart of AI-based automation’s reach into society.[25] In the EU, the range and scope of applications of automated decisions are such that they affect almost all fields of life, from healthcare services allocation and welfare fraud detection to recruitment, predictive policing and border control.[26]
The GDPR offers protections from automated decision-making in the form of what can be referred to as a ‘right to object’ or ‘right to contest’ (Article 22) and a ‘right to explanation’ (Articles 13-15).[27] Article 22(1) of the GDPR establishes the right for a person ‘not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.’[28] Importantly, the Article 29 Working Party (WP29)[29] insisted that trivial human intervention will not count as significant.[30] Critically, as Wachter et al. point out, data subjects are granted ‘rights to express their views, contest decisions, and obtain human intervention on the part of data collectors.’[31] GDPR allows us to assess and challenge ADM and profiling based on inferences.[32]
GDPR and the employment relationship
The datafication of employment is inscribed in a value chain in which the data accumulated about workers, and the behavioural, predictive insights produced from it, are owned and sold by data collectors. This sets up an ever-increasing ‘informational asymmetry’ in the employment relation[33] and establishes inequitable terms for data ownership, analysis, and use that have become the ‘norm’: the concentration of data control and monetisation in the hands of data collectors, and digital dispossession for others (workers and users).
Digital inequity is now coupled with automated processes that leave workers more defenceless, as accountability for such processes can be difficult to ascertain. This is because algorithms are privately developed and ‘black-boxed,’ i.e. shrouded in secrecy,[34] while their operations may be too opaque for non-experts to scrutinise. Of concern, for instance, is the practice of profiling job seekers, which leads to automated scoring and filtering.[35] A candidate may be denied an interview based on bias inscribed in the data.[36] Scoring may also be used to exclude workers from access to work on digital platforms.[37] While Article 22(1) does not explicitly refer to the employment relation, it can be deemed applicable to it,[38] where personal data processing is concerned.
Notice-and-consent mechanisms, the employment relationship and their discontents
Interestingly, there are exceptions to the prohibitive power of ‘the right to object’ in the GDPR legislation: the need to fulfil a contract; authorisation by EU or member state law; or consent given by the data subject (and only if the ADM is not based on the processing of protected data).[39] In such cases, certain safeguards for data subjects must apply, including ‘the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.’[40] While consent may seem empowering for consumers/users, the legal value of consent in the context of employment has been questioned. In its guidance on consent, the WP29 points out that, given the power imbalance of the working relation, ‘the granting of consent’ by the employee should not be deemed valid.[41] As Ajunwa et al. observe, notice-and-consent mechanisms ‘merely serve as a sanitizing seal of approval for employer surveillance’[42] and consequently for asymmetrical data collection. In this way, just as it tries to provide protections suitable to the datafication of work, GDPR legislation reveals something intractable about the employment relationship itself: consent in this context is tainted by the coercion that comes with the ‘private government’ and labour market imperative that is waged work.[43] There is, we might say, no true autonomy of will in the act of consent at work.[44]
Consent – and its troubled relationship with modern work – may therefore be one of the most critical and contentious areas to reform, through, for instance, an amendment to the (UK) GDPR that would relate specifically to the employment relation. What new capacities might this unlock for employees to push back against the increasingly granular surveillance that many workplaces are adopting? A 2018 TUC survey of over two thousand workers found that 70% believed surveillance would become more common in the future, and two-thirds (66%) believed that, unless properly regulated, such technology could be used to discriminate unfairly. Only a minority of workers (38%) said that they would be able to challenge workplace monitoring. If these fears are borne out by reality, then appropriate mechanisms of resistance must be formulated.
Better awareness of the safeguards and rights afforded by the GDPR must therefore be widely promoted among the working and wider population in order to counter the opacity of firm-level data gathering. Workers (and consumers) need to be empowered by the knowledge that such legislation grants them the right to know how their personal data is used and to challenge its processing for the purpose of profiling or automated decisions. GDPR – as limited as it may be – grants workers valuable rights and protections in the face of the datafication of employment, while their weak bargaining position when yielding privacy by consent requires additional safeguards. In the context of a UK exit from the EU, policy makers and advocacy and research organisations must ensure existing rights are protected and advanced.
References
[1] See Datarade, a directory showcasing the expanse and variety of data marketplaces: https://about.datarade.ai/data-marketplaces
[2] UNCTAD, “Digital Economy Report 2019 – Value Creation and Capture: Implications for Developing Countries,” 4 Sept. 2019, Geneva: United Nations Conference on Trade and Development, p. 29.
[3] Ajunwa et al. base their analysis of data regulation on “the normative viewpoint that the right to privacy is not an economic good that may be exchanged for the opportunity for employment.” In Ajunwa I., Crawford K. and Schultz J., “Limitless worker surveillance,” California Law Review 105(3), 2017, p. 102.
[4] Algorithmic accountability has been identified as an important emergent field of research by AlgorithmWatch in their report “Automating Society – Taking Stock of Automated Decision-Making in the EU,” 2019. Accessed at: <https://algorithmwatch.org/en/project/automating-society/>
[5] Brkan, M., & Bonnet, G., “Legal and Technical Feasibility of the GDPR’s Quest for Explanation of Algorithmic Decisions: Of Black Boxes, White Boxes and Fata Morganas,” European Journal of Risk Regulation, 11(1), 2020, p.19. doi: 10.1017/err.2020.10.
[6] Art. 5 GDPR, “Principles relating to processing of personal data,” <https://gdpr.eu/article-5-how-to-process-personal-data/>; and Art. 22 GDPR, “Automated individual decision-making, including profiling,” <https://gdpr.eu/article-22-automated-individual-decision-making/>.
[7] According to the legal firm Foxglove, who had launched legal proceedings against the UK government over their use of the algorithm, “Ofqual’s algorithmic grading exceeded its statutory powers, violated key principles of data protection law” and “Ofqual’s process was unfair and unlawful.” Foxglove, “Press Release: U-turn on A-level algorithm,” 17 August 2020. Accessed at: <https://www.foxglove.org.uk/news/press-release-u-turn-on-a-level-algorithm>
[8] Data Protection Act 2018, UK Parliament, <https://services.parliament.uk/bills/2017-19/dataprotection.html>
[9] For a draft of the planned changes to the Data Protection Act 2018 by the Data Protection, Privacy and Electronic Communications (Amendments etc.) (EU Exit) Regulations 2019 (when they come into force), see <https://www.gov.uk/government/publications/data-protection-law-eu-exit>
[10] Information Commissioner’s Office, “Data Protection at the end of the transition period.” Accessed at: <https://ico.org.uk/for-organisations/data-protection-at-the-end-of-the-transition-period/data-protection-at-the-end-of-the-transition-period/>
[11] For an overview of the concept of ‘digital economy’ and of the latest trends in digital value capture, see UNCTAD. Digital Economy Report 2019 – Value Creation and Capture: Implications for Developing Countries. 2019. Geneva: United Nations Conference on Trade and Development, p. 4.
[12] Work exiting is of particular concern for platform workers, who despite generating a lot of data, do not have access to it as it is owned by platforms. This prevents them from demonstrating their job performance to future employers. Adler-Bell, Sam and Miller, Michelle, “The Datafication of Employment: How Surveillance and Capitalism Are Shaping Workers’ Futures without Their Knowledge,” The Century Foundation, 19.12.2018, p. 15. Accessed at: <https://tcf.org/content/report/datafication-employment-surveillance-capitalism-shaping-workers-futures-without-knowledge/>
[13] This is referred to as the market of “people analytics,” ibid.
[14] For an overview of surveillance in the workplace see Ajunwa I., Crawford K. and Schultz J. (2017).
[15] The cliché of a Data Gold Rush has been frequently used to describe the type of value capture which has benefited Silicon Valley data empires. See for instance: Brad Peters, “The Big Data Gold Rush,” Forbes, 21.06.2012. Accessed at: <https://www.forbes.com/sites/bradpeters/2012/06/21/the-big-data-gold-rush/#2427d0dab247>; but also more recently in reference to hedge funds’ investments more broadly, Lindsay Fortado, Robin Wigglesworth and Kara Scannell, “Hedge funds see a gold rush in data mining,” Financial Times, 28.07.2017. Accessed at: <https://www.ft.com/content/d86ad460-8802-11e7-bf50-e1c239b45787>
[16] See UNCTAD. 2019, p. 8; On ‘behavioural surplus’ and futures markets see Zuboff, S. The Age of Surveillance Capitalism, Profile Books, 2019.
[17] Citron, Danielle Keats and Pasquale, Frank A., “The Scored Society: Due Process for Automated Predictions,” Washington Law Review, Vol. 89, 2014, p. 1-33, University of Maryland Legal Studies Research Paper No. 2014-8. Accessed at: <https://ssrn.com/abstract=2376209>
[18] The concept first appears in the analysis of outsourcing in Aneesh, A., “Global Labor: Algocratic Modes of Organization,” Sociological Theory, 27(4), 2009, pp. 347–370. https://doi.org/10.1111/j.1467-9558.2009.01352. Rob Kitchin refers to ‘algorithmic management’ in Kitchin, R., “Thinking critically about and researching algorithms,” Information, Communication & Society, 20:1, 2017, p. 15, https://doi.org/10.1080/1369118X.2016.1154087; in the context of work see: Katherine C. Kellogg, Melissa A. Valentine, Angèle Christin, “Algorithms at Work: The New Contested Terrain of Control,” Academy of Management Annals 14:1, 2020, pp. 366-410. <https://journals.aom.org/doi/abs/10.5465/annals.2018.0174>
[19] Todolí-Signes has speculated on the possible application of ‘if then (apply Y)’ algorithmic rules that could implement automated forms of dismissal. Todolí-Signes, A., “Algorithms, artificial intelligence and automated decisions concerning workers and the risks of discrimination: the necessary collective governance of data protection,” Transfer: European Review of Labour and Research, 25(4), 2019, p. 478. https://doi.org/10.1177/1024258919876416.
[20] Ferraris, Valeria, Bosco, Francesca, Cafiero, G., D’Angelo, Elena and Suloyeva, Y., “Defining Profiling,” December 11, 2013. Accessed at: <http://dx.doi.org/10.2139/ssrn.2366564>
[21] McCann, Duncan and Hall, Miranda, “Digital Self-Control: Algorithms, accountability, and our digital selves,” NEF, 25.03.2019, pp. 9-10. Accessed at: <https://neweconomics.org/2019/03/digital-self-control>.
[22] Dastin, J. “Amazon scraps secret AI recruiting tool that showed bias against women,” Reuters, 10 October 2018. Accessed at: <https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G>
[23] Hardt M (2014) How big data is unfair. Medium 2014. Accessed at: <https://medium.com/@mrtz/how-big-data-is-unfair-9aa544d739de>
[24] Crawford K. and Schultz J., “Big data and due process: Towards a framework to redress predictive privacy harms,” Boston College Law Review 55(1), 2014: 93–128.
[25] AlgorithmWatch, “Automating Society – Taking Stock of Automated Decision-Making in the EU,” January 2019, p. 19. Accessed at: <https://algorithmwatch.org/en/automating-society/>
[26] Ibid, p. 8.
[27] For a detailed analysis of a right to explanation, see Selbst A.D. and Powles J., “Meaningful information and the right to explanation,” International Data Privacy Law 7(4), 2017: 233–242.
[28] Art. 22 GDPR, “Automated individual decision-making, including profiling,” <https://gdpr.eu/article-22-automated-individual-decision-making/>
[29] The Article 29 Working Party (Art. 29 WP) was the independent European working party that dealt with issues relating to the protection of privacy and personal data until the application of the GDPR. See: <https://edpb.europa.eu/our-work-tools/article-29-working-party_en>
[30] In Selbst AD and Powles J., 2017, p. 235.
[31] Wachter, Sandra and Mittelstadt, Brent, “A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI,” (October 5, 2018). Columbia Business Law Review, 2019(2): 78. Accessed at: <https://ssrn.com/abstract=3248829>
[32] Many have debated the scope of the GDPR, as it applies only to ADM based on the processing of personal data and does not protect us, for instance, in the case of predictive policing ADM, which can be applied without processing personal data but still affects communities and individuals in serious ways (AlgorithmWatch, 2019, p. 17).
[33] The Century Foundation, 2018, p.2.
[34] Pasquale, F., The Black Box Society: The Secret Algorithms That Control Money and Information, Cambridge, Massachusetts; London, England: Harvard University Press, 2015. Accessed at: <www.jstor.org/stable/j.ctt13x0hch>
[35] On e-recruiting practices without human intervention, see ICO’s guidance notes, <https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/automated-decision-making-and-profiling/what-does-the-gdpr-say-about-automated-decision-making-and-profiling/>
[36] Mann, Gideon and O’Neil, Cathy, “Hiring Algorithms Are Not Neutral,” Harvard Business Review, 09.12.2016. Accessed at: <https://hbr.org/2016/12/hiring-algorithms-are-not-neutral?registration=success>
[37] ‘For example, the transport company Lyft has a rule whereby if a driver has an average rating below 4.6 (out of 5) he or she is automatically deactivated.’ Todolí-Signes, A., 2019, p. 468.
[38] Todolí-Signes, 2019, p. 473.
[39] Article 22(2) and Art. 9 of the GDPR, <https://gdpr.eu/article-22-automated-individual-decision-making/>.
[40] Selbst AD and Powles J., 2017, pp. 234-35.
[41] WP29 (Article 29 Data Protection Working Party) (2018) Guidelines on transparency under Regulation 2016/679. Revised and Adopted 11 April 2018. Brussels: European Commission, in Todolí-Signes, A., 2019, p. 473.
[42] Ajunwa I, Crawford K and Schultz J., 2017, p.774.
[43] Anderson, E. Private Government. Princeton: Princeton University Press, 2017.
[44] See Todolí-Signes, A., 2019, p. 473 and Ajunwa et al., 2017, p. 141.

Armelle Skatulski is a doctoral researcher at the Royal College of Art. Her research considers the (work) accident as an economic problem from a biopolitical perspective and traces procedures that disqualify the abnormality of death in the context of work. As part of her partnership with Autonomy, Armelle will research processes leading to the expansion and normalisation of work casualisation and precarity in the contemporary workplace.