Law, Technology and Humans | Book Review
Virginia Eubanks (2018) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: Picador, St Martin’s Press
Faith Gordon*
Law, Technology and Humans Book Review Editor
ISBN: 9781250215789
It is difficult to imagine a future in which big data and artificial intelligence will not be prominent. Since the dawn of the digital age, decision-making processes in employment, finance, politics, health and human services have ‘undergone revolutionary change’.[1] Decisions about offering opportunities such as employment, insurance and government services were previously made by humans; today, however, ‘much of that decision-making power’ over outcomes that significantly shape lives has been handed to ‘sophisticated machines’.[2] As decision-makers become more dependent on big data analytics, people’s privacy and freedom become more threatened. This dependency also amplifies the ‘digital divide’[3] or, as Eubanks calls it, ‘the digital poorhouse’, yet little political debate or discussion is taking place about the negative consequences.[4]
There is a growing body of research and literature on automated decision-making, algorithmic accountability and the processes evolving as new forms of ‘digital discrimination’.[5] The inequality in these decision-making processes and the discriminatory outcomes are the core issues explored in Virginia Eubanks’s groundbreaking new book: Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Eubanks builds on her previous work on technology and social justice by exploring the challenges and consequences of decision-makers giving machines the power to make decisions about human needs, public benefits and state interventions in the United States (US). The author argues that the same digital tracking that operates in ‘low rights environments’ in which ‘poor and working class’ individuals have ‘few expectations of political accountability and transparency’ is ‘used on everyone’ (p. 12).
Eubanks takes the reader on a journey, reflected in the structure of the book and in its pages of firsthand accounts of the realities of the ‘digital poorhouse’.[6] The book is structured neatly into an introduction, five chapters and a conclusion, and draws on both documentary analysis and qualitative interviews. The introduction outlines the author’s first direct contact with ‘organizations working closely with families most directly impacted by the systems’ she explores.[7] Eubanks then sets out the context of the issue: ‘complex integrated databases collect ... personal information’, predictive models and algorithms ‘tag’ people as ‘risky’ and ‘problematic’, and law enforcement and other agencies can then conduct surveillance.[8] She sensitively documents the lived experiences of those who ‘are targeted by new tools of digital poverty management and face life-threatening consequences as a result’.[9] The data analysis indicates that ‘these new systems have the most destructive and deadly effects in low-income communities of color’ and ‘impact poor and working-class people across the color line’.[10] Eubanks asserts that such technological innovations ‘hide poverty from the professional middle-class public and give the nation the ethical distance it needs to make inhuman choices’.[11]
Such ‘inhuman choices’ deeply affect the lives of the most marginalised individuals, with many poignant examples featured across Chapters One to Five of the book, particularly from Indiana, California and Pennsylvania. Chapter One describes the introduction of a computerised registry of ‘every welfare, Medicaid, and food stamp recipient in the state’ and how the Temporary Assistance for Needy Families program ‘put into effect a wide array of sanctions to penalize noncompliance’.[12] The chapter details how these new measures are ‘an expansion and continuation of moralistic and punitive poverty management strategies’ that have operated in the US since the 1820s. Eubanks convincingly argues that this use of technology, combined with restrictive ‘new rules’, has effectively ‘reversed the gains of the welfare rights movement’.[13] While ‘the digital poorhouse’ is rhetorically ‘framed as a way to rationalize and streamline benefits’, the author argues that in reality ‘the real goal’ has always been ‘to profile, police, and punish the poor’.[14]
Chapter Two introduces Sophie Stipes, the person to whom the book is dedicated. Sophie was born in 2002 and shortly afterwards was diagnosed with a range of disabling conditions, which ‘without Medicaid ... would have been financially overwhelming’.[15] At age six, Sophie received a letter stating that she would no longer receive Medicaid as she had ‘failed to cooperate’ in establishing her eligibility for the program.[16] The delay in receiving the letter left Sophie’s family with ‘three days left’ to contact the agency, though they had not been informed about the necessary paperwork and were not given time to address or challenge the decision.[17] Interviews with advocates describe Sophie’s case as ‘particularly appalling’.[18] The traditional model of face-to-face service was replaced by a digitally automated system with no designated human caseworker, leaving individuals like Sophie vulnerable to further isolation and hardship. Eubanks details how media pressure and a meeting with Lawren Mills, Governor Daniels’ policy director for human services, led to the return of Sophie’s Medicaid.[19] However, as the chapters that follow illustrate, Sophie’s case and the Stipes family’s experience are not isolated ones, because these human-designed systems are established with the goal of reducing benefit claims wherever possible.
Chapters Three to Five draw on several people’s lived experiences to demonstrate that traditional judgemental and stereotypical assumptions about the working class and those experiencing poverty remain key drivers in the human design of algorithmic decision-making aids. For example, the Allegheny Family Screening Tool predicts which children may need the intervention of social services agencies. The system assesses the risk of a child being abused or neglected, scoring them on a scale of 1 to 20. The ‘predictive’ risk assessment combines information on schooling, criminal justice, health, family services and other data from children’s lives and aggregates this into a multi-agency database.[20] The score is then used when deciding whether to intervene. Eubanks’s analysis exposes how the tool draws on data about past events, including the childhood of a parent or grandparent, to inform decisions about future surveillance of, and intervention in, the lives of families.[21] In addition to punishing contemporary families for circumstances experienced intergenerationally, the author examines the alarming levels of racially biased data within the system, reflective of traditionally racist attitudes towards African Americans and working-class individuals.[22] Contrary to the common assumption that algorithmic decision-making aids produce fairer outcomes, the main body of the book demonstrates how systems designed, built and programmed by humans often entrench permanent and fixed notions based on traditional biases.
Eubanks exposes how the traditional bias and endemic targeting of those of a particular class and/or race unfolds in new algorithmic decision-making aids; however, a minor aspect that requires further attention is the use of technology for social good and social change (e.g., the work of McNutt).[23] While the author references social movements and campaigns, such as Occupy Wall Street and Black Lives Matter,[24] more could have been drawn out in the analysis, particularly reflecting on the use of technology for social good and its potential utilisation as a tool of resistance.
Eubanks’s conclusion calls for a ‘dismantling’ of ‘the digital poorhouse’ and acknowledges that altering ‘cultural understandings and political responses to poverty will be difficult, abiding work’; however, technological development surges on and will not wait for our new stories and visions to emerge.[25] Eubanks’s key argument is that ‘we need to develop basic technological design principles to minimize harm’.[26] The author poses a series of poignant questions, as well as a first draft of a Hippocratic Oath for data scientists, systems engineers, hackers and administrative officials that centres around the ‘non-harm’ principle.[27] Eubanks asserts that ‘our ethical evolution still lags behind our technological revolutions’ and the ‘digital revolution has warped to fit the shape’ of what is still an ‘inequitable world’ because society has failed to address the ‘crucial challenges’ of ‘dismantling racism and ending poverty’.[28]
Automating Inequality makes a timely and significant contribution to the social justice field. The book exposes the often hidden, yet extremely damaging social implications of technological developments in the US. The book has clear international appeal and the journalistic writing style ensures it is accessible to a diverse readership. Automating Inequality will be of interest to academics, practitioners, policymakers and students in fields such as law, socio-legal studies, social policy, political science, data science, digital society and sociology, as well as anyone with an interest in social justice, equality and the need for change.
The case studies in this book provide policymakers and those working in the fields of technology and justice with timely reminders of the social effects of these technological developments. For scholars of human rights, technology and social policy, and advocates of social justice, the takeaway message is that in this ever-evolving world of technology, the experiences of individuals such as Sophie are stark reminders that when systems ‘prioritize efficiency over empathy, tasks over families’, they end up ‘degrad[ing] the extraordinary value of ... emotional connection and commitments to each other’.[29] Eubanks argues that such systems ‘are not designed to provide care or secure social justice’; rather, they are ‘built to manage the symptoms of austerity’.[30] While technological developments are typically framed as innovative, Eubanks convincingly argues in the afterword that unless we design our digital, political and legal systems ‘from an unshakeable belief that everyone deserves ... basic human rights’, we are inevitably ‘doomed to repeat the oppressive patterns of the past’.[31] Automating Inequality presents a clear call to action, and the previously marginalised voices that run throughout this book deserve to be heard, listened to and acted upon.
Bibliography
Andrejevic, Mark. “The Big Data Divide.” International Journal of Communication 8, no. 17 (2014): 1673-1689.
Eubanks, Virginia. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: Picador, St Martin’s Press, 2018.
McNutt, John G. Technology, Activism, and Social Justice in a Digital Age. Oxford: Oxford University Press, 2018.
*Dr Faith Gordon, Lecturer in Criminology at Monash University, Australia
[1] Eubanks, Automating Inequality, 3.
[2] Eubanks, Automating Inequality, 3.
[3] Andrejevic, “The Big Data Divide.”
[4] Eubanks, Automating Inequality, 12.
[5] Eubanks, Automating Inequality, 231.
[6] Eubanks, Automating Inequality, 12.
[7] Eubanks, Automating Inequality, 231.
[8] Eubanks, Automating Inequality, 11.
[9] Eubanks, Automating Inequality, 11.
[10] Eubanks, Automating Inequality, 12.
[11] Eubanks, Automating Inequality, 13.
[12] Eubanks, Automating Inequality, 36.
[13] Eubanks, Automating Inequality, 36.
[14] Eubanks, Automating Inequality, 38.
[15] Eubanks, Automating Inequality, 41.
[16] Eubanks, Automating Inequality, 42.
[17] Eubanks, Automating Inequality, 43.
[18] Eubanks, Automating Inequality, 45.
[19] Eubanks, Automating Inequality, 45.
[20] Eubanks, Automating Inequality, 127.
[21] Eubanks, Automating Inequality, 152.
[22] Eubanks, Automating Inequality, 153.
[23] McNutt, Technology, Activism, and Social Justice in a Digital Age.
[24] Eubanks, Automating Inequality, 214-215.
[25] Eubanks, Automating Inequality, 211.
[26] Eubanks, Automating Inequality, 211.
[27] Eubanks, Automating Inequality, 212-213.
[28] Eubanks, Automating Inequality, 217.
[29] Eubanks, Automating Inequality, 224.
[30] Eubanks, Automating Inequality, 224-225.
[31] Eubanks, Automating Inequality, 225.
URL: http://www.austlii.edu.au/au/journals/LawTechHum/2019/10.html