Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor
1%
Automated eligibility systems, ranking algorithms, and predictive risk models control which neighborhoods get policed, which families attain needed resources, who is short-listed for employment, and who is investigated for fraud.
2%
The transactions that LePage found suspicious represented only 0.03 percent of the 1.1 million cash withdrawals completed during the time period, and the data only showed where cash was withdrawn, not how it was spent. But the governor used the public data disclosure to suggest that TANF families were defrauding taxpayers by buying liquor, lottery tickets, and cigarettes with their benefits. Lawmakers and the professional middle-class public eagerly embraced the misleading tale he spun from a tenuous thread of data.
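[Taking the quoted figures at face value, the arithmetic underscores the point: 0.03 percent of 1.1 million is 0.0003 × 1,100,000 = 330, about 330 flagged withdrawals out of more than a million.]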
3%
The legislation was not intended to work; it was intended to heap stigma on social programs and reinforce the cultural narrative that those who access public assistance are criminal, lazy, spendthrift addicts.
3%
The skyrocketing economic insecurity of the last decade has been accompanied by an equally rapid rise of sophisticated data-based technologies in public services: predictive algorithms, risk models, and automated eligibility systems.
3%
Technologies of poverty management are not neutral. They are shaped by our nation’s fear of economic insecurity and hatred of the poor; they in turn shape the politics and experience of poverty.
4%
Complex integrated databases collect their most personal information, with few safeguards for privacy or data security, while offering almost nothing in return.
4%
And while the most sweeping digital decision-making tools are tested in what could be called “low rights environments” where there are few expectations of political accountability and transparency, systems first designed for the poor will eventually be used on everyone.
7%
As they had following the 1819 Panic, white economic elites responded to the growing militancy of poor and working-class people by attacking welfare. They asked: How can legitimate need be tested in a communal lodging house? How can one enforce work and provide free soup at the same time? In response, a new kind of social reform—the scientific charity movement—began an all-out attack on public poor relief. Scientific charity argued for more rigorous, data-driven methods to separate the deserving poor from the undeserving. In-depth investigation was a mechanism of moral classification and social …
14%
Governor Daniels famously applied a Yellow Pages test to government services. If a product or service is listed in the Yellow Pages, he insisted, the government shouldn’t provide it. So it was not surprising when, shortly after his election in 2004, Daniels began an aggressive campaign to privatize many of the state’s public services, including the Indiana Toll Road, the Bureau of Motor Vehicles, and the state’s public assistance programs.
16%
The automation’s impacts were devastating for poor and working-class Hoosiers. Between 2006 and 2008, the state of Indiana denied more than a million applications for food stamps, Medicaid, and cash benefits, a 54 percent increase compared to the three years prior to automation.
16%
No one worker had oversight of a case from beginning to end; when clients called the 1-800 number, they always spoke to a new worker. Because the Daniels administration saw relationships between caseworkers and clients as invitations to fraud, the system was designed to sever those links.
16%
The FSSA packed up all its existing records and moved them to a central storage facility in Indianapolis. These paper records were set aside in case the state needed them for appeal hearings, but were not scanned into the modernized system. All current recipients of TANF, food stamps/SNAP, and Medicaid were required to turn in all their supporting documentation again, no matter how long they had been receiving benefits.
20%
By summer 2009, there was a backlog of nearly 32,000 cases and 6,500 people were waiting for appeal hearings. According to its monthly management reports, the FSSA was reporting incredibly high food stamp eligibility error rates to the USDA. Between 2006 and 2008, the combined error rate more than tripled, from 5.9 percent to 19.4 percent. Most of that growth was in the negative error rate: 12.2 percent of those applying for food stamps were being incorrectly denied. The state’s long wait times for food stamp decisions attracted notice and threats of financial penalties from the USDA.
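[On the quoted figures: 19.4 ÷ 5.9 ≈ 3.3, so the combined rate did more than triple, and wrongful denials alone (the 12.2 percent negative error rate) account for well over half of the 19.4 percent combined figure.]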
21%
Public libraries were particularly hard-hit by the automation project. “We had lines of desperate people waiting for help,” said Muncie Public Library director Ginny Nilles, now retired. V-CAN partners received little to no compensation, training, or oversight to do what amounted to volunteer casework. Librarians trained community volunteers to help patrons submit welfare applications, but the library was quickly overwhelmed. The situation worsened when budget cuts required reducing hours and laying off staff.
23%
“Neither party deserves to win this case,” wrote Marion Superior Court Judge David Dreyer in his judgment in favor of IBM. “This story represents a ‘perfect storm’ of misguided government policy and overzealous corporate ambition. Overall, both parties are to blame and Indiana’s taxpayers are left as apparent losers.… There is nothing in this case, or the Court’s power,… [to] remedy the lost taxpayer money or personal suffering of needy Hoosiers.”
23%
The goals of the project were consistent throughout the automation experiment: maximize efficiency and eliminate fraud by shifting to a task-based system and severing caseworker-to-client bonds. They were clearly reflected in contract metrics: response time in the call centers was a key performance indicator; determination accuracy was not. Efficiency and savings were built into the contract; transparency and due process were not.
24%
The problem with the automation experiment was not that the IBM/ACS coalition failed to deliver, it was that the state and its private partners refused to anticipate or address the system’s human costs.
26%
But automated decision-making in our current welfare system acts a lot like older, atavistic forms of punishment and containment. It filters and diverts. It is a gatekeeper, not a facilitator.
26%
I arrived in Los Angeles in December 2015 to explore its coordinated entry system, which is intended to match the county’s most vulnerable unhoused people with appropriate available resources. Touted as the Match.com of homeless services, the coordinated entry approach has become wildly popular across the country in the last half decade.
29%
Home for Good, a collaboration between the United Way of Greater Los Angeles and the Los Angeles Area Chamber of Commerce, combined prioritization, housing first, and technology-forward approaches to launch a coordinated entry program in 2013.
29%
If survey-takers request the more complete privacy notice, they learn that their information will be shared with 168 different organizations, including city governments, rescue missions, nonprofit housing developers, health-care providers, hospitals, religious organizations, addiction recovery centers, the University of California, Los Angeles, and the Los Angeles Police Department (LAPD) “when required by law or for law enforcement purposes … to prevent a serious threat to health or safety.” The consent is valid for seven years.
36%
There is a long history of social services and the police collaborating to criminalize the poor in the United States. The most direct parallel is Operation Talon, a joint effort of the Office of Inspector General and local welfare offices that mined food stamp data to identify those with outstanding warrants, and then lured them to appointments regarding their benefits. When targeted recipients arrived at the welfare office, they were arrested.
39%
The proponents of the coordinated entry system, like many who seek to harness computational power for social justice, tend to find affinity with systems engineering approaches to social problems. These perspectives assume that complex controversies can be solved by getting correct information where it needs to go as efficiently as possible. In this model, political conflict arises primarily from a lack of information.
41%
Many struggles common among poor families are officially defined as child maltreatment, including not having enough food, having inadequate or unsafe housing, lacking medical care, or leaving a child alone while you work. Unhoused families face particularly difficult challenges holding on to their children, as the very condition of being homeless is judged neglectful.
44%
We all tend to defer to machines, which can seem more neutral, more objective. But it is troubling that managers believe that if the intake screener and the computer’s assessments conflict, the human should learn from the model.
44%
In the face of the seeming authority and objectivity of a computerized score, risk aversion, or an understandable excess of caution with children’s lives at stake, it is easy to see how a flashing red number might short-circuit an intake screener’s professional judgment. The AFST is supposed to support, not supplant, human decision-making in the call center. And yet, in practice, the algorithm seems to be training the intake workers.
45%
Data scientist Cathy O’Neil has written that “models are opinions embedded in mathematics.”8 Models are useful because they let us strip out extraneous information and focus only on what is most critical to the outcomes we are trying to predict. But they are also abstractions. Choices about what goes into them reflect the priorities and preoccupations of their creators. Human decision-making is reflected in three key components of the AFST: outcome variables, predictive variables, and validation data.
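To make those three components concrete, here is a minimal sketch of how a predictive risk model of this general kind gets assembled. Everything in it is a hypothetical stand-in: the file name, the column names, and the plain logistic model are invented for illustration and do not reproduce the AFST.

# A minimal sketch of the three human-authored components of a predictive
# risk model. All names are hypothetical illustrations, not the AFST.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("referrals.csv")  # hypothetical historical case records

# 1. Outcome variable: the designers' chosen proxy for "maltreatment"
#    (here, re-referral within two years), not maltreatment itself.
y = df["re_referred_within_2yrs"]

# 2. Predictive variables: the facts about a family the designers decide
#    should count.
X = df[["prior_public_benefit_use", "num_prior_referrals", "parent_age"]]

# 3. Validation data: a held-out slice of past decisions used to score the
#    model, so it can only ever be checked against history, biases included.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("accuracy against held-out history:", model.score(X_test, y_test))

Each numbered step is a human choice expressed in code, which is exactly O’Neil’s point.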
45%
Predictive modeling requires clear, unambiguous measures with lots of associated data in order to function accurately.
45%
stepwise probit regression,
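That bare phrase names the technique the book attributes to the model: a probit regression (a binary outcome mapped through the cumulative normal distribution) whose predictors are chosen stepwise by a fit criterion. Here is a minimal forward-stepwise sketch on synthetic data, assuming the statsmodels library and AIC as the criterion; it illustrates the method, not the actual AFST.

# Forward-stepwise probit regression: greedily add whichever candidate
# predictor most improves AIC, stopping when none helps. Synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 5))  # five candidate predictors
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=n) > 0).astype(int)

selected, remaining = [], list(range(X.shape[1]))
best_aic = np.inf
while remaining:
    # Fit a probit model with each remaining candidate added in turn.
    trials = {}
    for j in remaining:
        design = sm.add_constant(X[:, selected + [j]])
        trials[j] = sm.Probit(y, design).fit(disp=0).aic
    j_best = min(trials, key=trials.get)
    if trials[j_best] >= best_aic:
        break  # no candidate improves the criterion; stop adding
    best_aic = trials[j_best]
    selected.append(j_best)
    remaining.remove(j_best)

print("selected predictors:", selected, "AIC:", round(best_aic, 1))

Stepwise selection automates a chain of designer decisions (the candidate list, the criterion, the stopping rule), which is why calling the result “objective” obscures more than it reveals.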
49%
Families avoid CYF if they can afford to, because the agency mixes two distinct and contradictory roles: provider of family support and investigator of maltreatment. Accepting resources means accepting the agency’s authority to remove your children. This is an invasive, terrifying trade-off that parents with other options are not likely to choose. Poor and working-class families feel forced to trade their rights to privacy, protection from unreasonable searches, and due process for a chance at the resources and services they need to keep their children safe.
53%
The automated discretion of predictive models is the discretion of the few. Human discretion is the discretion of the many.
55%
According to Mark Rank’s groundbreaking life-course research, 51 percent of Americans will spend at least a year below the poverty line between the ages of 20 and 65. Two-thirds of them will access a means-tested public benefit: TANF, General Assistance, Supplemental Security Income, Housing Assistance, SNAP, or Medicaid.1 And yet we pretend that poverty is a puzzling aberration that happens only to a tiny minority of pathological people.
55%
“cultural denial.”
56%
The poorhouse preceded the Constitution as an American institution by 125 years. It is mere fantasy to think that a statistical model or a ranking algorithm will magically upend culture, policies, and institutions built over centuries.
57%
This social sorting works out well for those at the top and the bottom of the rankings. But if, like Gary Boatwright, the cost of your survival exceeds potential taxpayer savings, your life is de-prioritized.
59%
There would be a widespread and immediate outcry that the policy is unfair, dangerous, and probably illegal. Users would rush to find other services for email, appointments, document storage, video conferencing, and web search.
Nicolette
I take issue with this example because no one does this even now. People are too entangled with these services and have too little agency over their data; we see breaches and negligence every day, and none of this outrage occurs.
60%
They obliquely accuse their subordinates, often working-class people, of being the primary source of racist and classist outcomes in their organizations. Then, managers and technocrats hire economists and engineers to build more “objective” systems to root out the human foibles of their economic inferiors.
61%
The digital poorhouse also limits equity as equal value by freezing its targets in time, portraying them as aggregates of their most difficult choices. Equity requires the ability to develop and evolve.
65%
If you lack even one of the economic rights promised by the 1948 Universal Declaration of Human Rights—including health care, housing, a living-wage job, and quality education—PPEHRC counts you among the poor. The redefinition is tactical, an attempt to help poor and working-class people see themselves reflected in each other’s experiences.
66%
Does the tool increase the self-determination and agency of the poor? Would the tool be tolerated if it was targeted at non-poor people?
68%
Our ethical evolution still lags behind our technological revolutions.