Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor
2%
But that’s the thing about being targeted by an algorithm: you get a sense of a pattern in the digital noise, an electronic eye turned toward you, but you can’t put your finger on exactly what’s amiss. There is no requirement that you be notified when you are red-flagged. There is no sunshine law that compels companies to release the inner details of their digital fraud detection systems. With the notable exception of credit reporting, we have remarkably limited access to the equations, algorithms, and models that shape our life chances.
2%
Those groups seen as undeserving are singled out for punitive public policy and more intense surveillance, and the cycle begins again. It is a kind of collective red-flagging, a feedback loop of injustice.
4%
Across the country, poor and working-class people are targeted by new tools of digital poverty management and face life-threatening consequences as a result. Automated eligibility systems discourage them from claiming public resources that they need to survive and thrive. Complex integrated databases collect their most personal information, with few safeguards for privacy or data security, while offering almost nothing in return. Predictive models and algorithms tag them as risky investments and problematic parents. Vast complexes of social service, law enforcement, and neighborhood ...more
4%
Like earlier technological innovations in poverty management, digital tracking and automated decision-making hide poverty from the professional middle-class public and give the nation the ethical distance it needs to make inhuman choices: who gets food and who starves, who has housing and who remains homeless, and which families are broken up by the state. The digital poorhouse is part of a long American tradition. We manage the individual poor in order to escape our shared responsibility for eradicating poverty.
5%
Our new digital tools spring from punitive, moralistic views of poverty and create a system of high-tech containment and investigation. The digital poorhouse deters the poor from accessing public resources; polices their labor, spending, sexuality, and parenting; tries to predict their future behavior; and punishes and criminalizes those who do not comply with its dictates. In the process, it creates ever-finer moral distinctions between the “deserving” and “undeserving” poor, categorizations that rationalize our national failure to care for one another.
7%
…began an all-out attack on public poor relief. Scientific charity argued for more rigorous, data-driven methods to separate the deserving poor from the undeserving. In-depth investigation was a mechanism of moral classification and social control. Each poor family became a “case” to be solved…
7%
Scientific charity treated the poor as criminal defendants by default.
8%
If the poorhouse was a machine that diverted the poor and working class from public resources, scientific charity was a technique of producing plausible deniability in elites.
8%
Roosevelt’s administration capitulated to white supremacy in ways that still bear bitter fruit. The Civilian Conservation Corps capped Black participation in federally supported work relief at 10 percent of available jobs, though African Americans faced unemployment rates as high as 80 percent in some northern cities. The National Housing Act of 1934 redoubled the burden on Black neighborhoods by promoting residential segregation and encouraging mortgage redlining. The Wagner Act granted workers the right to organize, but allowed segregated trade unions. Most importantly, in response to threats that southern ...more
16%
Performance metrics designed to speed eligibility determinations created perverse incentives for call center workers to close cases prematurely. Timeliness could be improved by denying applications and then advising applicants to reapply, which required that they wait an additional 30 or 60 days for a new determination. Some administrative snafus were simple mistakes, integration problems, and technical glitches. But many errors were the result of inflexible rules that interpreted any deviation from the newly rigid application process, no matter how inconsequential or inadvertent, as an active ...more
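The arithmetic of that incentive is easy to reproduce. Below is a minimal sketch, with invented numbers and field names rather than Indiana's actual metric code, of how a timeliness KPI that counts only days-to-determination rewards deny-and-reapply over working a hard case to completion:

```python
# Hypothetical illustration of a timeliness KPI being gamed.
# Assumes "timeliness" = share of determinations closed within 30 days;
# all names and numbers here are invented, not Indiana's actual metrics.
from dataclasses import dataclass

@dataclass
class Determination:
    days_open: int   # days from (re)application to decision
    approved: bool

def timeliness(dets, window=30):
    """Fraction of determinations closed within the window."""
    return sum(d.days_open <= window for d in dets) / len(dets)

# Path A: a worker spends 45 days resolving a missing-document problem
# and approves the eligible family. One slow determination.
worked_case = [Determination(days_open=45, approved=True)]

# Path B: the same case is denied on day 10 for "failure to cooperate,"
# the family reapplies, and is approved 25 days into the new case.
# Two fast determinations, and a family that waited far longer overall.
deny_and_reapply = [
    Determination(days_open=10, approved=False),
    Determination(days_open=25, approved=True),
]

print(timeliness(worked_case))       # 0.0 -- looks "bad" on the KPI
print(timeliness(deny_and_reapply))  # 1.0 -- looks "good" on the KPI
```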
16%
“I started reading her letters to figure out what to do, and where to go, and who to call,” Jeff remembered, “but you couldn’t get anywhere on the phone. It was like you were talking to a computer instead of a person.”
16%
The new system was “self-serve,” technology-focused, and presented call center workers with a list of tasks to complete rather than a docket of families to serve. No one worker had oversight of a case from beginning to end; when clients called the 1-800 number, they always spoke to a new worker. Because the Daniels administration saw relationships between caseworkers and clients as invitations to fraud, the system was designed to sever those links.
17%
“The most vulnerable of our population—the parents of children who didn’t have food to eat, who needed medical treatment, and the disabled who were not able to speak for themselves—were the ones who took it on the chin, took it in the gut, and in the heart.”
18%
By successfully reframing public benefits as property rather than charity, the welfare rights movement established that public assistance recipients must be provided due process under the Fourteenth Amendment of the Constitution.
18%
The far-reaching and fundamental changes introduced by Indiana’s automated system put it on an inevitable collision course with the poor’s right to due process guaranteed by Goldberg.
19%
Under the eligibility automation, no single employee “owned” or oversaw a case; staff were responsible for responding to tasks that dropped into their queue in the WFMS, the system’s workflow management software. Cases were no longer handled in the county where applicants lived: any employee could take any call from any county, even if they knew nothing about the caller’s local context.
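To make the contrast concrete, here is a purely illustrative sketch of task-based routing; the WFMS's internals are not public, so every name and task below is invented:

```python
# Purely illustrative contrast between caseload ownership and
# task-based routing; the WFMS's internals are not public, so every
# name and task here is invented.
from collections import deque

# In a caseload model, case 1041 would belong to one worker from
# intake to determination. In the task model, the case dissolves
# into anonymous tasks in a statewide queue.
task_queue = deque([
    ("verify income", "case 1041", "Tipton County"),
    ("review documents", "case 2217", "Marion County"),
    ("schedule interview", "case 1041", "Tipton County"),
])

workers = ["worker A", "worker B", "worker C"]

# Any worker takes the next task from any county, with no memory of
# the family behind it.
for worker in workers:
    task, case, county = task_queue.popleft()
    print(f"{worker} -> {task} ({case}, {county})")
```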
23%
The goals of the project were consistent throughout the automation experiment: maximize efficiency and eliminate fraud by shifting to a task-based system and severing caseworker-to-client bonds. They were clearly reflected in contract metrics: response time in the call centers was a key performance indicator; determination accuracy was not. Efficiency and savings were built into the contract; transparency and due process were not.
24%
“The system doesn’t seem to be set up to help people. It seems to be set up to play gotcha,” said Chris Holly. “In our legal system it is better that ten guilty men go free than one innocent man go to jail. The modernization flipped that on its head.” Automated eligibility was based on the assumption that it is better for ten eligible applicants to be denied public benefits than for one ineligible person to receive them.
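That inverted ratio can be written down. In any cost-sensitive decision rule, the break-even threshold follows directly from the relative costs assigned to the two kinds of error. A hypothetical sketch, with invented costs:

```python
# Hypothetical cost-sensitive eligibility rule. The costs are invented
# to mirror "ten eligible denied rather than one ineligible paid."
COST_IMPROPER_PAYMENT = 10.0  # approving one ineligible applicant
COST_WRONGFUL_DENIAL = 1.0    # denying one eligible applicant

def deny(p_ineligible: float) -> bool:
    """Deny whenever the expected cost of approving exceeds the
    expected cost of denying."""
    expected_cost_approve = p_ineligible * COST_IMPROPER_PAYMENT
    expected_cost_deny = (1 - p_ineligible) * COST_WRONGFUL_DENIAL
    return expected_cost_approve > expected_cost_deny

# The break-even point sits at 1 / (1 + 10), about 0.09: a 10 percent
# suspicion of ineligibility is already enough to deny.
print(deny(0.10))  # True -- denied despite being 90% likely eligible
print(deny(0.05))  # False
```

Swapping the two constants restores something like Blackstone's ratio and moves the break-even point to about 0.91, near-certainty of ineligibility before denial.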
26%
The “social specs” for the automation were based on time-worn, race- and class-motivated assumptions about welfare recipients that were encoded into performance metrics and programmed into business processes: they are lazy and must be “prodded” into contributing to their own support, they are sneaky and prone to fraudulent claims, and their burdensome use of public resources must be repeatedly discouraged. Each of these assumptions relies on, and is bolstered by, race- and class-based stereotypes.
36%
There is a long history of social services and the police collaborating to criminalize the poor in the United States. The most direct parallel is Operation Talon, a joint effort of the Office of Inspector General and local welfare offices that mined food stamp data to identify those with outstanding warrants, and then lured them to appointments regarding their benefits. When targeted recipients arrived at the welfare office, they were arrested.
37%
In many neighborhoods, community policing is preferable to reactive, incident-driven law enforcement. But it also raises troubling questions. Community policing casts officers as social service or treatment professionals, roles for which they rarely have appropriate training. It pulls social service agencies into relationships with police that compromise their ability to serve the most marginalized people, who often have good reason to avoid law enforcement. Police presence at a social service organization is sufficient to turn away the most vulnerable unhoused, who might have outstanding ...more
39%
If homelessness is inevitable—like a disease or a natural disaster—then it is perfectly reasonable to use triage-oriented solutions that prioritize unhoused people for a chance at limited housing resources. But if homelessness is a human tragedy created by policy decisions and professional middle-class apathy, coordinated entry allows us to distance ourselves from the human impacts of our choice to not act decisively. As a system of moral valuation, coordinated entry is a machine for producing rationalization, for helping us convince ourselves that only the most deserving people are getting ...more
40%
And the next time he takes the VI-SPDAT, the vulnerability survey that drives coordinated entry’s prioritization scores, he will likely score lower. The model counts prison as housing. The system will see him as less vulnerable, and his prioritization score will slip even lower. He’ll stay trapped, too vigorous for intervention and too marginal to make a go of it without support. “I’m a criminal,” he said, “just for existing on the face of the earth.”
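The scoring mechanism is worth making concrete. The following is not the actual VI-SPDAT; the items and weights are invented to show how a rule that treats incarceration as shelter deflates a vulnerability score:

```python
# Not the actual VI-SPDAT: items and weights below are invented to
# show how a scoring rule that treats incarceration as shelter
# deflates a vulnerability score.
def vulnerability_score(nights_unsheltered_last_6mo: int,
                        er_visits: int,
                        chronic_illness: bool) -> int:
    score = 0
    # Nights in jail or prison are not "unsheltered," so a prison
    # stay zeroes out this item even though the person had no home.
    if nights_unsheltered_last_6mo >= 90:
        score += 4
    elif nights_unsheltered_last_6mo >= 30:
        score += 2
    score += min(er_visits, 3)        # capped health-crisis item
    score += 2 if chronic_illness else 0
    return score

# Before prison: six months sleeping outside.
print(vulnerability_score(180, er_visits=2, chronic_illness=True))  # 8
# After release: same person, but the street time reads as "housed."
print(vulnerability_score(0, er_visits=2, chronic_illness=True))    # 4
```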
41%
Three-quarters of child welfare investigations involve neglect rather than physical, sexual, or emotional abuse. Where the line is drawn between the routine conditions of poverty and child neglect is particularly vexing. Many struggles common among poor families are officially defined as child maltreatment, including not having enough food, having inadequate or unsafe housing, lacking medical care, or leaving a child alone while you work. Unhoused families face particularly difficult challenges holding on to their children, as the very condition of being homeless is judged neglectful.
43%
Vaithianathan’s team developed a predictive model using 132 variables—including length of time on public benefits, past involvement with the child welfare system, mother’s age, whether or not the child was born to a single parent, mental health, and correctional history—to rate the maltreatment risk of children in the historical data of New Zealand’s Ministry of Social Development (MSD). They found that their algorithm could predict with “fair, approaching good” accuracy whether these children would have a “substantiated finding of maltreatment” by the time they turned five.
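For readers who want to see the shape of this approach, here is a minimal sketch in the same spirit: synthetic data, three invented stand-in features rather than 132, and an off-the-shelf logistic regression rather than whatever the MSD team actually used.

```python
# Minimal sketch of predictive risk modeling in this spirit: fit a
# classifier on historical administrative variables to score
# "substantiated maltreatment by age five." Everything here is
# synthetic; three invented stand-in features replace the 132 real ones.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5_000

months_on_benefits = rng.integers(0, 60, n)
prior_cps_contacts = rng.poisson(0.5, n)
parent_age_at_birth = rng.integers(16, 45, n)
X = np.column_stack([months_on_benefits, prior_cps_contacts,
                     parent_age_at_birth])

# Synthetic outcome correlated with the features so the model has
# something to find. In reality the label is an agency's decision
# (a "substantiated finding"), not maltreatment itself.
logit = (0.02 * months_on_benefits + 0.8 * prior_cps_contacts
         - 0.05 * parent_age_at_birth)
y = rng.random(n) < 1 / (1 + np.exp(-(logit - logit.mean())))

model = LogisticRegression(max_iter=1000).fit(X, y)
scores = model.predict_proba(X)[:, 1]

# Expect an AUC well above chance but far from perfect: real patterns
# in aggregate, routinely wrong about individuals.
print(round(roc_auc_score(y, scores), 2))
```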
44%
We all tend to defer to machines, which can seem more neutral, more objective. But it is troubling that managers believe that if the intake screener’s and the computer’s assessments conflict, the human should learn from the model. The AFST (Allegheny Family Screening Tool), like all risk models, offers only probabilities, not perfect prediction. Though it might be able to identify patterns and trends, it is routinely wrong about individual cases.
44%
The AFST is supposed to support, not supplant, human decision-making in the call center. And yet, in practice, the algorithm seems to be training the intake workers.
47%
Patrick was investigated for medical neglect in the early 2000s when he was unable to afford his daughter Tabatha’s antibiotic prescription after an emergency room visit. When her condition worsened and he took her back to the ER the next day, a nurse threatened to call CYF, the county’s Office of Children, Youth and Families, on him. Frightened and angry, Patrick picked his daughter up and walked out. An investigation was opened. “They came late at night,” he remembers. “It was like 11 or 12 o’clock, my kids were already asleep. They came up with the police, told us why they were there, came in, looked at the house, looked where the girls were ...more
48%
Most parents reacted with fear and exasperation when I asked them about the AFST. Some think the system unfairly targets them for surveillance. Some find having their entire history as parents summed up in a single number dehumanizing. Some believe the model will make it even more difficult to exert the limited rights they have in the system.
48%
According to statistics gathered by the National Council of Juvenile and Family Court Judges, in 37 states, the District of Columbia, and Puerto Rico, African American and Native American children are removed from their homes at rates that significantly exceed their representation in the general population. For example, in 2011, 51 percent of children in foster care in Alaska were Native American, though Native Americans make up only 17 percent of the youth population. In Illinois, 53 percent of the children in foster care were African American, though African Americans make up only 16 percent ...more
49%
A quarter of the predictive variables in the AFST are direct measures of poverty: they track use of means-tested programs such as TANF, Supplemental Security Income, SNAP, and county medical assistance.
49%
Nearly all of the indicators of child neglect are also indicators of poverty: lack of food, inadequate housing, unlicensed childcare, unreliable transportation, utility shutoffs, homelessness, lack of health care.
49%
Families avoid CYF if they can afford to, because the agency mixes two distinct and contradictory roles: provider of family support and investigator of maltreatment. Accepting resources means accepting the agency’s authority to remove your children. This is an invasive, terrifying trade-off that parents with other options are not likely to choose. Poor and working-class families feel forced to trade their rights to privacy, protection from unreasonable searches, and due process for a chance at the resources and services they need to keep their children safe.
50%
We might call this poverty profiling. Like racial profiling, poverty profiling targets individuals for extra scrutiny based not on their behavior but rather on a personal characteristic: living in poverty.
51%
But there are others who feel that, once they are in “the system,” microscopic scrutiny ups the ante on their parenting, raising the stakes so high that they are bound to lose. “We try to comply,” said Janine. “But look, we can’t do it all. You’re opening up a door for ten other things I’ve got to do. It’s just a downward spiral.”
51%
Parenting while poor means parenting in public.
52%
If and when Harriette becomes a mom, she’ll start out with a higher AFST score, because she had interactions with the child protective system as a kid. The assumption might be that she had a bad mother, and so she had no mental model of how to parent, and the county needs to keep an eye on her.
52%
Professional middle-class families reach out for support all the time: to therapists, private drug and alcohol rehabilitation, nannies, babysitters, afterschool programs, summer camps, tutors, and family doctors. But because it is all privately funded, none of those requests ends up in Allegheny County’s data warehouse. Poor and working-class families who show the same willingness to reach out are asking for public resources, so the AFST labels them a risk to their children, even though CYF itself sees requesting help as a positive attribute of parents.
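The asymmetry is structural and renders easily in miniature; the records below are hypothetical, since the warehouse's actual schema is not public:

```python
# Hypothetical records illustrating the observability asymmetry; the
# county warehouse's actual schema is not public.
middle_class_family = {"therapy": "private insurance",
                       "rehab": "private clinic"}
poor_family = {"therapy": "county mental health",
               "rehab": "public program"}

PUBLICLY_FUNDED = {"county mental health", "public program"}

def warehouse_rows(family):
    # Only publicly funded services generate rows in the warehouse,
    # so only the poor family's help-seeking is visible to the model,
    # where it is counted as risk.
    return [svc for svc, payer in family.items()
            if payer in PUBLICLY_FUNDED]

print(warehouse_rows(middle_class_family))  # []
print(warehouse_rows(poor_family))          # ['therapy', 'rehab']
```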
54%
Under the right conditions—fiscal austerity, a governor looking to downsize public agencies, or a rash of child deaths—the AFST could easily become a machine for automatically removing children from their homes. It wouldn’t even require reprogramming the model. Today, if a family’s risk score reaches 20, the top of the AFST’s scale, CYF must open an investigation. Tomorrow, a score of 20 might trigger an emergency removal. Or a score of 10 … or of 5.
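The technical point is narrow: the consequence attached to a score is a policy constant that sits outside the model. A hypothetical sketch:

```python
# Hypothetical policy layer over a fixed risk model. What a score
# triggers is a configuration constant; the model is untouched.
# Thresholds below are invented.
MANDATORY_INVESTIGATION_AT = 20   # today's policy
EMERGENCY_REMOVAL_AT = None       # no automatic removals today

def action_for(score: int) -> str:
    if EMERGENCY_REMOVAL_AT is not None and score >= EMERGENCY_REMOVAL_AT:
        return "emergency removal"
    if score >= MANDATORY_INVESTIGATION_AT:
        return "mandatory investigation"
    return "screener discretion"

print(action_for(20))  # mandatory investigation

# "Tomorrow" is an edit to two constants, not a reprogrammed model:
EMERGENCY_REMOVAL_AT = 20
MANDATORY_INVESTIGATION_AT = 10
print(action_for(20))  # emergency removal
print(action_for(10))  # mandatory investigation
```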
55%
51 percent of Americans will spend at least a year below the poverty line between the ages of 20 and 65.
55%
Denial is exhausting and expensive. It is uncomfortable for individuals who must endure the cognitive dissonance required to both see and not-see reality. It contorts our physical geography, as we build infrastructure—suburbs, highways, private schools, and prisons—that allows the professional middle class to actively avoid sharing the lives of poor and working-class people. It weakens our social bonds as a political community; people who cannot meet each other’s eyes will find it very difficult to collectively govern.
55%
Our public policy fixates on attributing blame for poverty rather than remedying its effects or abolishing its causes. The obsession with “personal responsibility” makes our social safety net conditional on being morally blameless.
59%
We have all always lived in the world we built for the poor. We create a society that has no use for the disabled or the elderly, and then are cast aside when we are hurt or grow old. We measure human worth based only on the ability to earn a wage, and suffer in a world that undervalues care and community. We base our economy on exploiting the labor of racial and ethnic minorities, and watch lasting inequities snuff out human potential.
60%
When automated decision-making tools are not built to explicitly dismantle structural inequities, their speed and scale intensify them.
60%
The digital poorhouse replaces the sometimes-biased decision-making of frontline social workers with the rational discrimination of high-tech tools.
60%
The classism and racism of elites are math-washed, neutralized by technological mystification and data-based hocus-pocus.
65%
Such sustained, practiced empathy can change the “us/them” to a “we,” without obscuring the real differences in our experiences and life chances. The righteous anger that wells up when we recognize our common suffering is an earthshaking, structure-tumbling, visionary force.
66%
I tell them to do a quick “gut check” by answering two questions: Does the tool increase the self-determination and agency of the poor? Would the tool be tolerated if it were targeted at non-poor people?
67%
I will remember that the technologies I design are not aimed at data points, probabilities, or patterns, but at human beings.