
October 28, 2020 By Sumo Logic

Automated Tech Perpetuates the Digital Poorhouse

What happens when your Medicaid is revoked while you’re in a hospital being treated for cancer?

Or what if the state government constantly threatens to take your kids away because you can’t afford antibiotics?

Or what if you lose your job and your home and can’t qualify for public assistance?

When events like these happen, some part of the American promise that we are all equal is left unfulfilled. In many cases today, eligibility is decided by technologies built by humans to save money and streamline processes. But in reality, this tech often disregards Americans in the most vulnerable situations.

That’s what Virginia Eubanks gets at in her provocative book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, and what she discussed with us on Masters of Data Episode #22, “Fighting Data-Driven Inequality.”

The Digital Poorhouse You Might Not Know About

Eubanks uses the term digital poorhouse to describe the modern vestige of institutions we’ve used for 200 years to “take care” of the poor. The assistance process is complicated and barrier-ridden: you have to hand over reams of personal information, and you may have to check in frequently or use a trackable Electronic Benefit Transfer card to qualify. Even once you’re in, you still might not get assistance, and you can end up lost in the digital poorhouse.

As pointed out in a New York Times article on Eubanks, “Assistance may cost more than technology, at least in the short run, but as countries that are more generous with housing, cash and health care have learned, it’s the only effective way to fight poverty, along with poverty’s associated ills, from child abuse to drug addiction to homelessness.”

Let’s look at the digital poorhouse examples Eubanks lays out in her book.

Automating Welfare Eligibility

In 2006, Indiana Governor Mitch Daniels decided to privatize welfare (IBM was the primary contractor) and implement an automated system that would decide who was eligible.

“The metrics in that contract are so fascinating and troubling,” Eubanks observes, “because the metrics really only have to do with the efficiency of the system, how quickly calls are answered, how quickly cases are closed. There’s no metrics about whether or not the decisions made by the system are correct.”

She adds, “I had one source say if we had tried to build a system to divert people from public assistance on purpose, it wouldn’t have worked any better than the one that we got.”
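To make that incentive structure concrete, here is a minimal sketch in Python, with invented numbers and field names rather than anything from the actual Indiana contract, of what it looks like when a system is measured only on throughput:

```python
from dataclasses import dataclass

# Hypothetical case records -- invented numbers, not real contract data.
@dataclass
class Case:
    seconds_to_answer: float  # how fast the call was picked up
    days_to_close: float      # how fast the case was closed
    decision_correct: bool    # was the eligibility decision actually right?

cases = [
    Case(12.0, 3.0, False),   # fast and wrong
    Case(15.0, 2.5, False),   # fast and wrong
    Case(300.0, 20.0, True),  # slow and right
]

# The metrics the contract rewards: pure throughput.
avg_answer = sum(c.seconds_to_answer for c in cases) / len(cases)
avg_close = sum(c.days_to_close for c in cases) / len(cases)
print(f"avg seconds to answer: {avg_answer:.1f}")
print(f"avg days to close: {avg_close:.1f}")

# The metric the contract never asks for: correctness.
accuracy = sum(c.decision_correct for c in cases) / len(cases)
print(f"decision accuracy: {accuracy:.0%}")  # no one is paid to improve this
```

A vendor optimizing the first two numbers while never reporting the third can close cases quickly and wrongly, and still hit every target in the contract.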

Immediately, caseworkers were separated from families they’d been serving, and in some cases, there were severe repercussions.

In late 2008, a woman received a letter from the state telling her she had to reapply for Medicaid. When she couldn’t make the reapplication appointment because she was in the hospital being treated for ovarian cancer, the state cut her benefits, citing “failure to cooperate.”

In an interview with NPR’s Ari Shapiro, Eubanks describes how this impacted the woman, “She lost her benefits. She couldn’t afford her medication. She lost her food stamps. She couldn’t pay her rent. She lost access to free transportation to her medical appointments.” That woman then died on March 1, 2009.

Predicting Child Abuse and Neglect

In Pittsburgh, Eubanks found a family repeatedly tagged by social services automation for child neglect, but not because of any actual physical abuse. Instead, the father in this family was suspected of “medical neglect” because the system had determined he couldn’t afford medicine for his daughter.

Allegheny County’s Department of Human Services used a predictive algorithm to forecast which children were most likely to be abused or neglected.

But the devil was in the data the algorithm was trained on. Eubanks discovered biases in the way the system judged maltreatment, and she found that it stripped caseworkers of their discretionary judgment.

That discretion was handed over to the automated system, which went on to mispredict and misidentify neglect and abuse.
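To see how that plays out, here is a minimal sketch with made-up data. It assumes, as a simplification of Eubanks’ argument, that the only label available to the model is “was referred to the hotline” rather than actual maltreatment, and that families who use public services are simply more visible to referral sources:

```python
import random

random.seed(0)

def make_family():
    uses_public_services = random.random() < 0.5
    actual_maltreatment = random.random() < 0.05  # same base rate in both groups
    # Referral odds depend on visibility to the system, not on maltreatment alone:
    referral_prob = 0.30 if uses_public_services else 0.05
    referred = actual_maltreatment or random.random() < referral_prob
    return uses_public_services, actual_maltreatment, referred

families = [make_family() for _ in range(10_000)]

# "Train" the simplest possible model: the referral rate per group.
for group in (True, False):
    subset = [f for f in families if f[0] == group]
    referral_rate = sum(f[2] for f in subset) / len(subset)
    maltreatment_rate = sum(f[1] for f in subset) / len(subset)
    label = "uses public services" if group else "does not"
    print(f"{label:>20}: referral rate {referral_rate:.1%}, "
          f"actual maltreatment {maltreatment_rate:.1%}")
```

With the same underlying maltreatment rate in both groups, the proxy label makes the public-services group look roughly three times riskier. A model trained on those labels learns to score poverty, not abuse.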

A Homelessness Registry

In Los Angeles, an algorithm scored tens of thousands of homeless people to categorize them for limited low-income housing. As Eubanks notes, this digital registry put all the hard decisions about housing in the hands of a machine.

Eubanks found a man who had lost his job and his house, but when he applied for public assistance, he wasn’t approved. After taking a closer look at the process, Eubanks found that the system targeted only two kinds of homeless applicants: the most severe cases (violence, substance abuse, mental illness) and the least severe cases (those expected to be without a home or a job only briefly).

The man Eubanks was working with had no job prospects, and he wasn’t an addict, so he fell into neither category and didn’t qualify. For more than a decade now, he has been homeless off and on.
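A minimal sketch of that triage logic, with invented scores and thresholds rather than the actual Los Angeles registry, shows how a rule that serves only the two tails leaves people like him with nothing:

```python
# Invented scores and thresholds -- illustrative only, not the real registry.
def triage(score: int) -> str:
    """Route an applicant by vulnerability score (0 = low, 10 = high)."""
    if score >= 8:
        return "permanent supportive housing"  # the most severe cases
    if score <= 3:
        return "rapid re-housing"              # briefly homeless, quick recovery
    return "no intervention"                   # the excluded middle

applicants = {
    "chronic, severe case": 9,
    "short-term setback": 2,
    "long-term, no addiction": 5,  # where Eubanks' source lands
}
for name, score in applicants.items():
    print(f"{name} (score {score}): {triage(score)}")
```

Anyone whose situation is serious but not catastrophic falls into the “no intervention” branch, and can stay there for years.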

Why Machines Aren’t More Effective than People

Eubanks argues that an automated system is a terrible way to serve the poor, for three reasons.

Built-In Bias

In her MOD episode, Eubanks explains, “Technology's built by humans. It carries with it human assumptions and preoccupations. It is a deeply social product, and it then sort of loops back to affect the culture that it emerges from.”

No Profit Incentive for Effective Public Assistance

Unlike a shopping or gaming app, public assistance generates no profit when it works well. IBM and the other contractors on Governor Daniels’ privatized system had no financial reason to serve the poor effectively, and in three years, 1 million poor people were denied the assistance they needed.

Americans Struggle to Confront Social Responsibility

“Like the county poorhouse and scientific charity before them,” Eubanks says, “digital tracking and automated decision-making hide poverty from the middle-class public and give the nation the ethical distance it needs to make inhumane choices: which families get food and which starve, who has housing and who remains homeless, and which families are broken up by the state.”

What Is the Solution?

Eubanks points to a few successes, like mRelief, an automated assistance tool in Chicago that helps people find the programs they’re eligible for.

The Administration for Children’s Services in New York City has been implementing predictive analytics for child welfare, but the tech is not the final word on assessment: caseworkers review and act on the data the automated system gathers.
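That human-in-the-loop arrangement is easy to express in code. Here is a minimal sketch, using hypothetical functions rather than anything from ACS’s actual system, in which the model’s score is advisory and a caseworker makes the final call:

```python
# Hypothetical human-in-the-loop sketch -- not ACS's actual system.
# The model only gathers and scores data; the caseworker has the final word.

def model_score(case_id: str) -> float:
    """Stand-in for the predictive model's risk score (0.0 to 1.0)."""
    return 0.72  # hard-coded for the demo

def caseworker_decision(case_id: str, score: float) -> bool:
    """The human reviews the score alongside the full case file."""
    print(f"case {case_id}: model score {score:.2f}; reviewing case file...")
    return False  # here, the caseworker judges the elevated score a false alarm

def assess(case_id: str) -> bool:
    score = model_score(case_id)                # the tech informs the assessment...
    return caseworker_decision(case_id, score)  # ...but does not make it

print("open investigation?", assess("A-1"))
```

The design choice that matters is the last line of assess: the model’s output is an input to a human decision, never the decision itself.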

Even looking back at Allegheny County, where disproportionate numbers of minority families were flagged for abuse and neglect, Eubanks adds, “They’ve actually made some real progress over the last couple of years on making that disproportionality better.”

But the ultimate solution may be to vote into office policymakers who value social responsibility.

Summary

The issues Eubanks focuses on in Automating Inequality are central to the 2020 presidential election, from health care reform and financial aid to the question of what kind of America the people who live here want to see.

Virginia Eubanks rightly asks, “If we're doing everything right, why are we still producing systems that, from the point of view of their targets, still police, profile and punish poor and working-class communities?”
