Author Hits Detroit to Discuss Dangers of Automating Public Assistance
There are few life events more stressful than suddenly losing a job. One of the safety net measures in place for displaced workers is unemployment insurance: recurring, state-administered benefit checks that eligible workers receive for a few months while trying to get back on their feet.
That’s how the system is supposed to work, anyway. In 2013, the state of Michigan decided to automate the process of determining who’s eligible for unemployment benefits by implementing a new program called the Michigan Integrated Data Automated System (MiDAS). The automated software replaced a 25-year-old mainframe system and was supposed to ensure unemployment checks went only to those who qualified for them, increase efficiency and responsiveness, and reduce costs by shedding the 400 unemployment agency staffers whose duties MiDAS would assume. Except that’s not what happened at all.
Instead, between October 2013 and September 2015, a software glitch caused MiDAS to falsely accuse workers of fraud—an internal review eventually uncovered that the system had an astounding 93 percent error rate—causing absolute chaos for the more than 20,000 affected workers.
As Bridge magazine reported, the state set about aggressively punishing these mistaken cases of fraud by docking unemployment benefits and levying a 400 percent penalty on the amount of each so-called fraudulent check, going after recipients in court and making criminal referrals to prosecutors if they failed to pay up. People lost their homes, were forced into bankruptcy, and worse.
“It’s not necessarily about the technology itself, which intensifies societal inequalities we already have,” says Virginia Eubanks, author of Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor and an associate professor of political science at the University at Albany, SUNY. Eubanks will be in Detroit tomorrow as part of a panel discussion sponsored by the Knight Foundation Fund and The Heat and Warmth Fund (THAW).
In her book, Eubanks examines three real-world cases in which automation went awry: the city of Los Angeles’s use of an algorithm to prioritize which homeless residents would get housing, a Pittsburgh child welfare agency’s attempt to predict which children might be potential victims of abuse via statistical model, and the state of Indiana’s switch to an automated system to determine who’s eligible for cash, medical, and food assistance.
Eubanks says she will talk most about the Indiana case during her Detroit discussion, as there are many parallels with what happened in Michigan. “In 2006, the state of Indiana attempted to automate the processes for assistance, which resulted in more than 1 million denials,” she explains. “I can’t speak to the governor’s intentions, but if they meant to build a system to deny as many people as possible, it couldn’t have worked any better. It’s clear the system diverted eligible people from benefits.”
Eubanks says these kinds of automated solutions are only going to get more popular, but as a country, we “have to get our souls right” in the way we deal with poverty if they’re going to work as intended.
“What is the reality of poverty? We tell ourselves it’s an aberration, but statistics say 50 percent of us will be below the poverty line at some point in our adult lives,” she notes. “Many will need to access assistance, but we make it about whether people are deserving enough. The technology won’t stop [progressing], so how do we cause less harm with the technology we’re building today?”
Eubanks says it boils down to two questions software developers must ask themselves if they’re designing products to determine public benefit eligibility: Does the tool they’re creating increase the dignity and self-determination of poor families? And if it were made for the non-poor, would it be tolerated at all?
“When I talk to designers, I tell them we’re building software systems like cars with no gears,” Eubanks says. “We have to build gears into the system to deal with twists and turns. We need to engage with a broader set of values in technology design, because the reality is that these systems are making political choices for us, so we need to be sure to build dignity into them.”
Part of the reason Eubanks is speaking in Detroit is to raise awareness about THAW, an organization that provides home heating help. THAW says it receives thousands of requests each year from low-income families and seniors who lack the computer skills or home Internet access to complete online applications for assistance. (In Michigan, new work requirement rules go into effect next week, which are likely to at least temporarily throw the automated eligibility system called MiBridges into disarray.)
Eubanks says that in the case of Indiana’s benefits system, “Hoosiers fought back and won. There was an incredible increase in suffering—people died. The state increased suffering and regular people didn’t stand for it.” They organized town hall meetings and took the media on tours to talk to affected families. Three years into a 10-year, billion-dollar contract, the governor canceled it. IBM eventually sued the state for breach of contract, but last year, the court found that IBM was actually the negligent party.
Pointing out how expensive the whole exercise was for everyone involved, Eubanks says these poorly functioning automated systems are only cheap at first. “In the end, it costs us all an enormous amount,” she adds.
Eubanks will take part in a panel discussion titled “No Computer. No Assistance? Is Michigan Turning the Lights Off on the Poor?” from 5-7 p.m. at the Bethel Transformation Community Center in Detroit. The event is free and open to the public, but pre-registration is required.