Hardly a miracle: A black-box AI system may flag parents with disabilities

As part of a yearlong investigation, The Associated Press obtained the data points behind several child welfare agency algorithms to see how they forecast whether children are at risk of harm.

The Hackneys’ infant daughter spent two weeks in a Pittsburgh hospital bed, weakened by dehydration. Her parents rarely left her side, sometimes sleeping on the fold-out sofa.

They stayed with her around the clock during treatment, and the 8-month-old stopped rejecting her bottles and began gaining weight again.

“She was doing great and we started asking when can she go home,” Lauren Hackney said. “From then on, they stonewalled us and never said anything.”

Then child welfare officials stunned the couple by taking custody of their daughter and accusing them of neglect.

“They had custody papers, and they kidnapped her right there,” Lauren Hackney said. “We cried.”

A year later, their daughter, now 2, remains in foster care. The Hackneys, who both have developmental disabilities, cannot understand how taking their daughter to the hospital when she refused to eat could be seen as so irresponsible that she had to be removed from their home.

They wonder whether the artificial intelligence tool the Allegheny County Department of Human Services uses to predict which children are at risk singled them out because of their disabilities.

The U.S. Justice Department is asking the same question. The department is reviewing the county’s child welfare system to determine whether its use of the powerful algorithm discriminates against people with disabilities or other protected groups, The Associated Press has learned. Federal civil rights attorneys plan to interview the Hackneys and Andrew’s mother, Cynde Hackney-Fierro, later this month, the grandmother said.

Lauren Hackney has ADHD, which affects her memory, and her husband, Andrew, has a comprehension disorder and nerve damage from a stroke he suffered in his 20s. Their daughter was about 7 months old when she began refusing her bottles. Facing a formula shortage, they drove from Pennsylvania to West Virginia in search of some and had to switch brands. The baby didn’t seem to like the new formula.

They said her pediatrician reassured them that babies can be finicky eaters and offered suggestions to help her eat again.

When she grew lethargic a few days later, the same doctor told them to take her to the emergency room. The Hackneys believe medical staff alerted child protective services after they arrived with a hungry, dehydrated baby.

That, they believe, is when their information was fed into the Allegheny Family Screening Tool, which county officials say is standard procedure for neglect allegations. Soon after, a social worker questioned them and placed their daughter in foster care.

For six years, Allegheny County has served as a real-world laboratory for testing AI-driven child welfare tools that analyze reams of data about local families to forecast which children are at risk in their homes. According to the ACLU, child welfare agencies in 26 states and Washington, D.C., have considered adopting such algorithmic tools, and 11 have deployed them.

The Hackneys’ story, based on interviews, internal emails and court records, illustrates how opaque these algorithms are. Even as they fight for custody of their daughter, they cannot challenge the “risk score” Allegheny County assigned them, because officials won’t disclose it. Neither the county nor the tool’s developers have ever explained which specific factors were used to assess the Hackneys’ abilities as parents.

“It’s like you have a problem with someone who has a disability,” Andrew Hackney said in an interview from their suburban Pittsburgh apartment. “Then you definitely go after everyone with kids and a disability.”

As part of a yearlong investigation, the AP obtained the data points underlying several algorithms deployed by child welfare agencies, including some marked “CONFIDENTIAL.” Among the factors they have used to gauge a family’s risk: race, poverty, disability and family size. They have also included whether a mother smoked before pregnancy and whether a family had prior child abuse or neglect complaints.

What gets measured matters. ACLU researchers found that Allegheny’s algorithm has flagged people who used county mental health and behavioral health services, a factor that can raise a child’s risk score by up to three points on a 20-point scale.

Allegheny County spokesperson Mark Bertolet declined to discuss the Hackney case or the federal inquiry, and would not address questions about the tool’s data, including the ACLU’s criticism.

“As a principle, we do not comment on litigation or legal matters,” Bertolet stated in an email.

Justice Department spokesperson Aryele Bradford declined to comment.

NOT MAGIC

Child welfare algorithms plug vast amounts of public data about local families into complex statistical models to produce a risk score. Social workers then use that number to help decide which families should face further investigation or attention, choices that can have life-or-death consequences.
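
To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of how such a tool could turn administrative records into a screening score on a 1-to-20 scale, the kind of scale the ACLU described. The feature names, weights and scoring rule are invented for illustration; they are not Allegheny County’s actual variables or model, which have not been released in this form.

```python
# Hypothetical sketch of a child-welfare risk-scoring pipeline.
# The features, weights and cutoffs below are invented for illustration;
# they are NOT the Allegheny Family Screening Tool's actual model.
import math

# Illustrative flags and counts drawn from administrative records (assumed).
example_family = {
    "prior_neglect_referrals": 2,          # past hotline referrals
    "used_behavioral_health_services": 1,  # 1 if county services were used
    "received_public_benefits": 1,         # 1 if household received benefits
    "parent_juvenile_record": 0,           # 1 if a parent had a juvenile record
}

# Made-up coefficients standing in for a trained statistical model.
weights = {
    "prior_neglect_referrals": 0.6,
    "used_behavioral_health_services": 0.4,
    "received_public_benefits": 0.3,
    "parent_juvenile_record": 0.5,
}
INTERCEPT = -2.5

def predicted_probability(features: dict) -> float:
    """Logistic-regression-style estimate of the modeled outcome
    (for example, foster placement within two years)."""
    z = INTERCEPT + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def screening_score(probability: float) -> int:
    """Map a probability onto a 1-20 scale by slicing it into 20 equal bands.
    (Deployed tools reportedly use percentile bands of their own score distribution.)"""
    return min(20, max(1, int(probability * 20) + 1))

p = predicted_probability(example_family)
print(f"predicted probability: {p:.2f}, screening score: {screening_score(p)}")
```

Even in a toy version like this, a single flag, such as use of behavioral health services, shifts the weighted sum and can move the final score by a few points, which is the dynamic the ACLU researchers described.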

Local officials have turned to such AI tools in hopes of systemic improvement, in Oregon during a foster care crisis and in Los Angeles County after a string of high-profile child deaths.

Brandon Nichols, director of the Los Angeles County Department of Children and Family Services, believes algorithms can help detect high-risk families and improve outcomes in a strained system. Yet he could not explain how his agency’s own screening tool works.

“We’re sort of the social work side of the house, not the technology side,” Nichols said in an interview. “How the algorithm works, in some ways is, I don’t want to say magic to us, but it’s beyond our skill and experience.”

Nichols’ agency and two other child welfare agencies referred detailed questions about their AI tools to the outside developers who built them.

In Larimer County, Colorado, one official acknowledged she did not know how the tool assessed local families.

“The variables and weights used by the Larimer Decision Aide Tool are part of the code written by Auckland and hence we do not have this level of detail,” Larimer County Human Services spokeswoman Jill Maasch stated in an email.

In Pennsylvania, California and Colorado, the two academic developers who choose the data points used to build their algorithms have access to the counties’ data systems. Rhema Vaithianathan, a health economics professor at New Zealand’s Auckland University of Technology, and Emily Putnam-Hornstein, a social work professor at the University of North Carolina at Chapel Hill, said in an email that their work is transparent and that their computer models are public.

“In each jurisdiction where a model has been completely built we have given a description of fields that were utilized to create the tool, together with details as to the methodologies employed,” they stated via email.

The Allegheny County website has 241 pages of coded variables and statistical computations.

UNICEF and the Biden administration have praised Vaithianathan and Putnam-Hornstein’s computer models for easing caseworkers’ workloads with straightforward criteria. The developers argue that child welfare officials must use all the data at their disposal to protect children.

But the AP found that such tools can set families up for separation by rating their risk based on personal characteristics they cannot control, such as race or disability.

In Allegheny County, home to 1.2 million people near the Ohio border, the algorithm has drawn on jail, juvenile probation, Medicaid, poverty, health and birth records held in a countywide “data warehouse.” The tool predicts the likelihood that a child will be placed in foster care within two years of the family being evaluated.
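
The paragraph above describes the tool’s inputs and its prediction target. The sketch below, again purely illustrative, shows the general pattern: gathering one family’s records from several administrative sources into a single feature row, then asking a stand-in model for the probability of foster placement within two years. The source names, identifiers and numbers are assumptions, not Allegheny County’s actual warehouse schema or code.

```python
# Hypothetical sketch: pulling one family's records from several administrative
# sources into a single feature row, then estimating the probability of foster
# placement within two years. Every source name, record and coefficient is
# invented; this is not Allegheny County's actual data warehouse or model.

# Toy stand-ins for warehouse extracts, keyed by a made-up family identifier.
warehouse = {
    "jail_records":        {101: 1, 102: 0},
    "juvenile_probation":  {101: 0, 102: 2},
    "medicaid_enrollment": {101: 1, 102: 1},
    "public_benefits":     {101: 1, 102: 0},
    "birth_records":       {101: 1, 102: 1},
}

def feature_row(family_id: int) -> dict:
    """Collect one family's counts/flags across every source in the warehouse."""
    return {source: records.get(family_id, 0) for source, records in warehouse.items()}

def placement_probability(features: dict) -> float:
    """Toy stand-in for a trained model predicting foster placement within two years."""
    # Cap each source's contribution so no single record type dominates the estimate.
    exposure = sum(min(count, 3) for count in features.values())
    return min(0.95, 0.05 + 0.06 * exposure)

for family_id in (101, 102):
    row = feature_row(family_id)
    print(family_id, row, f"placement probability: {placement_probability(row):.2f}")
```

Even in this toy form, the design choice is visible: nearly every input is a record of contact with a public system, which is why critics say such tools weigh poverty-linked histories so heavily.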

County officials told the AP they are proud of their pioneering approach and have even built a separate algorithm focused on newborns. They said they monitor and update their risk-scoring algorithm over time, including by removing variables such as welfare benefits and birth records.

The AP repeatedly requested interviews with Vaithianathan and Putnam-Hornstein about how they chose the data for their models. In 2017, the pair published the methodology used to build Allegheny’s initial tool, including a footnote that described one statistical criterion as “very arbitrary but based on trial and error.”

“This footnote relates to our examination of more than 800 characteristics from Allegheny’s data warehouse more than five years ago,” the developers said in an email.

Their design choices, meanwhile, have varied from county to county.

In the same 2017 paper, the developers acknowledged that adding racial data did not improve the model’s accuracy, yet they went on to study its use in Douglas County, Colorado, before deciding against it. In Los Angeles County, the developers left out criminal history, ZIP codes and geographic factors to address community fears that such a tool could harden racial bias, but they used those factors in Pittsburgh.

When asked about these inconsistencies, the developers pointed to their methodology documents.

“We explain numerous indicators used to verify accuracy—while also documenting ‘external validations,’” the developers wrote via email.

Oregon’s Department of Human Services built a screening tool influenced by Allegheny’s that factored in a child’s race when predicting a family’s risk and applied a “fairness correction.” After an AP investigation in April revealed potential racial bias in such tools, Oregon dropped its algorithm in June.

Three people told the AP that Justice Department officials cited the same AP investigation last fall when federal civil rights attorneys began examining Allegheny’s tool for possible bias. They spoke on condition of anonymity because the Justice Department asked them not to discuss the confidential conversations; two said they also feared professional retaliation.

IQ TESTS, PARENTING CLASSES

In October, the Hackneys’ lawyer filed a federal civil rights complaint questioning the screening tool’s usage in their case.

Allegheny’s tool has tracked whether family members have diagnoses of schizophrenia or mood disorders. It has also measured whether any family member received Supplemental Security Income, a federal disability benefit. The county said it factors in SSI payments in part because children with disabilities are more likely to be abused or neglected.

The county also said data tied to disability can be “predictive of the results” and that it “should come as no surprise that parents with disabilities… may also have a need for additional supports and services.” In an email, the county added that elsewhere in the child welfare field, social workers routinely weigh mental health and other data when assessing a parent’s ability to safely care for a child.

The Hackneys say they have been worn down by the regimen of IQ tests, downtown court appearances and parenting classes.

People with disabilities are overrepresented in the child welfare system, yet they are no more likely to harm their children, according to University of Minnesota researcher Traci LaLiberte.

LaLiberte said building disability-related data points into an algorithm perpetuates systemic biases, because it focuses on people’s physiology rather than on behavior that social workers can actually address.

The Los Angeles tool weighs whether any children in the family have received special education services, developmental or mental health referrals, or medication for mental health conditions.

“This is not specific to caseworkers who use our tool; it is usual for caseworkers to examine these criteria when considering appropriate supports and services,” the developers said by email.

Long before algorithms, the child welfare system distrusted parents with disabilities. Into the 1970s, LaLiberte said, they were regularly sterilized and institutionalized. A 2012 federal report found that as many as 80% of parents with psychiatric or intellectual disabilities lost custody of their children.

LaLiberte noted that few child welfare agencies in the U.S. require social workers to receive training about disabilities. The result, she said, is a system that judges parents with disabilities without knowing how to evaluate their capacity to parent.

The Hackneys say they ran into exactly that. When Andrew Hackney told a social worker he fed their daughter twice a day, he said, the worker grew outraged and told him babies must eat more often. He struggled to explain that the girl’s mother, grandmother and aunt also fed her every day.

BETRAYED

Allegheny County officials said AI helps them “make judgments based on as much information as possible” and that the program uses data social workers already have.

That includes records that can be decades old. The Pittsburgh-area tool has tracked whether parents were ever on state benefits or had a criminal history, even if the records date from their childhood or never led to a conviction.

The AP found that such design decisions can stack the deck against people who grew up in poverty, hardening historical inequities embedded in the data, or against those with juvenile or criminal justice records, long after society has offered them redemption. And because the algorithms help determine which families are targeted in the first place, critics argue, they risk becoming a self-fulfilling prophecy.

“These predictors have the impact of casting lasting suspicion and give no avenues of remedy for families stigmatized by these indicators,” researchers from the ACLU and the nonprofit Human Rights Data Analysis Group wrote in their analysis. “Their children are always at risk.”

Parents who have been scrutinized by social workers worry that child welfare algorithms will never let them escape their pasts, no matter how old or irrelevant their records may be.

Charity Chandler-Cole is one of them. She landed in foster care as a teenager after stealing underwear for her sister. As an adult, social workers showed up at her apartment after someone falsely claimed that a grand piano had been thrown at her nephew, who lived with her, even though the family didn’t own one.

Chandler-Cole argues the local algorithm could flag her for her time in foster care, her juvenile probation and the false child abuse report. She wonders whether AI could properly account for the fact that she was quickly cleared of the abuse allegation, or that her nonviolent juvenile record was legally expunged.

“A lot of these studies lack common sense,” said Chandler-Cole, a mother of four who heads a court-affiliated agency that helps foster children. “You are automatically labeling us. It just does more harm.”

Wendy Garen, a fellow commissioner with Chandler-Cole, believes that “more is better” and that by using all available data, risk assessment algorithms can improve the agency’s work.

INFLUENCE

Even as their models have faced criticism over accuracy and fairness, the developers have launched new projects with child welfare agencies in Northampton County, Pennsylvania, and Arapahoe County, Colorado. California, Pennsylvania, New Zealand and Chile have also sought preliminary work from them.

As their approach has spread, Vaithianathan has lectured on screening tools in Colombia and Australia. She recently advised researchers in Denmark and officials in the United Arab Emirates on how to use technology to target child services.

“Rhema is one of the international leaders and her research can assist to shape the discussion in Denmark,” a Danish researcher wrote on LinkedIn last year about Vaithianathan’s consulting role on a local child welfare tool being launched.

Vaithianathan and Putnam-Hornstein co-authored a nationwide study last year for the U.S. Department of Health and Human Services that found Allegheny’s strategy could be replicated elsewhere.

Debra Johnson, a spokesperson for HHS’ Administration for Children and Families, declined to say whether the Justice Department’s investigation would affect her agency’s support for AI-driven child welfare tools.

With budgets tight, cash-strapped agencies are searching for more efficient ways for social workers to protect vulnerable children. During a 2021 panel, Putnam-Hornstein noted that Allegheny’s “overall screen-in rate stayed entirely flat” since the tool was adopted.

But foster care and the separation of families can have lasting developmental consequences for a child.

According to a 2012 HHS study, 95% of infants reported to child welfare agencies experience more than one change of caregiver and household while in foster care, disruptions that can be traumatic.

As the Hackneys struggle to prove themselves to social services, their daughter has spent more than half of her short life in two foster homes.

They say the fight for their daughter has left them out of money. Andrew Hackney had to cancel his cell phone service because his wages from a grocery store job barely covered food, and they struggle to afford legal costs and the gas needed to get to appointments.

Robin Frank, Andrew Hackney’s lawyer, said the girl was diagnosed in February with a disorder affecting her sense of taste and has struggled to eat even in foster care.

These days, the Hackneys see their daughter only in two-hour visits twice a week before she is taken away again. Lauren Hackney breaks down at the thought that her daughter could be adopted and forget her family. They long to do what most parents take for granted: put their child to bed in her own bed.

“I want my kid back. I miss holding her. And of course, I miss her little giggly laugh,” Andrew Hackney said as his daughter raced toward him excitedly during a recent visit. “It’s painful. You don’t know.”

Burke reported from San Francisco. Pittsburgh-based Associated Press video journalist Jessie Wardarski and photojournalist Maye-E Wong contributed to this story.
