PITTSBURGH (AP) — The Justice Department has been scrutinizing a controversial artificial intelligence tool used by a Pittsburgh-area child protective services agency amid concerns that it could result in discrimination against families with disabilities, the Associated Press has learned.
The interest follows an AP investigation that revealed potential bias and transparency problems with the opaque algorithm, which is designed to assess the risk level of families reported for child welfare concerns in Allegheny County.
Several civil rights complaints were filed in the fall about the Allegheny Family Screening Tool, which social workers use to help decide which families to investigate, the AP has learned.
Two sources said attorneys in the Justice Department's civil rights office cited the AP investigation when asking them to submit formal complaints detailing their concerns about how the algorithm could reinforce bias against people with disabilities, including families with mental health issues.
A third person told the AP that the same group of federal civil rights attorneys spoke with them in November as part of a broader conversation about how algorithmic tools could exacerbate disparities, including for people with disabilities.
All three sources spoke to the AP on condition of anonymity, saying the Justice Department had asked them not to discuss the confidential conversations; two said they also feared professional retaliation.
Justice Department spokesman Wyn Hornbuckle declined to comment.
Algorithms use troves of information to turn data points into predictions, whether for online shopping, identifying crime hot spots or hiring workers. Many child welfare agencies in the United States are considering adopting such tools as part of their work with children and families.
While there has been widespread debate about the ethical consequences of using artificial intelligence in child protective services, the Justice Department's interest in the pioneering Allegheny algorithm marks a significant shift toward possible legal implications.
Supporters see algorithms as a promising way to make a strained child protective services system both more thorough and more efficient, saying child welfare officials should use every tool at their disposal to keep children from being maltreated. But critics worry that relying on data points collected largely from people who are poor can automate discrimination against families based on race, income, disability or other external characteristics.
Robin Frank, a veteran family law attorney in Pittsburgh and a vocal critic of the Allegheny algorithm, said she filed a complaint with the Justice Department in October on behalf of a client with an intellectual disability who is fighting to get her daughter back from foster care. The AP obtained a copy of the complaint, which raised concerns about how the Allegheny Family Screening Tool assesses a family's risk.
“I think it’s important for people to be aware of what their rights are. When there seem to be valid questions about an algorithm and we don’t have a lot of information, some degree of oversight is important,” Frank said.
Allegheny County Department of Human Services spokesman Mark Bertolet said in an email that the agency had not heard from the Justice Department, and he declined an interview request.
“We are not aware of any concerns about the inclusion of these variables from research groups’ past evaluations or community feedback on the (Allegheny Family Screening Tool),” the county said, referring to previous studies and outreach regarding the tool.
Allegheny County said its algorithm has used data points tied to disabilities in children, parents and other members of local households because they can help predict the risk that a child will be removed from the home after a maltreatment report. The county added that it has updated the algorithm several times and has at times removed disability-related data points.
The Allegheny Family Screening Tool was specifically designed to predict the risk that a child will be placed in foster care in the two years after the family is screened. It draws on a trove of detailed personal data pulled from government datasets, including child welfare history as well as birth, Medicaid, substance abuse, mental health, jail and probation records. The algorithm calculates a risk score of 1 to 20; the higher the number, the greater the risk. The risk score alone does not determine what happens in a case.
The AP first laid bare racial bias and transparency concerns in an article last April, which focused on the Allegheny tool and how its statistical calculations help social workers decide which families should be investigated for neglect. Neglect is a separate category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not covered by the algorithm.
Child welfare investigations can result in vulnerable families receiving more support and services, but they can also lead to children being removed to foster care and, ultimately, the termination of parental rights.
The county says hotline workers determine what happens with a family's case and can override the tool's recommendations at any time. It also emphasizes that the tool is applied only at the first stage of a family's potential involvement with the child welfare process. A different social worker who later conducts the investigation, as well as the family and their attorneys, are not allowed to know the scores.
In use since 2016, Allegheny's algorithm has at times drawn on data related to Supplemental Security Income, a Social Security Administration program that provides monthly payments to adults and children with disabilities, as well as diagnoses for mental, behavioral and neurodevelopmental disorders, including schizophrenia and mood disorders, the AP found.
The county said that when disability data is included, it is “predictive of outcome” and that “it is not surprising that parents with disabilities may need additional support and services.” The county added that it has other risk-scoring programs that use data about mental health and other conditions that can affect a parent's ability to safely care for a child.
The AP obtained records showing hundreds of specific variables used to calculate risk scores for families reported to child protective services, including the public data that powers the Allegheny algorithm and similar tools deployed in child welfare systems elsewhere in the U.S.
The AP's analysis of Allegheny's algorithm and those it inspired in Los Angeles County, California; Douglas County, Colorado; and Oregon reveals a range of controversial data points that have measured people with low incomes and other disadvantaged demographics, at times measuring families on race, zip code, disability and their use of public welfare benefits.
Since the AP's investigation was published, Oregon has scrapped its algorithm over racial equity concerns, and the White House Office of Science and Technology Policy has emphasized, as part of the nation's first “AI Bill of Rights,” that parents and social workers need more transparency about how agencies are deploying the algorithms.
The Justice Department has shown broad interest in investigating algorithms in recent years, said Christy Lopez, a Georgetown University law professor who previously led some of the Justice Department’s civil rights division litigation and investigations.
In a keynote address about a year ago, Assistant Attorney General Kristen Clarke warned that AI technology “has a profound impact on the rights of people with disabilities,” and her division recently issued guidance to employers saying that using AI tools in hiring could violate the Americans with Disabilities Act.
“As civil rights investigators, they’re trying to figure out what’s going on,” Lopez said of the Justice Department’s scrutiny of Allegheny’s tool. “This appears to be a priority for the department: investigating the extent to which algorithms perpetuate discriminatory practices.”
Traci LaLiberte, a child welfare and disability expert at the University of Minnesota, said the Justice Department's inquiry stood out to her because federal authorities have largely deferred to local child welfare agencies.
“The Department of Justice is pretty far afield from child welfare,” LaLiberte said. “Concerns have to reach a fairly significant level for it to take the time to get involved.”
Emily Putnam-Hornstein and Rhema Vaithianathan, the developers of Allegheny's algorithm and other tools like it, deferred to Allegheny County's answers about the algorithm's inner workings. They said in an email that they were unaware of any Justice Department scrutiny related to the algorithm.
Researchers and community members have long raised concerns that some of the data powering child welfare algorithms may reinforce historical biases against marginalized people within child protective services. That includes parents with disabilities, a community that is a protected class under federal civil rights law.
The Americans with Disabilities Act prohibits discrimination based on disability, which can include a wide range of conditions, from diabetes, cancer and hearing loss to intellectual disabilities and mental and behavioral health diagnoses such as ADHD, depression and schizophrenia.
LaLiberte has published research detailing how parents with disabilities are disproportionately affected by the child welfare system. She challenged the idea of using disability-related data points in any algorithm because, she said, doing so assesses characteristics people cannot change rather than their behavior.
“If it’s not part of the behavior, including it in (the algorithm) biases it,” LaLiberte said.
___
Burke reported from San Francisco.
____
Follow Sally Ho and Garance Burke on Twitter at @_sallyho and @garanceburke. Contact AP’s Global Investigative Team at Investigative@ap.org or https://www.ap.org/tips/.