Vicky Arias, FISM News
An artificial intelligence program used in child welfare cases in the Pittsburgh area is being investigated by the Department of Justice for potentially targeting families with disabilities, the Associated Press recently reported.
The program, known as the Allegheny Family Screening Tool (AFST), uses an algorithm to predict the chances of a child being placed in foster care within two years of a family being investigated for child neglect.
The algorithm acts as a predictor, using data points to produce a score. The higher the score, the greater the predicted likelihood that a child will be removed from the home and placed in foster care.
Opponents of the tool say it lacks transparency: “families and their attorneys can never be sure of the algorithm’s role … because they aren’t allowed to know the scores,” the AP report found.
Data points used in the tool to determine the status of a child’s welfare include “personal data and birth, Medicaid, substance abuse, mental health, jail, and probation records, among other government data sets,” according to Cheddar News. Additionally, an analysis from the AP found that the AFST, and tools similar to it, look at income and, sometimes, “race, zip code, disabilities and [a family’s] use of public welfare benefits.”
Critics of the AFST say that many parents with disabilities rely on public welfare for help; using data points such as welfare participation therefore builds a discriminatory bias against these parents into the tool.
According to the National Council on Disability, parents with disabilities face difficulties, and at times discrimination, in retaining custody of their children.
“[Disabled] parents are the only distinct community of Americans who must struggle to retain custody of their children,” the council states. “Removal rates where parents have a psychiatric disability have been found to be as high as 70 percent to 80 percent; where the parent has an intellectual disability, 40 percent to 80 percent. In families where the parental disability is physical, 13 percent have reported discriminatory treatment in custody cases.”
Advocates of the tool say that it eases the burden of high caseloads on child welfare workers and helps protect neglected children.
Erin Dalton, director of the Allegheny County, Pa., Department of Human Services, explained that caseworkers are overwhelmed, making it difficult to process thousands of child neglect cases.
“Workers, whoever they are, shouldn’t be asked to make, in a given year [up to] 16,000 of these kinds of decisions with incredibly imperfect information,” Dalton said.
Opponents say the program may lead workers to rely too heavily on a computer program when making life-altering decisions for families.
An attorney from Pittsburgh, Robin Frank, is representing a parent with an intellectual disability who is trying to get their daughter back after she was placed in foster care.
“I think it’s important for people to be aware of what their rights are and to the extent that we don’t have a lot of information when there seemingly are valid questions about the algorithm, it’s important to have some oversight,” Frank said.
The screening tool is ostensibly designed to aid child safety caseworkers in deciding whether to pursue a child neglect case; the final decision to pursue a case rests with the caseworker.
Neglect encompasses anything from “inadequate housing to poor hygiene” but doesn’t include “physical or sexual abuse.” Abuse cases are handled by a different department and are not subject to the algorithm.