Child Protective Services (CPS) is using a “Predictive Analytics” computer program to help determine whether parents should be allowed to keep their children.
The State Department’s 2019 human trafficking report confirms that trafficking is inextricably linked to the foster care system, into which CPS places the children it removes from their parents.
A report by Richard Wexler for the National Coalition for Child Protection Reform found in no uncertain terms that predictive analytics has a racist track record.
“ProPublica reports that predictive analytics already has gone terribly wrong in criminal justice, falsely flagging Black defendants as future criminals and underestimating risk if the defendant is white. A new analysis of ProPublica’s data confirmed their findings.
In child welfare, a New Zealand experiment in predictive analytics touted as a great success wrongly predicted child abuse more than half the time.
In Los Angeles County, another experiment was hailed as a huge success in spite of a “false positive” rate of more than 95 percent. And that experiment was conducted by the private, for-profit software company that wanted to sell its wares to the county.
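To make concrete what a “false positive” rate above 95 percent means in plain terms, here is a minimal sketch with invented numbers; the actual Los Angeles County case counts are not given in the report, and the article uses “false positive rate” loosely to mean the share of flagged cases that proved wrong:

```python
# Hypothetical illustration -- these numbers are invented, not from the
# L.A. County experiment. "False positive" here means a family flagged
# as high-risk where no abuse or neglect was actually found.

def false_positive_share(flagged_total, flagged_correct):
    """Fraction of flagged cases that turned out to be wrong."""
    return (flagged_total - flagged_correct) / flagged_total

# If a tool flags 1,000 families and only 40 flags prove accurate,
# then 960 of the 1,000 flags are wrong:
rate = false_positive_share(1000, 40)
print(f"{rate:.0%} of flags were false positives")  # prints "96% of flags were false positives"
```

In other words, even a tool hailed as a success can be wrong about the overwhelming majority of the families it flags.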
The same company is developing a new approach in Florida. This one targets poor people. It compares birth records to three other databases: child welfare system involvement, public assistance and “mothers who had been involved in the state’s home visiting program.”
So listen up, “at-risk” new mothers: In the world of predictive analytics, the fact that you reached out for help when you needed it and accepted assistance on how to be a better parent isn’t a sign of strength – it’s a reason to consider you suspect, and to make it more likely that your children will be taken away.”
— End of National Coalition for Child Protection Reform passage.
A father from Ohio named Aaron, appearing on The Campaign Show with Patrick Howley on Patriots Soapbox, described how predictive analytics were used against him as he fought to regain custody of his daughter, whom CPS took away and placed in foster care for three years.
The New York Times Magazine recently ran an appropriately Orwellian headline: “Can An Algorithm Tell When Kids Are In Danger?”