
The Engineer's Notebook

The Engineer's Notebook is a shared blog for entries that don't fit into a specific CR4 blog. Topics may range from grammar to physics and could be research or an individual's thoughts - like you'd jot down in a well-used notebook.


Close to Home: Algorithms Work to Uncover When Children are in Domestic Danger

Posted April 23, 2018 11:37 AM by IronWoman

In this digital age of humans and computers working side by side, it’s refreshing to hear of an application that is making moral advancements for the benefit of humanity. In Dan Hurley’s NY Times article Can an Algorithm Tell When Kids are in Danger?, readers get a first-hand look at real-life scenarios faced by the social services department in Allegheny County, Pa. Hurley explains, as discussed below, how the county incorporated an algorithm earlier this year that helps predict the likelihood of child abuse. We’ll also look at how this change in the system has affected the county’s success in identifying problem homes and saving lives.

Written January 2nd, 2018, Hurley’s article delves deep into the heartbreaking challenges faced by Allegheny County’s social services department and families throughout the county. According to Hurley, “Nationally, 42% of 4 million allegations (2015) with 7.2 million children were screened out…because of judgment calls, opinions, biases and beliefs.” The piece goes on to explain “and yet more U.S. children died in 2015 as a result of abuse and neglect…than died of cancer.” After discovering that more than half of the roughly 14,000 child abuse allegations the county receives each year were being screened out, Allegheny County incorporated a predictive-analytics algorithm to offer a second opinion on every case and call that comes in.

The way it works is this: once prompted by a call screener, the tool displays a risk score as a vertical color bar, from a green 1 at the bottom (low risk) to a red 20 at the top (high risk). To produce that score, the algorithm draws on four years of stored records from jails, psychiatric services, public-welfare benefits, drug and alcohol treatment centers and more, and takes into account each family’s drug-abuse history as well as their mental and developmental health. From there, the hotline operator can more accurately determine whether the children involved are in severe and urgent danger.
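
To make that flow concrete, here is a minimal, hypothetical Python sketch of how records might be collapsed into a 1-20 score and mapped onto the color bar. The field names, weights, and color thresholds below are invented for illustration only; the county’s actual model is far more sophisticated and is not reproduced here.

    # Hypothetical sketch only: fields, weights, and color thresholds
    # are invented; this is not the Allegheny Family Screening Tool.
    from dataclasses import dataclass

    @dataclass
    class FamilyRecord:
        jail_bookings: int          # county jail records
        psych_episodes: int         # psychiatric-services contacts
        welfare_years: float        # years on public-welfare benefits
        treatment_admissions: int   # drug/alcohol treatment admissions

    def screening_score(r: FamilyRecord) -> int:
        """Collapse a family's stored history into a 1-20 risk score."""
        raw = (2.0 * r.jail_bookings + 1.5 * r.psych_episodes
               + 0.5 * r.welfare_years + 1.0 * r.treatment_admissions)
        return max(1, min(20, 1 + round(raw)))

    def color_band(score: int) -> str:
        """Label the vertical color bar: green (low) through red (high)."""
        if score <= 7:
            return "GREEN"
        if score <= 14:
            return "YELLOW"
        return "RED"

    family = FamilyRecord(jail_bookings=1, psych_episodes=2,
                          welfare_years=3.0, treatment_admissions=1)
    s = screening_score(family)
    print(f"Risk score: {s} ({color_band(s)})")  # -> Risk score: 9 (YELLOW)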

This AI process has its roots in 2012, when social scientists Emily Putnam-Hornstein (University of Southern California) and Rhema Vaithianathan (professor at New Zealand’s Auckland University of Technology) began working together on such algorithms. In 2015, they turned their focus to the call-screening process. Using predictive analytics, the two researchers wanted to discover how the handling of maltreatment allegations could be improved, specifically in Allegheny County.

In the time that followed, Putnam-Hornstein and Vaithianathan linked several dozen data points from past family-services cases to predict how children would fare afterward. Their reasoning was that a computer running analytics can weigh far more data at once than a human screener’s brain. What they found was astonishing: 48% of the lowest-risk families were being screened into the system, while 27% of the households deemed highest-risk were being screened out. Hurley goes on to say that “of 18 calls to C.Y.F. (Office of Children, Youth & Families) between 2010 and 2014 in which a child was later killed or gravely injured as a result of parental maltreatment, 8 cases (44%) had been screened out as not worth investigation.”
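
For readers curious about the general technique, the short Python sketch below shows the idea: fit a model on historical case outcomes, then check how past screening decisions line up with the predicted risk ranking. All data here is synthetic and the model is a simple stand-in; it is not the researchers’ actual data or method.

    # Synthetic illustration of the general approach, not the real model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Several dozen data points per family, plus the eventual outcome
    # (1 = a later adverse event, 0 = none). All values are synthetic.
    X = rng.normal(size=(5000, 30))
    y = (X[:, :5].sum(axis=1) + rng.normal(size=5000) > 2).astype(int)

    risk = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]

    # Stand-in for the screening decisions actually made at the time.
    screened_in = rng.random(5000) < 0.45

    lowest = risk < np.quantile(risk, 0.10)    # lowest-risk decile
    highest = risk > np.quantile(risk, 0.90)   # highest-risk decile
    print(f"Lowest-risk families screened in:   {screened_in[lowest].mean():.0%}")
    print(f"Highest-risk families screened out: {(~screened_in[highest]).mean():.0%}")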

After the results came in, both researchers set out on a mission to bring predictive analytics permanently into Allegheny County. The Allegheny Family Screening Tool, unlike those before it, is owned strictly by the county. Its workings are public, its criteria are described in academic publications, and since its arrival it has been picked apart by local officials. Following an independent ethics review of the predictive-analytics program, however, “by adding objective risk measures into the screening process, the screening tool is seen by many officials in Allegheny County as a way to limit the effects of bias,” looking more uniformly and evenly at all of the variables in an abuse case.

A few months later, come December, Marc Cherna – director of Allegheny County’s Department of Human Services – and his team found that families were being treated more consistently based on risk scores. The percentage of low-risk cases recommended for investigation had dropped from half to one-third, while high-risk calls were now screened in a few percentage points more often. With this positive news, “having demonstrated in its first year of operation that more high-risk cases are now being flagged for investigation, Allegheny’s Family Screening Tool is drawing interest from child-protection agencies around the country.” In fact, Professor Brett Drake of the Brown School of Social Work at Washington University calls Putnam-Hornstein and Vaithianathan’s tool one of the most exciting child-protection innovations of the last twenty years.

In the year since the tool was established and put to use, Erin Dalton – deputy director of human services and head of the county’s data-analysis department – and Marc Cherna have raised the Allegheny Family Screening Tool’s accuracy from 78% to over 90%. Dalton admits that the biggest challenge has been getting employees on board: “getting them to trust that a score on a computer screen is telling them something real is in process.”

What do you as the reader think of this technology?

References

https://www.nytimes.com/2018/01/02/magazine/can-an-algorithm-tell-when-kids-are-in-danger.html

https://www.pbs.org/newshour/nation/can-big-data-save-these-children (picture)

https://www.aisp.upenn.edu/network-site/allegheny-county-pa/ (picture)

#1: Re: Close to Home: Algorithms Work to Uncover When Children are in Domestic Danger

Posted April 23, 2018 1:25 PM by Wannabeabettawelda (Guru)

One can certainly hope this improves the situation.

Will we ever have the political will to impose mandatory sterilization on those who have repeatedly proven themselves unworthy of raising children?


