Panel: Dr. Kristine Piescher, Director of Research and Evaluation, CASCW; Jamie Sorenson, Director of Child Safety and Permanency Division, Department of Human Services; and Ron Kresha, Minnesota House of Representatives
Moderator: Dr. Traci LaLiberte, Executive Director, CASCW

The College of Education and Human Development (CEHD) hosts a policy breakfast each fall and spring on a timely topic. This fall's topic was the use of data for decision-making in Minnesota's child protection system.

Jean Quam, Dean of CEHD, introduced the moderator, Traci LaLiberte, who reflected on the increase in media coverage of child protection in Minnesota since 2014, when a young child named Eric Dean died. As the media began to pay closer attention to the child welfare system, a series of events occurred, beginning with the Governor's Task Force on Child Protection, which developed 93 recommendations. Four separate work groups began studying the recommendations and developing practice changes in the form of new guidelines for counties, legislative action, and planning for reform of the Minnesota Child Welfare Training System.

KSTP recently featured a news report on the need for increased access to and use of data in child protection, including a discussion of caseload sizes and the connection to data that could make decision-making easier for workers. This media story set the tone for the panel discussion, which was framed around a greater emphasis on incorporating data into the decision-making process for Minnesota's child protection system.

Jamie Sorenson began the discussion by explaining that the data dashboard currently used in child protection systems across the state is aligned with required federal data measures. The public can access this information, which has been converted to Tableau format for better analysis and a more user-friendly dashboard display. The Data Unit in the DHS Child Safety and Permanency Division looks at how quickly workers see a child, how long children stay in foster care, and whether they achieve permanency.

Minnesota's SACWIS system is called the Social Service Information System (SSIS), and it has been in place for a long time. Caseworkers enter data into the system, and DHS extracts it to determine compliance with federal performance measures. The data is not accessible in real time; instead, it requires uploads at intervals. We want an accessible, robust, real-time data system for looking at reports and ongoing case management. There is a general mistrust of the data due to issues with accuracy and completeness. It is largely case-level data, because that is what the system was developed for.

How does tribal data interface with the system? Some tribal agencies, such as the Prairie Island Mdewakanton and Shakopee Mdewakanton Sioux, have their own data systems for child protection cases. The White Earth Nation and Leech Lake Band of Ojibwe enter data into SSIS, and Red Lake Nation inputs some data into SSIS. As a result, some tribal data is included in the DHS annual child welfare report, but not all. To better understand what is happening for American Indian children in the system, we want to continue to receive data and to increase the amount of data shared by tribes that administer their own child protection agencies.

SSIS wasn't intended to be used for research purposes, but the data is used that way now. The data was collected for practice-level decision-making rather than for research. When we tap the data for research and evaluation, we need to work with the measures workers actually record rather than develop separate research measures. In doing this, we can effectively analyze the information and draw conclusions about trends and performance improvement strategies.

Case-level data doesn't contain the deeper measures of child well-being needed for refined analysis or predictive analytics, or for a broader perspective on what's happening in families. Additionally, data systems exist in silos. SSIS and DHS's data systems cover many service areas, including child care assistance, child protection, and children's mental health, but linkages between these systems are challenging.

Data about children and families is housed at DHS, the Minnesota Department of Education, and the Minnesota Department of Health, and it can be hard to compile and extract data across these systems. Minn-LiNK links data from all three agencies to better understand child well-being and the effectiveness of Minnesota programs and policies across systems.

Do we have the right data and are we using the data in a way that’s helpful?

Rep. Kresha responded that he believes we have the right data, but that it's tucked away in so many places that it's hard to access. "We would like to see more collaboration to share data across systems, but we need to address people's fears about data privacy. DHS and MDE don't currently share information easily, but maybe we haven't given them the right tools," Kresha said. Children who are served by the DHS system go to school; MDE can benefit from information collected by DHS to provide adequate school support. Kresha continued, "We need to use and share data that's beneficial to kids and not be afraid that we'll label them."

Jamie Sorenson added that he would like to better understand the problem statement around the use of data across systems and work toward identifying solutions.

Sorenson continued that key players should be able to share information using a shared database, but the matrix of federal legislation such as CAPTA contains regulations that we can't upset without jeopardizing our funding stream. Additionally, we have a matrix of laws around confidentiality, as well as ethical considerations and informed consent. Shared information may also affect workers' willingness to input information into the system.

Dr. Piescher explained that there is a need for interoperability between systems: educators and child welfare professionals need to share some information in order to get children the services they need. Minnesota has been a leader in research with cross-systems data. The Center for Advanced Studies in Child Welfare has had an agreement since 2003 to access data about children and families in several systems, in order to answer questions that can't be answered within any one system, such as questions about the achievement gap for children in child protection. Security and confidentiality are the top priority. Questions remain: once we have systems set up to share data, who will be allowed access to that information?

Rep. Kresha stated that he would never advocate for a teacher to have access to all data, but he would support sharing the pieces that would help them understand a child's needs. Educators say, for example, that they could pay more attention to children's special needs if they had more information about families' child protection histories.

Legislatively, data is often interpreted inaccurately, by both advocates and legislators. It is imperative that we have a group of legislators who understand how to properly interpret data and make informed decisions. Too often, Rep. Kresha admitted, legislators use data in the wrong way to push their bills and make stronger points.

Dr. LaLiberte explained that from a research standpoint, data can be used to prove a point in many ways. We can ask a question and then see what the data tells us, but we cannot go into a study intending to show a particular desired outcome. It is a misuse of data to seek out only the data that proves your point. The onus is on researchers to interpret findings properly, and on practitioners to be honest about findings when they share information about data.

Sharing data can make people nervous, but what matters is how we use and interpret it. There is a growing movement to push the use of predictive analytics in child protection. New Jersey and California use predictive analytics to guide policy and practice in child welfare. New Jersey also uses it to anticipate which families will be in the greatest crisis and face the greatest safety threats. They don't want to use this to profile families. However, they can identify trends, such as families who have been assigned more than ten caseworkers combined with other risk factors and safety threats, and then assess those cases monthly and provide a different type of intervention.

Is this a direction we should go in Minnesota?

Jamie Sorenson stated that in terms of risk measures, some Minnesota counties use a Structured Decision Making (SDM) tool to identify the risk of recurring maltreatment. We could apply that tool more widely and consistently in Minnesota. When we see fatalities and near-fatalities in children, sometimes that is the first time the child has been identified by the system, but other times the child had been referred to the system previously. We can also do more than is being done currently with a "social autopsy" to determine the risk factors that may help identify the potential for serious future harm. However, predictive analytics does not replace highly trained and skilled social workers and supervisors, who still need to use their professional judgment to determine how to proceed in each individual case. When we identify a child at risk, what do we need to do to make that child safe?

Dr. Piescher agreed and emphasized that everyone assesses risk, whether they use a standardized tool or rely on previous experience. Predictive analytics is important and becoming more popular. However, when is it appropriate, and for what purposes? To determine how to allocate resources? The level of intervention?

Rep. Kresha said that when we look at data, we are looking at what contributes to a given situation. It is important that we have enough faith in the data to use it to change policy. We know all these factors, and we can move big levers at the policy level. Whatever decision we make, we hope to see a reduction in the number of cases five years down the line because we improved the system response. But if we ever take decision-making away from social workers, then what we've done is wrong.

"If a child is behind in school," said Dr. Piescher, "it's not because a social worker came to the door, but rather because of events that have occurred over a number of years." We can't find that in the data. We need the right indicators, ones we think are meaningful. The same risk factors are present in every system, and we need to get underneath them and see what's driving them beyond just poverty. Are there more nuanced pieces of that, and do we need to direct resources in a different way?

Dr. LaLiberte said that she has often spoken with reporters about a complicated issue for 45 minutes, only to find that in the end they just want a nugget of information. Reporters are looking for sound bites, boiled down to the simplest terms so people can understand them. But the nuance, context, and complexity of decision-making are not simple, and talking about them in simplistic ways can derail us.