The Institute for Child Success, a strong and committed partner of our Help Me Grow South Carolina affiliate, recently convened its first statewide meeting — the South Carolina Early Childhood Research Symposium.
We’ve been impressed with the vision and capacity of ICS since our introduction in 2008, and the symposium’s bold focus on the challenges of measuring impact and of making research findings accessible to the public and policy makers reflects its commitment to addressing the most critical and complex of issues.
Our experiences in the HMG National Center validate the wisdom of ICS’ focus on the importance of measuring impact. For example, the ability of Karen Powell and our Louisiana affiliate to draw upon Social Innovation Fund dollars from the Corporation for National and Community Service was predicated on our ability to demonstrate that HMG is evidence-based and has a measurable impact on children’s healthy development. Fortunately, findings from Marcia Hughes’ Protective Factors Framework study offered sufficient evidence of impact to satisfy the funder.
Even individual donors are increasingly mindful of the importance of measuring impact. While recently soliciting support from a potential donor, a retired corporate leader, for our new Office for Community Child Health (OCCH), we found that questions about our plans for measuring impact far exceeded queries about the structure or function of OCCH or the community-oriented programs under its aegis. Our ability to speak to performance metrics in a meaningful way led to a generous donation.
ICS is also at the forefront in exploring strategies to make research accessible to those influencing policy. While research data enable interventions to achieve the status of “evidence-based,” such data are rarely sufficient to change opinions and practices. The cognitive scientist Roger Schank helps us understand why this is so, explaining that “[h]umans are not ideally set up to understand logic; they are ideally set up to understand stories.”
Considering this insight, as I set the stage for the Symposium’s research presentations, I opted to share stories on measurement from our HMG experience, including:
• The need to gather data to “silence the early skeptics” who feared that our success in the early detection of at-risk children would lead only to frustration and disappointment when families couldn’t find accessible programs and services to meet their children’s developmental needs. While we did not believe this was likely to be a problem, we committed to gathering data on gaps and capacity issues to inform our advocacy in system building, thereby quieting the skeptics.
• The importance of carefully defining success to avoid “the glass half-empty” or “eye of the beholder” phenomenon. In sharing the early HMG experience with Karen Davis, president of The Commonwealth Fund, we proudly referenced our success in linking one-half of those Hartford families accessing HMG to community-based programs and services. Ms. Davis was unimpressed, suggesting that the lost opportunity with the remaining 50% of referrals could be viewed as a failure. In our minds, the need to overcome the many barriers to referral with a disadvantaged, underserved population justified our positive assessment. Fortunately, Davis ultimately agreed and provided funding to support replication in other states. We learned the importance of proactively defining success.
• The need to choose the most effective measures of children’s developmental outcomes. We originally proposed to use the Early Development Instrument (EDI) — a population-based measure of children’s developmental status — to demonstrate the impact of HMG on children’s healthy development. Our plan to correlate EDI findings with the rate of HMG penetration in a population was widely and consistently criticized as foolhardy and, given the complexity of factors contributing to children’s developmental outcomes, likely to underestimate the benefits of HMG. In fact, “mission impossible” was the constant feedback we received for our plan! Instead, we were encouraged to consider the impact of HMG on proxy measures of children’s developmental outcomes. As a result, we embraced the Protective Factors Framework of Strengthening Families to demonstrate HMG efficacy and gain widespread acceptance of HMG as an evidence-based intervention.
• The recognition that the rules for measuring impact are often driven by external considerations. Finally, I spoke about the influence of health care reform’s relentless pursuit of immediate results, the so-called “scorable savings.” The focus on immediate return-on-investment undermines the brilliant and compelling analysis of economist and Nobel laureate James Heckman on the long-term return-on-investment of early childhood programs. Faced with the reality of needing to demonstrate short-term, more immediate cost savings, we developed the notion of “de-medicalization” to show the real-time cost savings of HMG achieved by directing referrals to community-based programs and services rather than medical consultations.
Congratulations to ICS and our SC colleagues on their willingness to tackle the tough issues. Our improved understanding of how to measure impact and make data accessible to policy leaders is crucial to our common cause!