Why should a corrections agency care about CBA? An interview with Minnesota DOC’s Grant Duwe
Grant Duwe is the research director for the Minnesota Department of Corrections (DOC). We asked him about the role of research, evaluation, and cost-benefit analysis (CBA) in a corrections agency.
1. Please tell us about your work and why CBA is of interest to your agency. How has CBA informed the DOC’s planning and decision making?
Although CBAs have been relatively rare within corrections, the field has generally embraced the idea that decisions should be based on empirical evidence. Over the past 40 years, we’ve seen the growth of the “what works” literature and, more recently, a greater emphasis on evidence-based practices. While the determination of whether an intervention works has been confined mainly to its impact on recidivism, CBAs can give us additional information that’s critical to figuring out how effective an intervention is.
Despite the long-term growth in correctional spending at both the state and federal levels, many correctional agencies remain underfunded in the wake of the massive increase in the size of the prison population since the 1970s. By telling us which programs, policies, and practices are cost-effective, CBAs can help create more efficient correctional systems through improved planning, operations, and resource-allocation decisions. In doing so, CBAs can play a pivotal role in making sure correctional agencies are responsible stewards of the resources taxpayers provide.
Because much of our CBA work is relatively recent, I anticipate greater statewide use of the CBA results in the future, particularly when it comes to making programming decisions. For example, there have been a few recent instances in which the recidivism findings from program evaluations we’ve conducted have had an impact on legislative funding decisions.
In 2009 and 2010, we completed evaluations showing that prison-based sex offender and chemical dependency treatment both significantly reduce recidivism. When Minnesota legislators were presented with these data during the 2013 legislative session, they voted to fund increased treatment capacity in the future.
In 2011, an evaluation we completed on EMPLOY, a prisoner reentry employment program run by the Minnesota DOC, showed that the program significantly improved employment and recidivism outcomes. Due to the promising results from this evaluation, efforts are under way to fund a similar program and increase the availability of employment assistance for Minnesota prisoners.
2. This summer you published a paper, What Works with Minnesota Prisoners: A Summary of the Effects of Correctional Programming on Recidivism, Employment, and Cost Avoidance. What struck you most about the findings, and what are the policy implications?
The paper was influenced by the work done by the Washington State Institute for Public Policy (WSIPP). For a number of years, WSIPP has attempted to identify cost-effective criminal justice interventions by developing a cost-benefit model, gathering data specific to Washington State, and conducting meta-analyses of program evaluations across a variety of areas. This report is similar to what WSIPP has done, except that all of the results are based on evaluations of 13 Minnesota correctional programs we’ve completed over the past five years. The paper summarizes the findings on recidivism, employment, and cost avoidance from these studies.
When we compared the evaluation results across programs, several interesting things emerged. First, while a correctional program’s impact on recidivism is important in determining its cost-effectiveness, the program’s total volume—in terms of the number of participants—is nearly as important. The 10 effective programs identified in this report reduce costs by an estimated $36 million each year. Nearly two-thirds of that amount came from prison-based chemical dependency treatment, which had the highest offender enrollment of all the programs. Prison-based educational programming had a modest impact on recidivism, but it produced the second-highest annual cost reduction due to 1) the relatively high number of offenders earning degrees in prison; and 2) the lower costs associated with providing this programming.
Second, programs with relatively limited enrollment tend to be more costly to operate. This is not to say that smaller programs can’t be cost-effective. But to reduce costs, small-scale programs need to have a large impact on recidivism and/or be effective in decreasing violent reoffending. Although violent crime is, fortunately, less common than, say, drug or property offenses, it’s also a lot more costly to society. Interventions that reduce violent recidivism, particularly homicides, are much more likely to yield impressive returns on investment.
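The interplay between program volume, effect size, and cost that Duwe describes can be sketched as a back-of-the-envelope calculation. The function and all of the numbers below are hypothetical illustrations, not the Minnesota DOC’s actual cost-benefit model or data:

```python
# Hypothetical sketch: how enrollment volume and effect size interact
# in an annual cost-avoidance estimate. All figures are illustrative.

def annual_cost_avoidance(participants, recidivism_reduction,
                          cost_per_recidivism_event, annual_program_cost):
    """Net annual savings: avoided recidivism costs minus program cost."""
    avoided_events = participants * recidivism_reduction
    return avoided_events * cost_per_recidivism_event - annual_program_cost

# A high-volume program with a modest effect...
large = annual_cost_avoidance(participants=2000,
                              recidivism_reduction=0.05,
                              cost_per_recidivism_event=30_000,
                              annual_program_cost=1_500_000)

# ...can out-save a small program with a much stronger effect.
small = annual_cost_avoidance(participants=150,
                              recidivism_reduction=0.20,
                              cost_per_recidivism_event=30_000,
                              annual_program_cost=400_000)

print(large)  # 1500000.0
print(small)  # 500000.0
```

Note also why violent reoffending matters so much in this arithmetic: raising `cost_per_recidivism_event` (violent offenses, especially homicides, are far costlier to society) lets even a low-enrollment program clear its operating costs.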
Finally, because programs that provide offenders with pro-social sources of support often rely on community volunteers, they tend to have lower enrollment. Nevertheless, these programs delivered some of the strongest recidivism findings presented in the report. To be sure, recruiting suitable volunteers from the community to work with offenders is often a challenge. But if the size of these programs could be expanded, the evidence suggests that they might be especially cost-effective.
3. The Minnesota DOC has done other CBAs. How do you decide which topics get studied? What else do you plan to study, in terms of costs and benefits?
Every research project we undertake begins with a request or recommendation to conduct an evaluation. The sources of the requests have varied, as they have come from DOC staff (such as program directors, executive-level staff, or even members of the DOC’s research and evaluation unit) as well as those outside the agency, such as state legislators. Some evaluations we’ve conducted have been a requirement of grant funding. Regardless of the origins of the request, DOC research staff must prepare a proposal, which is then subject to a fairly rigorous internal review process, culminating with a final review by Commissioner Tom Roy and his executive-level staff.
Now that we’ve completed CBAs on the 13 programs we’ve evaluated so far, we plan to make CBA a standard part of our evaluation process. Several of our current projects include evaluations of release planning for offenders with serious and persistent mental illness, a prisoner reentry program funded through the Second Chance Act, and prison visitation by community volunteers. We will assess not only whether these interventions have an impact on recidivism but also how cost-effective they are.
4. A recent report by Pew’s Results First Initiative listed Minnesota as one of the 10 states leading the way in using CBA to drive policy decisions. What thoughts or advice do you have for people who want to do more with CBA? What are some helpful resources, and how can they avoid potential pitfalls?
Of the CBAs that have been done within corrections, quite a few have been supplementary, ad hoc analyses that were prepared as part of a larger program-evaluation report for a specific agency. By contrast, relatively few of the CBAs have been published in peer-reviewed academic journals. Many of the existing CBAs have been disseminated on a relatively limited basis and have been largely insulated from critical examination.
When it comes to improving our knowledge and practice regarding CBAs, I think one of the best things we could do would be to focus more on publishing in peer-reviewed academic journals, for several reasons. First, compared to an agency-specific report, a peer-reviewed publication would likely reach a larger audience. Wider dissemination of CBAs would lead to a bigger shared knowledge base, which would be conducive to making advances in the use of CBAs within corrections.
Second, critical feedback provided during the “blind” peer-review process would ultimately improve the quality of the CBAs conducted. Of the more than 20 research projects we’ve completed at the Minnesota DOC over the past five years, most have been published in peer-reviewed academic journals. Some of the best learning experiences I’ve recently had as a researcher have come from the constructive comments delivered during the blind peer-review process.
Last, publishing in a peer-reviewed journal will increase the odds that the CBA is a credible, objective piece of research. One of the concerns with evaluations produced by in-house research staff—or even outside researchers selected and/or paid by the agency to conduct the evaluation—is that they’re not impartial (enough). Although the peer-review process is not a panacea, it minimizes the likelihood that an evaluation or, more specifically, a CBA, is largely an effort to burnish the agency’s image. If our recent experience at the Minnesota DOC is any guide, taking the extra step to publish research in peer-reviewed academic journals is well worth the effort. Not only does it increase confidence in the research product among key stakeholders, but it also helps build the trust that’s needed to influence policy and practice.