The Evidence Generation initiative focused on the application of evidence in the pursuit of more effective youth justice systems, but the initiative was not simply an effort to expand the use of existing evidence-based programs. We helped agencies and programs develop their own evidence. We worked on growing new evidence in support of innovative models, whether the models were designed to affect client behavior, to improve systems, or to influence law, policy, and practice.
Policymakers, justice professionals, and their community partners need high-quality information about the impact of justice programs and policies. In setting priorities, program models that have been proven effective and efficient should be preferred over programs that have never been tested by rigorous evaluation. Such preferences, however, should not become sacrosanct, and policymakers should not be encouraged to think of “the” evidence base as a known, finite, and fixed source of knowledge. The search for evidence is an ongoing process. We should never assume that all possible evidence-based practices have already been discovered and tested.
We see evaluation as part of a development journey that learning organizations must follow. The generation of evidence is an orientation to practice, part of one’s daily routine. It is not a professional service that youth-serving organizations may simply delegate to outside contractors or consultants, and it is not something that can be put off until the last minute. Evaluation evidence should be a key part of program design from the very beginning.
Youth justice practitioners must embrace the quest for better evidence, and they need to appreciate the limitations of existing evaluation research. There will never be enough funding or enough competent researchers to test every single aspect of the youth justice system or to assess the value of every single practice, program, or policy.
Research funding is scarce and competitive. Only a few models have been fortunate enough to receive the sustained investments required to generate high-quality evaluation evidence. More importantly, research investments are not impartial and unbiased. Investments in research are shaped by budgetary limitations, administrative obstacles, evaluation complexities, political values, organizational self-interest, and simple marketing and promotion. In any field, the practices established by rigorous research may be only a small portion of the total operational environment.
The published findings of past evaluations are not a sufficient basis for making all of the choices involved in building and operating the justice system of the future. Researchers and practitioners must work together in the search for better evidence, and the most urgent need is for evidence about the sort of innovative programs and practices that are likely to be overlooked in the competitive struggle for grants and contracts.
If an agency is devoted to cognitive behavioral therapy or youth mentoring, there are plenty of evaluations to review and ample evidence that could be used to design a new program. If, on the other hand, a program is supposed to provide educational enhancements, employment readiness, or creative activities, there is much less evidence, not because these are bad ideas, but because they either haven’t been fully developed or have yet to be tested by competent evaluators.
Moreover, some critical elements in the youth justice system cannot be evaluated with recidivism outcomes. A youth services reform might be designed simply to lower costs. A restorative justice program might need to increase the speed of victim compensation. A legal aid organization might want to measure its effectiveness in representing youth in court. There is no compendium of evidence-based practices for such issues, but researchers should be prepared to work with youth justice agencies on these areas as well as those related to recidivism.
This was the mission of the Evidence Generation initiative — to help innovative organizations in the youth justice sector to develop stronger capacities for data collection and measurement in order to prepare themselves for future evaluation opportunities.
Our philosophy was consistent with the views expressed in this video from the Ontario Centre of Excellence for Child and Youth Mental Health in Canada.
Note: The approach adopted by Evidence Generation was inspired in part by an essay written in 2009 by the Urban Institute’s Akiva Liberman. See “Advocating Evidence-Generating Policies: A Role for the ASC,” The Criminologist: The Official Newsletter of the American Society of Criminology, 34(1): 1, 3–5. The short essay takes a critical view of conventional evidence-based thinking. It reminds readers to heed the advice of Donald Campbell, who viewed all social policies as experimental and all expected outcomes as hypotheses. Justice agencies should pursue the continued generation of evidence and not simply follow the existing knowledge base as if it were fixed and finite.