Measuring Outcomes Across Grantees and Over Time
March 31st, 2016
When the Jim Joseph Foundation’s Evaluators’ Consortium met last November, the overall focus was on the long road ahead toward developing a common set of measures — survey items, interview schedules, frameworks for documenting distinctive features of programs — to be used as indicators of Jewish learning and growth for teens and young adults. Consortium members and the foundation were especially excited to learn about the work led by George Washington University to develop a common set of long-term outcomes and shared metrics to improve the foundation’s ability to look at programs and outcomes across grantees and over time. A key part of this endeavor will be an online menu — developed in consultation with evaluation experts and practitioners — from which grantees can choose measures for their program outcomes.
Already, the GW team is making significant progress toward this end. As part of foundation efforts to inform and advance the field, we think the process and lessons related to these efforts are important to share.
To begin, the GW team reviewed the desired outcomes and evaluation reports from a dozen past foundation grants representing a variety of programs. Six grants address the foundation’s strategic priority of providing immersive and ongoing Jewish experiences for teens and young adults. Six others address the strategic priority of educating Jewish educators and leaders.
For this latter strategic priority, the GW team offers a welcome “outsider” perspective, bringing strong expertise on outcomes in secular education and teacher training to the development of common outcomes for the foundation’s Jewish educator grants. How, for example, do other programs measure quality and teacher retention? Both are desired outcomes for the foundation’s grants. Yet if these outcomes are not measured with common metrics, the foundation will never be able to properly determine whether its grantmaking in this area is successful. GW’s expertise and strong relationship with the foundation are beginning to provide important answers to these challenges.
To be clear, the effort to evaluate the impact of the foundation’s grantmaking in this area is a work in progress, but the unique and collaborative relationships engendered by our Evaluators’ Consortium make it possible. In fact, members of the consortium have volunteered to serve as advisors, working with GW on the project to develop common outcomes for Jewish educator grants while providing valuable insights of their own based on their work, together and individually, with foundation grantees. It’s worth noting that this work intersects in several ways — with current field-building grants such as the Jewish Survey Question Bank; with CASJE, which aims to bring the rigor and standards of general education applied research to Jewish education; with the Jewish Teen Education and Engagement Funder Collaborative evaluation; and with the ongoing evaluation work that grantees and evaluation consultants engage in on a regular basis.

We look forward to sharing the framework of our long-term outcomes and to using these new measurement tools. We then will begin to test whether these tools really do help grantees measure progress against their goals and improve; help evaluators document that progress and report useful and valuable lessons learned; and help the foundation gather information on long-term outcomes across several grants. Along with these specific tools and outcomes, we are confident that related learnings about our field-building efforts, work with teens, and ongoing evaluation will be of use to the field and will contribute to even more effective Jewish education.
Stacie Cherner is a senior program officer at the Jim Joseph Foundation.