The critical nature of an evaluation framework is first communicated through the Foundation’s instructions for major grant proposals to articulate outcomes and evidence of success—in other words, “What changes do you hope will occur, and how will you know?” Grantees are encouraged to identify both output-related outcomes (e.g., number of participants) and longer-term outcomes (e.g., changes in knowledge, attitudes, or behaviors).
The Foundation typically supports contracts with external evaluators for major, multi-year grant investments. These evaluators use select research methodologies to document, describe, measure, interpret, and analyze how well a particular program or intervention succeeds, and to identify areas for improvement. The first key activity of most evaluations is to develop a logic model based on the proposed outcomes and evidence of success, though this model may be refined once all stakeholders have weighed in—and bought in, which is important for a program’s success.
Another key activity in designing an evaluation framework is to develop evaluation questions: what does the grantee want to learn over the course of the grant period, based on the logic model? An individual evaluation is developed and implemented through a highly interactive process in which personnel from the evaluation team, the grantee-partner, and the Jim Joseph Foundation may work together. The Foundation uses this process to build mutual understanding and trust with the grantee-partner, so that both parties learn together about a particular grant and recognize what will help the grantee achieve positive outcomes.
In addition, the Foundation invests in cross-portfolio evaluation, which involves several grantees who are working toward similar outcomes, with similar populations, or in similar settings. With cross-portfolio evaluation, the goal is less to build the capacity of individual grantees and more to build the field and to answer questions that the Foundation asks about its investments and strategies. The Foundation is developing common measurements across grants that have related goals, objectives, and desired outcomes, and is conducting evaluations of cohorts of grants that have shared purposes and similar grant outcomes.
Finally, any mature and professional field needs a base of knowledge from which to improve and move forward, to understand what works and why. Applied research is a key investment priority for the Foundation with the goal of improving the quality of knowledge that can be used to guide the work of Jewish education and Jewish learning.
To dive deeper into our Grant Evaluations, go to the Learning & Resources section.
The Jim Joseph Foundation is pleased to be a part of the Funder & Evaluator Affinity Network (FEAN). Beginning in 2019, FEAN members assembled around practice areas chosen collaboratively as those most urgently in need of change: Strategy & Practice, Evaluators of Color, Knowledge Sharing, Global Challenges, and Collaboration & Partnership. Over the course of 2019 and 2020, five membership teams developed products that propose tangible steps to make change in the field.