Learning From and About Evaluation
August 16th, 2012
“The utility of evaluation must stretch beyond accountability and translate into learning and active exchange across the field of Jewish education and beyond.”
As my colleague Adene Sacks reflected upon leaving the Jim Joseph Foundation, the Foundation approaches evaluation in the spirit of being a learning organization. Evaluation was identified as a priority in the Foundation’s first strategic plan in 2006. The Jim Joseph Foundation’s program of evaluation has already yielded usable knowledge to benefit the Foundation’s philanthropy, even as the Foundation seeks to improve the process of conducting evaluations and expand its evaluation portfolio.
Evaluation results instruct and support the Foundation in several ways.
With more than $7 million allocated for independent evaluations of major grants, the Foundation is currently reviewing its approach to evaluation, examining key questions such as: What is the basis for supporting an independent evaluation? How do we ensure that evaluations add value both to the grant in question and to other potential grants? What are the criteria for selecting evaluation consultants? How can we best share what we learn?
The Foundation has developed a core network of evaluation specialists who not only support formative learning that strengthens and improves the work of the grantee, but also provide evaluation data that informs the field of Jewish education. On two occasions when the Foundation convened evaluators, the sessions enriched the evaluations being conducted and helped to develop common evaluation questions across grants that have similar goals, objectives, and desired outcomes. For example, by convening evaluation consultants Shari Golan of SRI International – evaluator of Birthright Israel NEXT – and Steven Cohen of Research Success Technologies – evaluator of Hillel’s Senior Jewish Educator/Campus Entrepreneurs Initiative – the grantees, the Foundation, and others in the field gained a better understanding of, and alignment among, survey questions regarding Jewish growth among the young adult participants. Specifically, the typology for measuring Jewish growth, developed by Research Success Technologies, can be applied to other young adult settings, such as Moishe House and Kevah, to yield cross-grant learning.
In this regard, recognizing the interdependence among the evaluator, foundation professional(s), and grantee – while also understanding the necessary autonomy of each party – is a critical component of the learning taking place. Each of these stakeholders contributes to the initial articulation of a “theory of change” and the creation of a “logic model,” tools that identify agreed-upon strategies and outcomes. This process and its resulting tools establish program benchmarks, guide the implementation of the initiative, and inform the subsequent evaluation. This final stage of independent evaluation has resulted in mid-course changes to grants and in learning shared with a broader audience across the field.
As an example, with a goal of increasing the number of campers in the western United States attending residential Jewish camps, the Jim Joseph Foundation awarded a grant to the Foundation for Jewish Camp’s JWest Campership Initiative to offer campers financial incentives. Initially, the incentives were restricted to campers attending three-week camp sessions. After the program failed to achieve its target enrollment in the first year, an evaluation by Summation Research determined that more campers would attend if incentives were offered for two-week camp sessions. Thus, during the second summer of JWest, eligibility for incentives was expanded to include two-week sessions, and JWest exceeded its enrollment target. Now, as the JWest incentive program concludes, the evaluation is informing the Foundation for Jewish Camp’s national One Happy Camper (OHC) incentive program. In Summer 2012, with the help of 36 OHC community partners, 28 camp partners, and four camp movements, OHC provided incentives to over 8,000 campers, as it continues to successfully increase the number of campers attending Jewish residential summer camp.
In some cases, the Foundation works with organizations that have a research infrastructure to support the evaluation. A grant to the Berman Policy Archives at New York University has initiated the Jewish Survey Question Bank (JSQB), which will compile an estimated 18,000 questions from surveys – including Jewish population studies, program evaluations, and public opinion studies – and make them widely available to evaluators and researchers. This unprecedented endeavor will promote more cross-program, cross-field data collection.
The Jim Joseph Foundation’s program of evaluation has held, and continues to hold, promise to inform the Foundation’s work to foster compelling learning experiences. Each evaluation is characterized by a clear articulation of its purpose and value. A key component of all evaluations is the engagement of outstanding evaluation specialists who can be part of a strong, appropriate interrelationship with the grantee and Foundation professionals. Collaboration with other funders and among evaluators could broaden the scope of the Foundation’s evaluation program to ask common evaluation questions across grants that have similar goals, objectives, and desired outcomes. This could be a catalyst for conducting evaluation on a set of grants that share purposes and significant common outcomes.
Moving forward, the Foundation will continue to refine the evaluation program and seek new methods of learning that benefit existing grantees, guide future grantmaking, and strengthen the field of Jewish education.