We Can Always Learn More: Our Approach to Evaluation and Research
June 9th, 2022
Evaluation and research are integral to the Jim Joseph Foundation’s strategic philanthropy. Since the foundation’s inception in 2006, our approach to evaluation and research has evolved. Today, learnings from both major studies and smaller ones yield benefits for the foundation, grantee-partners and the field of Jewish education and engagement. By continuing to share this aspect of our strategic philanthropy publicly, we hope peer funders, practitioner organizations and the broader field can glean lessons that inform their own evaluation and research efforts.
Since the foundation’s earliest grants, strong support of program evaluation has reflected the foundation’s values of data-informed decision-making, accountability and transparency. The Jim Joseph Foundation historically has invested about 2% to 3% of its annual spending in learning. Most foundation grants dedicate a percentage of their budget, at times up to 10%, to evaluation, usually conducted by an independent contractor. Supporting program evaluation in this manner is integral to the foundation’s strategy to build the capacity of its grantee-partner organizations. Evaluators help grantee-partners define measurable outcomes, articulate logic models, design data collection instruments and, ultimately, make sense of findings so that future activities have a greater likelihood of success. These efforts help programs and entire organizations achieve sustainability and, ultimately, greater impact. The degree to which organizations eventually can conduct evaluation internally is a positive and desirable outcome for the foundation. As we do with all aspects of our approach to philanthropy, we share more about the foundation’s evaluation process here.
In addition to evaluating individual programs, the foundation invests in applied research and, increasingly, cross-portfolio evaluation (evaluation involving several grantees working toward similar outcomes, with similar populations, or in similar settings). With research and cross-portfolio evaluation, the goal is less to build the capacity of grantees and more to support the field and to answer questions the foundation asks about our investments and strategies, such as which age cohorts are most influenced by certain interventions and which learning experiences and platforms are most meaningful, among others. After more than 15 years of grantmaking, the foundation is working toward common measurements across grants with related goals, objectives and desired outcomes, and toward evaluations of cohorts of grants with shared purposes and similar outcomes. The successful creation of shared outcomes and measures across the community-based teen initiatives of the Jewish Teen Education and Engagement Funder Collaborative created ripple effects across the field, and we have sought to build on lessons learned from that effort. Importantly, we know that this type of work cannot be done without the invaluable input, participation and collaboration of the grantee-partners themselves.
Our research portfolio ranges from large studies to small ones. Large, applied research studies include the recently completed CASJE study of the career trajectories of Jewish educators, a multi-strand, multiyear study with the potential to shape the field for years to come. Other research studies are relatively small and quick, such as the Benenson Strategy Group market research survey of American Jews about their experiences of the High Holidays in 2020. Often the foundation invests in applied research in partnership with peer foundations to build on collective knowledge gained from previous research efforts in both the Jewish and secular worlds.
Finally, the foundation has begun to gather more comparable information directly from grantee-partners. An annual survey of grantees provides aggregate information about participation rates (how many people were reached and how often) and about the extent to which participants reflect the diversity of the Jewish community. For many in the field, these data collection methods are new, and the language and questions themselves are appropriately evolving. Most important to us at this moment is simply whether our grantees are trying to collect this information about the diversity of their participants. If more and more say “yes, we know the answer to this question” instead of “we don’t know the answer because we don’t collect that information,” then we know that at least minimal progress toward a more inclusive Jewish community is occurring. At the same time, we understand that culture change, especially in areas such as DEI (Diversity, Equity, and Inclusion), does not happen without a commitment from the organization’s leaders. To that end, our survey includes additional questions about the diversity of boards and senior leadership teams.
In any research endeavor, determining which questions to answer and which methods can best answer them heavily informs how rigorous and how timely the research can be. Interpreting and processing findings, making connections between different research projects, and deciding which findings should be elevated for the field — and perhaps acted upon — are also vital components of applied learning. We accept that we don’t always have the luxury of a full set of answers to every question before we move forward. But we are intent, as we have always been, on turning evaluation and research into action. We also hold ourselves accountable to always be learning: to be in a constant state of asking questions, finding answers in appropriately rigorous ways, and reflecting and acting. We welcome your feedback and insights on the approach to evaluation and research shared here, as well as information about your own efforts in this area of work.
Stacie Cherner is director of research and learning at the Jim Joseph Foundation.