Guest Blog

A Different Kind of Risk-Taking: Improving Evaluation Practice at the Jim Joseph Foundation

– by Cindy Reich

September 16th, 2015

A version of this blog originally ran in Philanthropy News Digest.

“We’re in the business of risk-taking” is a frequent refrain of Chip Edelsberg, Executive Director of the Jim Joseph Foundation. Generally speaking, Edelsberg’s notion of risk-taking refers to the investments the Foundation makes in its grantees and their programs. The Jim Joseph Foundation, with assets of roughly $1 billion, has a mission to foster compelling, effective Jewish learning experiences for young Jews. Between 2006 and June 2014, the Foundation granted over $300 million to increase the number and quality of Jewish educators, to expand opportunities for Jewish learning, and to build a strong field for Jewish learning (Jim Joseph Foundation, 2014). Rarely is there an established research base for the kinds of initiatives the Foundation supports in Jewish education. In the spring of 2013, though, Edelsberg had another kind of risk in mind.

What might be gained, Edelsberg ventured, if the Foundation staff brought together a group of competing evaluation firms with whom they had worked in the past to consider ways to improve the Foundation’s practice and use of evaluation? The idea had emerged from a study of the history of the Foundation’s evaluation practices from its inception in 2006 through 2012, commissioned by the Foundation and conducted by Lee Shulman, President Emeritus of The Carnegie Foundation for the Advancement of Teaching and Charles E. Ducommun Professor of Education Emeritus at Stanford University. Edelsberg thought it was a risk worth taking, and the board of the Foundation agreed. He then made another bold decision: to allow a doctoral student in Evaluation Studies from the University of Minnesota to study this experimental venture.

In the winter of 2013, a colleague of mine from the field of Jewish education, then a staff member of the Foundation, heard about my research interest in the role evaluation plays in the work of foundations and their grantees. She offered to connect me with Edelsberg because of the Jim Joseph Foundation’s interest in and commitment to evaluation in its work. Edelsberg described the idea for what became the “Evaluators’ Consortium,” and I asked about the possibility of studying the process as a case study for my dissertation. By the time the Consortium met for the first time in October 2013, I had launched the research with the agreement of the Foundation board and the participating evaluators. The purpose of the study was to explore what occurred when a foundation inaugurated an innovative approach to evaluation practice, examining both the factors that supported successful implementation of the innovation and the impediments to its success. It sought to provide insights into the elements of organizational culture, practices, circumstances, and structures that can support effective evaluation practice in the foundation sector. The Foundation gave me access to documents and invited me to observe meetings of the Consortium held both in person and electronically. Over the course of the first year of the Consortium’s operation, I interviewed all Foundation program staff members, Shulman (who served as the facilitator), a member of the board, and each of the participating evaluators.

In the initial stages of the work, the goals for this experiment were general and somewhat vague. The Foundation hoped to establish a more efficient process for selecting evaluators for Foundation grants, to stimulate collaboration among the evaluators, to explore possibilities for conducting cluster evaluations or meta-analyses, and to examine ways the Foundation could improve its overall program of evaluation. One hope was that in coming together, the evaluators would help the Foundation define an agenda for their joint work. In spite of the uncertainty of the initiative’s outcomes, all the evaluation firms that were asked accepted Edelsberg’s invitation to participate—a testament to the nature of the relationship they already had with Edelsberg and the Foundation, and an indication of what a deeper relationship with the Foundation meant to the evaluators. The Consortium met for two face-to-face gatherings and two web-based conferences, and there was email communication among the participants between convenings. When the group gathered, members of the Consortium shared samples of their work with one another.

There was some discomfort among participants about the initial lack of clarity about the outcomes and timeline of the Consortium, especially since the evaluators were participating without compensation. Both Foundation staff and evaluators wondered how long they would be able to continue without a clear focus. An idea that emerged toward the end of the first gathering gained traction in the months leading up to the second meeting—what if the group developed a set of outcomes and measures for Jewishness (or Jewish identity/growth/development) that could be used across organizations, initiatives, and programs? Nothing like this existed in the field of Jewish education. The notion of a tangible product, one that could be used by the evaluators, by the Jim Joseph Foundation, and by the field at large, had broad appeal. Some evaluators, though, had concerns about committing to this goal: while worthwhile, it was ambitious, difficult, and time-consuming to achieve.

The Consortium’s work on measures of Jewish growth came at a critical time for the Foundation. At about the same time as the Evaluators’ Consortium was launched, the Jim Joseph Foundation had begun work on one of its largest projects to date—the Initiative on Jewish Teen Education and Engagement. The initiative linked directly to the Foundation’s mission to “foster compelling, effective Jewish learning experiences” for teens and young adults. Its strategy included working in partnership with funders in up to ten local communities in the US to incubate new models of learning and involvement for Jewish teens. It grew out of an understanding of the importance of this stage of the life cycle in human development, coupled with a reading of the data on low participation rates of Jewish teens in the Jewish educational experiences available to them in their communities (Informing Change, Jim Joseph Foundation, & Rosov Consulting, 2013). On the eve of the launch of the Teen Initiative, the Foundation was particularly interested in measures of Jewish growth that could play a role in evaluating the work within and across communities.

Over the course of the first year of work, the Consortium helped the Foundation develop the vision for a cross-community evaluation of the Teen Initiative, including more in-depth work on outcomes and measures of Jewish growth. In a step unprecedented for the Foundation, the staff asked the members of the Consortium for feedback on a draft of the evaluation RFP and made changes on the basis of their suggestions. At the end of the year, the Foundation awarded a million-dollar, four-year contract to two of the participating firms to conduct the cross-community evaluation. Another member of the Consortium is participating as a consultant on pieces of that work. The fourth member of the Consortium has been contracted by several of the local communities to conduct their community-based evaluations.

In addition to shaping the cross-community evaluation and taking first steps on the development of outcomes and measures of Jewish growth, the initiative produced several other outcomes for the Foundation and for the participating evaluators. The Foundation clarified its ideas about effective evaluation practices. Foundation staff members developed the capacity to think differently about evaluation. Relationships were strengthened between Foundation staff and evaluators and between individual evaluators and evaluation firms. The initiative created relationships among competitors who entered into collaboration with one another to their own benefit and to the benefit of the Foundation and its grantees. Through its success with the Consortium, the Foundation was emboldened to consider other new approaches to evaluation. Finally, as a result of the work done with the Consortium, the Foundation was able to introduce evaluators and high-quality evaluation practices to other funders and communities.

The data collection for my dissertation came to a close in August of 2014, nearly a year after the first convening of the Jim Joseph Foundation’s Evaluators’ Consortium. Since then, the Consortium has continued to meet. Its current goals, according to a Foundation blog post written by Sandy Edwards and Stacie Cherner (2015) of the Foundation staff, include:

  • A plan for researchers, funders and practitioners to agree on common constructs [of Jewish learning and growth];
  • The development of a set of standardized questions that can be utilized across the Foundation’s portfolio of grantees;
  • Field testing of a “universal toolkit” for collecting data on common outcomes and demographics;
  • A plan for longitudinal testing and recommendations for resources to disseminate and encourage the use of universal sets of tools.

Various factors supported the success of the Consortium. One was the Foundation’s willingness to take a risk and to anticipate the possibility of failure. A learning culture at the Foundation and a commitment to field building were other contributing factors, as was the Foundation’s ongoing approach to evaluation. Program officers work in partnership with grantees to develop evaluation RFPs and to hire evaluators; the Foundation then funds the evaluation of their grants. Members of the program staff are engaged in nearly all stages of the evaluations of grants they manage. The staff cultivates relationships with the grantees and evaluators with whom they work. The Foundation is committed to learning from evaluation, not just using it for accountability, and it uses evaluation to inform grantmaking decisions. The Foundation shares the majority of completed evaluation reports on its website.

To understand the success of the Consortium, we also must consider its leaders and its participants. The commitment of the Foundation’s professional leader, Chip Edelsberg, to the initiative in particular and to evaluation in general, together with his ability to cultivate relationships with others, played an important role. Also critical were the intellectual leadership and facilitation of Lee Shulman. For the evaluators there were benefits to participating—possibilities of evaluation contracts with the Foundation, enhanced relationships with Foundation staff, and opportunities for professional development and colleagueship. These incentives certainly encouraged participation, and all those invited agreed to participate. It was no small feat, though, that these evaluators agreed to work alongside the organizations with whom they compete for contracts, to share their expertise with one another, to participate without direct compensation, to engage without promises of future work—and to do so with an uncertain timeline and undefined outcomes in the early stages. The small size of the field of Jewish education and of the sub-field of evaluation of Jewish education was another facilitating factor: the players were known to one another, at least by reputation, even if they did not know each other personally, and the impact of their work had the potential to be felt across the field.

Establishing the Evaluators’ Consortium required overcoming a number of challenges. The logistics involved in scheduling the leadership of the four evaluation firms took much longer than the Foundation anticipated. Some of the evaluators worried that neither the outcomes nor the timeline was clear at the beginning. Some worried about the scope of the project and the amount of time they would have to give. Some were concerned that the firms might be reluctant to be fully open and comfortable working with their competitors. While the Foundation worked to create an atmosphere of collegiality among all the participants, the power differential between the Foundation and the others operated beneath the surface.

The model of the Evaluators’ Consortium is worthy of consideration by other foundations engaged in strategic philanthropy. It is likely, however, to demand practices that are a departure from “business as usual.” Strategic philanthropy involves specifying outcomes in advance and measuring progress against those outcomes. When contemplating this type of innovation in the practice of evaluation, a foundation ought to anticipate emergent goals and tolerate uncertainty. Not only is it impossible to specify all possible outcomes of an innovation; attempting to define the outcomes in advance may limit the foundation’s consideration of promising courses of action. Working in an emergent way requires some faith in the process, trust in the people promoting the innovation, and some concrete promise of potential benefits. It requires a champion who is willing to take risks and to bring others along an uncharted path. The use of developmental evaluation to document and learn systematically about the work as it progresses could address strategic concerns.

It may seem counterintuitive to bring competitors together to work on behalf of a foundation’s evaluation program. Convening competitors in a collaborative venture, though, can create capacity, build networks, and magnify potential outcomes. Careful consideration needs to be given to the conditions under which the collaboration takes place, who facilitates it, and what expectations are established throughout the process. Cultivating relationships is a critical step in introducing and sustaining innovation in evaluation practice.

Innovating in the area of evaluation practice through the convening of evaluators, staff, and outside experts requires a commitment of staff time and attention for a range of tasks, from engaging potential participants to defining the questions to be addressed to handling logistics. Making staff available for this work may require shifting responsibilities and priorities among staff members. Financial resources are another consideration. It may not always be possible to draw on good will and trust, or even the promise of future contracts with a foundation or its grantees, to induce evaluators to participate in an undertaking like the Evaluators’ Consortium. Foundations considering this model ought to establish a budget that allows for compensation of the participants.

The practice of risk-taking is central to the work of foundation leaders as they hone their strategies, strive to make effective investments in organizations and programs, and pursue their missions of social betterment. The model of the Evaluators’ Consortium is a risk worthy of consideration by foundation leaders. Working collaboratively with a diverse group of external evaluators who bring a range of skills, perspectives, and expertise has the potential for significant payoffs for foundations and, ultimately, for the spheres they hope to impact.

Cindy Reich is an evaluator and Jewish educator based in Minneapolis, MN. This article is based on her dissertation, Improving Evaluation Practice in the Foundation Sector: A Case Study of the Jim Joseph Foundation’s Evaluators’ Consortium, scheduled to enter library circulation in spring 2016. She received her Ph.D. in Evaluation Studies from the University of Minnesota in 2015.


Bibliography

Informing Change, Jim Joseph Foundation, & Rosov Consulting. (2013, March). Effective Strategies for Educating and Engaging Jewish Teens: What Jewish Communities Can Learn from Programs That Work. Retrieved from https://jimjosephfoundation.org/wp-content/uploads/2013/03/Report_and_Appendix_Effective_Strategies_for_Educating_and_Engaging_Jewish_Teens.pdf

Jim Joseph Foundation. (2014). Jim Joseph Foundation 2013-2014 Biennial Report. San Francisco, CA.

Edwards, S., & Cherner, S. (2015, April 9). A Behind-the-Scenes Look at an Evaluators’ Consortium [Blog post]. Retrieved from https://jimjosephfoundation.org/a-behind-the-scenes-look-at-an-evaluators-consortium/