TEACHING NOTE: SI-56 TN

DATE: 06/01/13

TEACHING NOTE FOR
SAND HILL FOUNDATION
Susan Ford served as the president and cofounder of the Sand Hill Foundation, a family
foundation that made grants to organizations that benefited people on the San Francisco
Peninsula. Tom and Susan Ford established the foundation in 1995, reflecting the Fords'
shared passion for giving and community development. The foundation focused on the
environment, education, preservation of open space, youth development and job training.

The Fords were among the original donors of the Teen Success Program, a support group for
teen mothers launched in 1990 by Planned Parenthood Mar Monte (PPMM). The program
encouraged teens not to have a second child and to stay in school, in exchange for $10 per week
and $100 for every 25 weeks of attendance. Facilitator-led Teen Success groups of up to 12 teen
mothers met weekly. Childcare was provided during meetings, and participants could remain in
the groups until they turned 18 or completed high school.

After investing more than $200,000 in the initiative, Susan Ford decided to measure the
effectiveness of the Teen Success Program. Her intention was to validate the program’s results
and identify its strengths and improvement opportunities to help it grow. Yet, even though Ford
had developed a positive relationship with Linda Williams, the head of PPMM, she worried that
Williams might feel threatened by her proposal for an assessment of the program’s impact. The
evaluation process resulted in tensions that caused both Ford and Williams to reflect upon the
dynamics of the grantor-grantee relationship, as well as the role of evaluation in their future
work.

By 2002, the Teen Success program was operating in over 20 communities in California and
Nevada and had served 625 teen mothers. That year, PPMM won the Planned Parenthood
Affiliate Excellence Award for services to teens. In mid-2002, PPMM was seeking funding for
another comprehensive Teen Success Program evaluation so that other Planned Parenthood
chapters could potentially replicate the initiative. Looking forward, Williams and Ford hoped to
capitalize on their learning to more constructively engage all stakeholders in the evaluation
process, effectively monitor the program’s impact and take action on evaluation results.
Copyright © 2013 by the Board of Trustees of the Leland Stanford Junior University. This note was prepared by
Lecturer Laura Arrillaga-Andreessen with assistance from Lauren Wechsler for the sole purpose of aiding
classroom instructors in the use of Sand Hill Foundation, GSB No. SI-56. It provides analysis and questions that are
intended to present alternative approaches to deepening students’ comprehension of the business issues presented in
the case and to energize classroom discussion.

Position in Course

This case is intended for use in a course on philanthropic grantmaking or foundation strategy. The case highlights the Sand Hill Foundation's efforts to measure the effectiveness of the Teen Success Program run by Planned Parenthood Mar Monte (PPMM). The teaching objective is to explore how to manage donor-donee relationships and to develop effective program measurement strategies, processes and implementation plans.

Key Facts

• Mission: The Sand Hill Foundation's mission was to make grants to organizations that benefited people on the San Francisco Peninsula, particularly in San Mateo and Santa Clara Counties.
• Grantmaking Focus: The foundation focused on the environment, education, open space preservation, youth development and job training.
• Key Players: Susan Ford, Sand Hill Foundation president and cofounder; Linda Williams, Planned Parenthood Mar Monte (PPMM) director; Myrna Oliver, director of Teen Services at PPMM; and Jane Kramer, Teen Success evaluator.
• Teen Success Program: The Teen Success Program was a support group for teen mothers launched in 1990 by PPMM. By 1995, the program was serving 46 teens at five sites, located in Eastside (two groups), Mountain View, Sunnyvale, Gilroy and Hollister. The Sand Hill Foundation invested more than $200,000 in PPMM's Teen Success Program between 1990 and 1995. By 2002, the Teen Success program had served 625 teen mothers, sponsored 24 support groups of 12 teen mothers per group per year and operated in 20 communities.
• Teen Success Program Evaluation: According to PPMM, only 4% of Teen Success participants had become pregnant again, compared to 33% of the general population of teen mothers. According to Kramer's evaluation, approximately one-third of Teen Success participants may have left the program because of a pregnancy; Kramer determined that the true dropout rate due to pregnancy was likely to be somewhere between 12% and 30%. At least 80% of Teen Success graduates had completed high school or received a GED, compared to the average of 50% of pregnant teens in the general population. More than one in four Teen Success participants had continued her education beyond high school.

Supplementary Readings

Arrillaga-Andreessen, Laura. Robert Wood Johnson Foundation. Case Study, Stanford Graduate School of Business, Case No. SI-75, 2007.

Emerson, Jed. "Mutual Accountability and the Wisdom of Frank Capra." Foundation News & Commentary, March/April 2001: 42-46. Web.

Wilhelm, Ian. "Report Cites Grant-Making Officers Who Forge Strong Relationships with Grantees." The Chronicle of Philanthropy, November 13, 2010. Web.

Eisenberg, Pablo. "Foundations Can Learn a Lot From the People They Want to Help." The Chronicle of Philanthropy, May 2, 2011. Web.

Eisenberg, Pablo. "Philanthropic Ethics from a Donee Perspective." Foundation News & Commentary, September/October 1983. Web.

Heifetz, Ronald A., John V. Kania and Mark R. Kramer. "Leading Boldly." Stanford Social Innovation Review (Winter 2004). Web.

Rodin, Judith and Nancy MacPherson. "Shared Outcomes." Stanford Social Innovation Review (Summer 2012). Web.

Twersky, Fay. "We've Got Relationship Problems: How Can We Improve Grantee/Grantor Relations?" Foundation News & Commentary, July/August 1999. Web.

Assignment Questions

Primary Questions:
1. Assess Ford's approach to initiating the Teen Success Program's evaluation.
   a. Timing for Class: 10 minutes for class discussion.
2. What steps could Ford and Williams have taken to improve Planned Parenthood Mar Monte and the Sand Hill Foundation's mutual learning?
   a. Timing for Class: 10 minutes for class discussion and brainstorming.
3. As Williams looks to the future, how could Planned Parenthood Mar Monte improve its capacity to monitor the Teen Success Program's impact? How could the Sand Hill Foundation support the program's ability to measure and communicate results?
   a. Timing for Class: 10-15 minutes for class discussion, with 5-8 minutes allocated to each of the two questions posed above.
4. How could Ford use the lessons learned from the Teen Success Program evaluation to inform her future grantmaking related to teen pregnancy prevention?
   a. Timing for Class: 10 minutes for class discussion.

Supplementary Questions:
5. What are the potential issues Susan Ford faces in her desire to measure the Teen Success Program?
   a. Timing for Class: 10 minutes for class discussion.
6. Evaluate Kramer's evaluation methodology, results and recommendations.
   a. Timing for Class: 10 minutes for class discussion.

7. Evaluate Ford's management of the evaluation process.
   a. Timing for Class: 10 minutes for class discussion.
8. Evaluate PPMM's and Williams' management of the evaluation process and results.
   a. Timing for Class: 10 minutes for class discussion.
9. Evaluate PPMM's management of donor relationships.
   a. Timing for Class: 7-10 minutes for class discussion.
10. Discuss the general challenges that nonprofits have faced and still face in accountability and measurement. How has the landscape changed in the last five to seven years? How might nonprofits improve their management of donor relationships?
   a. Timing for Class: 10-15 minutes for class discussion and brainstorming, allocating approximately 5 minutes to each of the three questions posed above.

Analysis for Primary Questions

1. What steps could Ford and Williams have taken to improve Planned Parenthood Mar Monte and the Sand Hill Foundation's mutual learning?

If Ford and Williams were to approach the evaluation process over again, they could take a number of steps to improve their mutual learning:

• Agree on evaluation in the initial grant contract. Introduce the possibility of an evaluation during the proposal process rather than surprising the grantee with it partway through the funding cycle. This way, both parties expect the evaluation from the beginning and can determine how to maximize mutual learning.
• Focus on building a strong grantor-grantee relationship with open communication. Ford indicated that the extent to which she fostered a relationship with grantees depended on "if we know the director and trust them." Williams noted that while she and Ford had been in periodic communication, they had not engaged in direct discussion about the evaluation. Cultivating an ongoing, consistent relationship could create a communication channel through which both parties could address concerns regarding evaluation and thus improve the mutual learning that results.
• Have open conversations about evaluation purposes to create a collaborative culture. Both parties could invest their energy in creating a culture of open, honest communication that values evaluation's potential to further continuous improvement.

• Discuss the grantee's past evaluation experiences. Another way that the grantor can foster open communication and trust is to suggest that both sides discuss past evaluation experiences. As a starting point, the grantee will have the opportunity to share concerns stemming from past experiences or the current situation. Additionally, it will help the grantees to understand the grantor's perspective on the importance of evaluation in informing future decision-making and program strategy. The grantor can share examples of other evaluations that it has conducted with grantees, highlighting examples where the initial results might have been indicative of failure but where the foundation and grantees continued their funding relationships and worked together to learn from the assessment. Fostering this kind of transparent relationship, including conversations about lessons learned from past evaluations, would help build trust so that the grantee will not assume that continued funding depends on a "perfect" assessment.
• Make as many decisions about the evaluation together as reasonably possible. The grantor may involve the grantee in as many decisions related to the evaluation as possible. Grantees could provide input on selecting the external evaluator, defining the evaluation methodology and identifying what issues or challenges may arise during the evaluation. By approaching the evaluation collaboratively, the grantee is more likely to "buy in" to evaluation results rather than approach them defensively or dismiss them as inaccurate, which was how Williams and her staff initially reacted to the evaluation that Ford commissioned. Ford commented that in retrospect she would have given Williams a greater voice in the process. Ford did her best to convey that the evaluation outcome would not prevent future Sand Hill Foundation funding; however, without a more developed dialogue, Williams did not seem to fully believe or internalize Ford's view.
• Further integrate the perspectives of the end recipients and the program operators into the evaluation design and implementation. As the group facilitators are the leaders working directly with the teens and have the most direct program knowledge, they could be more actively involved in shaping the evaluation as well as helping to evaluate the program's impact. Group facilitators could visit each other's sessions to help learn from one another, evaluate session practices and work together to develop a consistent model based on best practices. Additionally, the teens themselves could be more proactively involved in explaining and defining what success looks like for the program.
• Commit foundation funding to implement at least some of the evaluation results. Finally, if the foundation truly envisions the purpose of the evaluation to be helping grantees better achieve their objectives, it could consider providing the financial and intellectual capital necessary to implement at least some of the evaluation recommendations. Some foundations may be willing to leave this amount open, depending on what the evaluation determines. Others might stipulate from the onset that they will provide a set amount of money. Either way, this financial commitment would show how seriously the foundation believes in both the assessment and the grantee's continued work toward achieving their shared goals.

2. As Williams looks to the future, how could Planned Parenthood Mar Monte improve its capacity to monitor the Teen Success Program's impact?

• Determine key metrics. PPMM needs to determine what information it will consistently collect and determine efficient and effective methods to aggregate data across teen groups. The evaluation indicated particular data that PPMM could capture, namely detailed reasons and outcomes for teens that leave the program.

• Designate one staff member to oversee all data collection. PPMM could improve its capacity to monitor the Teen Success Program's impact by designating one staff member to oversee data collection and management. In the business case, PPMM staff indicated that they were "so busy" implementing the program that they did not have time to think about evaluating their impact, let alone communicating their results. The organization needs to allocate part of a staff member's time to oversee the collection and analysis of consistent and reliable data. As the evaluation indicated, if multiple people are monitoring the teens and their groups, there is too much risk that the data collected will be inconsistent.
• Create and support better systems for data collection. In order to collect and assess data, PPMM could install and leverage an IT system that integrates participant questionnaires, facilitator observations, emails, online responses and even text messages (since teens may be more likely to have a cell phone and use text messages to communicate rather than emails). Such an IT system could aggregate participant data and enable PPMM senior managers to readily access this data and analyze programs, monitor participant needs and generate relevant and accurate program statistics.
• Streamline and simplify data reporting to funders. If PPMM identified an essential data set to measure program success, it could share this information with all funders and recommend a common reporting system. Coordinating and streamlining reporting to different funders could free up staff time to focus on data collection and program evaluation.

How could the Sand Hill Foundation support the program's ability to measure and communicate results?

• Sand Hill Foundation support for PPMM's data collection. If PPMM still feels that it needs additional staff dedicated to data collection and program evaluation, the Sand Hill Foundation could provide financial capital to hire and/or train the data collection staff member.
• Sand Hill Foundation support for PPMM's communication of results. The Sand Hill Foundation could support PPMM's ability to measure and communicate results by advocating for the organization's proposed single "dashboard" of data reporting. One way to do this would be to create an evaluation dashboard highlighting progress along key outcome indicators (an illustrative sketch follows below). This would minimize time spent on reporting and enable staff members to focus on consistent program evaluation and strategy improvements. To help PPMM communicate results to other funders and teen pregnancy-related organizations, the Sand Hill Foundation could write articles about PPMM's strategy, evaluation and evolution, or provide PPMM with the resources to do so itself. Additionally, the foundation could seek out venues for PPMM to meet with other funders and nonprofits working on teen pregnancy issues in order to advance field-wide learning, and could help by identifying appropriate communication forums such as field publications and conferences and providing the financial and intellectual capital to ensure that PPMM is represented.
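As a minimal, illustrative sketch — not drawn from the case — the snippet below shows how a single "dashboard" of key outcome indicators might be computed from participant records. The record fields and sample values are hypothetical assumptions, not PPMM's actual schema; the point is that one agreed-upon structure lets every funder receive the same figures.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ParticipantRecord:
    """One row per teen; field names are hypothetical, not PPMM's actual schema."""
    participant_id: str
    weeks_attended: int
    still_enrolled: bool            # currently active in a support group
    completed_hs_or_ged: bool
    repeat_pregnancy: bool
    exit_reason: Optional[str]      # e.g. "moved", "pregnancy"; None if active or unrecorded


def dashboard(records: List[ParticipantRecord]) -> dict:
    """Aggregate records into the kind of key outcome indicators discussed above."""
    total = len(records)
    exited = [r for r in records if not r.still_enrolled]
    return {
        "total_served": total,
        "currently_enrolled": sum(r.still_enrolled for r in records),
        "repeat_pregnancy_rate": sum(r.repeat_pregnancy for r in records) / total if total else 0.0,
        "hs_or_ged_rate": sum(r.completed_hs_or_ged for r in records) / total if total else 0.0,
        "exits_with_no_recorded_reason": sum(r.exit_reason is None for r in exited),
    }


if __name__ == "__main__":
    sample = [
        ParticipantRecord("T001", 30, True, False, False, None),
        ParticipantRecord("T002", 12, False, False, True, "pregnancy"),
        ParticipantRecord("T003", 52, False, True, False, None),  # exit reason never recorded
    ]
    for metric, value in dashboard(sample).items():
        print(f"{metric}: {value}")
```

The "exits_with_no_recorded_reason" indicator is included deliberately: surfacing the record-keeping gap the evaluation identified (why teens leave) on the same dashboard keeps attention on it between formal evaluations.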

3. How could Ford use the lessons learned from the Teen Success Program evaluation to inform her future grantmaking related to teen pregnancy prevention?

• Lesson from evaluation: Key program strengths were retention, graduation rates and participant satisfaction. The retention of teens appears to be strong, the number of graduates is impressive and the teens appear to be extremely satisfied with the program.
• Implication for future grantmaking:
  o When conducting and releasing a philanthropic program evaluation, it is important to identify and highlight program strengths in order to motivate program staff and determine what program elements will be maintained in the future.
  o It is important to gather benchmarks, where available, to find out if the funded intervention is actually improving conditions or resulting in the same situation that would have occurred without the intervention. For example, according to PPMM (and publicly available data), 33% of the general population of teen mothers had a second pregnancy within two years of giving birth to their first. The Teen Success Program may aim to reduce this rate by a certain percentage and, at the very least, demonstrate that among mothers participating in the Teen Success Program, the rate of second pregnancy is lower.
• Lesson from evaluation: PPMM maintained poor records on teens that dropped out of the program. According to the evaluation, "Calculations based on 25 teens that were reached for follow-up interviews indicate that approximately one-third may leave the program because of a pregnancy. The true dropout rate due to pregnancy is likely to be somewhere between 12% and 30%." PPMM's estimates are lower than this (approximately 4%, or approximately five of 124 teens). However, this data was essential to determining how program strategy could be adapted to retain more program participants.
• Implication for future grantmaking:
  o Whether initiating a new program or funding an existing one, it is important to allocate funds for proper and thorough record keeping. Record keeping is critical to evaluate program impact and identify its strengths and weaknesses. Record keeping also enables program staff and funders to identify divergences from anticipated results, uncover problems and adapt program design to better meet expectations.
  o It is also important to fund an evaluation at a level that enables the data collection to be sufficiently robust, providing an adequate sample size to test hypotheses and gain accurate results. For example, a sample set of 25 teens may not be a large enough percentage of the overall population (of hundreds of teens) to constitute a sample that can represent Teen Success Program results (an illustrative calculation of this sampling uncertainty follows this question's analysis).
  o It is also critical to identify upfront the metrics that will be most important to track in order to demonstrate whether or not a philanthropic program is achieving its goals. As the most important goal for the Teen Success Program is preventing a second pregnancy, it is critical to understand why a teen might drop out of the program (and determine whether or not this was due to a second pregnancy), focusing particularly on teens that get pregnant while in the program and/or prematurely drop out of the program.

• Lesson from evaluation: Facilitators' curricula and expectations for program implementation and performance varied across sites.
• Implication for future grantmaking:
  o When operating a program that involves multiple facilitators, it is important that everyone receives the same baseline training and works toward the same data collection goals. Consistency in training and goal setting across groups is critical for implementing effective, high-quality programs and ensuring that outcome data can be compared across groups. Program facilitators could receive more guidance and training about consistent program design in the future. This would also help to establish benchmarks that PPMM can use in evaluating its program's success and making strategic updates.
  o Future grant-funded programs could consider developing a program guide for all staff members that outlines clear expectations and best practices for the program.
• Lesson from evaluation: Too many different people conducted data collection.
• Implication for future grantmaking:
  o Because of the importance of collecting thorough and reliable data, it is imperative that one person has oversight responsibility for collecting, managing and analyzing data. Rather than having all facilitators track assessment metrics, one staff member could be assigned to oversee all data collection and aggregation. With this structure, data collection could be more consistent and complete than if everyone independently gathers data according to their own criteria. This would improve evaluation clarity, consistency and quality. Nonetheless, all staff members could be educated about the importance of evaluation and knowledge management.
  o For future grantmaking, Ford could provide the funding (or partial funding) for a centralized staff member to collect and analyze data across the program. This cost could be highlighted in the overall program budget so that other donors that support and replicate the program fully fund this expense.
• Lesson from evaluation: Programs like PPMM could use self-assessment to evaluate their impact and evolve programming to meet their desired outcomes. Ongoing stakeholder feedback will help verify that the nonprofit's programs are functioning as intended and will help determine whether current strategies effectively achieve program goals.
• Implication for future grantmaking:
  o Teen pregnancy prevention programs could share their best practices and work continuously to improve their offerings so that teens stay in their programs. It is not realistic for every teen pregnancy prevention program to conduct its own research. Instead, these programs could harness the network of similar organizations to learn about ideas that have been tested and best practices that have emerged.
• Lesson from evaluation: PPMM needs to determine if the program effectively prevents teen pregnancy. At the time of the case, the program's impact was unclear.
• Implication for future grantmaking:
  o Ultimately, teen pregnancy prevention programs may rigorously evaluate themselves to determine if their current programs are effective.
  o In order to achieve successful outcomes, it is important for every teen pregnancy prevention program to establish clear metrics for success and collect data accordingly. Every teen pregnancy prevention program could aim to have a clear understanding of the extent (if and how much) to which it is preventing teen pregnancy.
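The rough calculation below, which is not from the case or from Kramer's report, illustrates the sample-size point made above: with only 25 follow-up interviews, the statistical uncertainty around a dropout estimate spans tens of percentage points. It uses a standard Wilson score interval and an assumed count of roughly one-third of 25 (about 8) teens reporting a pregnancy-related exit; it does not attempt to reproduce Kramer's 12%-30% figure, which also reflected non-statistical judgment.

```python
import math


def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple:
    """95% Wilson score confidence interval for a binomial proportion."""
    if n <= 0:
        raise ValueError("sample size must be positive")
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - half_width, center + half_width


if __name__ == "__main__":
    # Assumed illustration: about 8 of the 25 interviewed teens left due to pregnancy.
    low, high = wilson_interval(successes=8, n=25)
    print(f"point estimate: {8 / 25:.0%}")
    print(f"95% interval:   {low:.0%} to {high:.0%}")
```

Even before asking whether 25 interviewed teens are representative of the several hundred program participants, the interval alone is very wide, which is why the note stresses funding data collection at a level that yields an adequate sample.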

Analysis for Supplementary Questions

4. What are the potential issues Susan Ford faces in her desire to measure the Teen Success Program?

• Resistance from Williams due to fear of losing funds.
• Williams feeling threatened that Ford might become too involved in PPMM and the program.
• Williams' desire to be heavily involved in designing the evaluation, thus biasing process and results.
• PPMM resource constraints—evaluation taking up too much time and human resources.
• Difficulty in tracking teens and gathering relevant data.
• Measurement cost issues.
• Internal versus external evaluation.
• Challenges in designing a long-term measurement program that transcends one-time use.
• Defining inputs, outputs and outcomes to better understand the dependencies created by the program design and assess the program's overall impact.

5. Evaluate Kramer's evaluation methodology, results and recommendations.

• Strengths: Overall, Kramer used typical evaluation methods that combined both qualitative and quantitative research. The questionnaires sent to both existing Teen Success participants and those who had left the program contained targeted and relevant questions. Appropriately, Kramer focused on the support groups' content, facilitator training and data collection. Kramer correctly pointed to data collection as a serious problem with the Teen Success Program; up until that point, the program had not adequately collected data, which made it difficult for the program to measure its success and identify potential areas for improvement. Kramer's evaluation was successful in that it challenged PPMM's assumptions, as the program had remained unchanged for five years despite uncertainty as to whether it was achieving its goals. Finally, Kramer was correct in suggesting a reassessment of the Teen Success Program's vision and goals. She didn't necessarily disagree with the support group method but did suggest additional ways to engage participants.

• Opportunities for Improvement – Evaluation Methodology: Kramer could have visited the support groups over a longer period of time to gauge changes in the teens' attitudes and to experience a greater variety of program curricula. Kramer could have also surveyed the teens at the beginning of the support groups and at a later date to measure changes in teen attitudes over time.
• Opportunities for Improvement – Recommendations for Improved Donor Relations: Collecting data was critical for donor development and internal organizational development and growth. In the end, Kramer could have included additional recommendations on data types to share regularly with PPMM's donors and advice for managing donor relationships. Specifically, Kramer could have provided a "non-letter" template for all of PPMM's donors. This template could include basic information such as the number of Teen Success support groups, number of total participants, participant pregnancy rate, percentage of participants currently enrolled in school, growth since the last report, program improvements, updates on program strategy and vision, future support groups and target expansion regions. Kramer could have also provided recommendations on how PPMM could better manage its donor relationships. Suggestions could focus on institutionalizing relationships to transcend a natural tendency for ad-hoc relationships. Examples include regular face-to-face donor-grantee meetings, joint strategy sessions to solicit feedback and new program ideas, and donor events with Teen Success participants and alumnae to celebrate program success.
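As a minimal sketch of the kind of standardized "non-letter" template described above — the field names and example values are illustrative assumptions, not a template from the case — a structured donor report might look like the following.

```python
from dataclasses import dataclass, asdict


@dataclass
class DonorReport:
    """Fields mirror the basic information suggested above; values are placeholders."""
    reporting_period: str
    num_support_groups: int
    total_participants: int
    participant_pregnancy_rate: float      # share of participants with a repeat pregnancy
    pct_currently_in_school: float
    growth_since_last_report: str
    program_improvements: str
    strategy_and_vision_update: str
    planned_groups_and_regions: str


def render(report: DonorReport) -> str:
    """Render one identical summary for every funder, replacing ad hoc letters."""
    lines = [f"Teen Success Program Report — {report.reporting_period}"]
    for name, value in asdict(report).items():
        if name == "reporting_period":
            continue
        lines.append(f"  {name.replace('_', ' ').capitalize()}: {value}")
    return "\n".join(lines)


if __name__ == "__main__":
    print(render(DonorReport(
        reporting_period="H1 2002",
        num_support_groups=24,
        total_participants=625,
        participant_pregnancy_rate=0.04,
        pct_currently_in_school=0.75,
        growth_since_last_report="two new groups opened",
        program_improvements="standardized facilitator training",
        strategy_and_vision_update="expansion into additional communities under review",
        planned_groups_and_regions="two groups planned in neighboring communities",
    )))
```

Fixing the schema in advance is what preempts each donor from requesting different information and keeps every report comparable from period to period.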

6. Evaluate Ford's management of the evaluation process.

It was important that Ford made an effort to measure the effectiveness of the Teen Success Program. She managed the process well, but she could also make future improvements in a few areas.

Successful Management
• Listening and learning. Ford successfully managed the process. She served as a sounding board for both the evaluator and PPMM. She listened to PPMM's issues with the evaluation and remained calm in a potentially relationship-damaging situation. She also took the opportunity to provide her own suggestions to Williams on how to improve the Teen Success Program.

Room for Improvement
• Including evaluation and reporting requirements in the original grant agreement. Ford could have stated in the original grant agreement both the type of reporting the Sand Hill Foundation required and the frequency with which grantees would report. After having engaged in the evaluation, Ford could have also taken the opportunity to require regular evaluations of the program.
• Including grantee perspective in the evaluation design and implementation. Although Ford's desire to collect unbiased results was good, she probably would have benefited from soliciting Williams' opinion and input from the start. Ford rightfully admits that she could have, at a minimum, solicited Williams' thoughts on evaluation methods and potential evaluators, as Oliver had suggested. If Ford had included Williams in the process earlier on, she may have selected Kramer to conduct the evaluation, but Williams and PPMM may not have felt as defensive. Williams may have trusted Ford when Ford emphasized that she and the Sand Hill Foundation would not pull funds, even if the results were less than satisfactory. Williams' involvement would have also demonstrated to PPMM that Ford genuinely believed that the program's success relied on being a joint donor-donee effort.
• Including more specific goals in the evaluation design. Ford also could have structured the evaluation more specifically (i.e., selecting one of the following: goals-based evaluation—focused on the program's ability to achieve its predetermined objectives; process-based evaluation—understanding how the program really works and its strengths and weaknesses; or outcomes-based evaluation—identifying client benefits). Kramer's evaluation was not as specific about its goals and, as a result, the data collection and analysis process, as well as the overall impact assessment, was not as clear or effective. More information on such topics can be found through the "Basic Guide to Program Evaluation (Including Outcomes Evaluation)" developed by Dr. Carter McNamara, Authenticity Consulting LLC: http://www.mapnp.org/library/evaluatn/fnl_eval.htm#anchor1581634

7. Evaluate PPMM's and Williams' management of the evaluation process and results.

Williams and PPMM managed the evaluation process in a manner indicative of a threatened organization. They felt threatened by the evaluator and openly disagreed with the evaluation results.
• Defensive behavior limited the opportunity for learning. Many organizations believe that an evaluation is about proving a program's success or failure. However, Williams and PPMM overreacted to the study's results, which in fact provided PPMM with reasonable suggestions for improvement. PPMM could have initially been more open to the results and less defensive. Eventually, Williams and PPMM saw the value of the evaluation and instituted change within the program.
• Using evaluation as a means for program improvement. To show its initiative and its desire to continuously improve the Teen Success Program, PPMM could institute formal program evaluations on a consistent basis.
• Program measurement and evaluation could be more institutionalized. In terms of evaluation reporting, Williams and Oliver could develop an internal reporting system that would expand the organization's knowledge base, help PPMM make more informed decisions and communicate the program's value to current and potential donors. PPMM could also provide regular progress reports offering updated data on retention and dropout rates for teens in the program, instead of the periodic letter with random information, inconsistent data and anecdotes about successful teens.

8. Evaluate PPMM's management of donor relationships.

Williams often left a very positive impression on donors. Donors such as Susan Ford trusted Williams and enjoyed working with her.

• More formalized and consistent interactions could improve grantee-grantor relationships. As program and impact assessment is a key part of any successful fundraising strategy, PPMM could develop more formal relationships with important donors such as Ford. However, such relationships can easily fall to the wayside during busy times, as Williams openly admits had happened recently. Williams could take the initiative and set up semi-annual or annual meetings between PPMM team members and major donors at which PPMM could discuss the Teen Success Program's progress and future goals. This session could also be a time to solicit donors' thoughts on future program opportunities and strategic evolution. As a result, donors will inevitably be impressed by PPMM's initiative and vision, potentially increasing their donations. Such action would give PPMM a competitive advantage since so few nonprofits currently take such action.
• Data sharing could improve and streamline donor reporting. As mentioned previously, PPMM needs to institutionalize reporting for major program funders. The organization needs to take the initiative and establish a template that includes all relevant information to preempt each donor from asking for different information.

9. Discuss the general challenges that nonprofits have faced and still face in accountability and measurement. How has the landscape changed in the past five to seven years? How might nonprofits improve their management of donor relationships?

• In the nonprofit world, accountability is largely self-imposed. In the for-profit world, investors and stockholders demand results and an understanding of how resources invested in a company are delivering value. No market-driven mechanisms exist to force nonprofits to demonstrate that the resources they manage and the strategies they employ are in fact achieving established goals. In the nonprofit world, donors often do not demand proof of their donations' results and are satisfied merely with feeling good about giving to a worthy cause. Additionally, donors rarely conduct enough research on organizations before making a gift and few even know what might indicate whether or not a nonprofit is a high performer.
• Lack of clear metrics measuring performance. In the for-profit sector, a company's value and performance are ultimately measured by a single consistent metric – the dollar. In the nonprofit sector, no such single measure of value exists. With so many different global social issues, influenced by myriad internal and external factors, metrics can demonstrate achievement of social impact in infinite ways. In the absence of accepted standard metrics, the majority of nonprofit leaders and donors independently define the metrics that are most likely to measure a nonprofit program's social value, which can be time consuming and costly.
• Lack of business expertise in nonprofit leadership. Due to limited financial rewards and often demanding working conditions, nonprofits have found it difficult to attract experienced leaders and employees with solid business acumen. Lack of business expertise in the nonprofit sector has meant that critical areas such as accountability and measurement have often been overlooked.
• Nonprofit accountability and measurement is now becoming a high priority in nonprofit management. This is for a variety of reasons, including:
  o Nonprofit management as an academic field is growing at the graduate level. This presence in academia has challenged the status quo and has brought forth a wealth of new nonprofit strategies and processes for accountability and measurement.

  o A new generation of wealthy donors who built their fortunes through competitive business practices are demanding and expecting high levels of accountability, reporting and results from the organizations to which they donate.
  o Technology now enables nonprofits to track their impact and report results to donors more easily.
• Nonprofits can improve their donor relationship management to promote a culture of accountability and measurable impact. Examples include:
  o Dynamically (in a timely, creative and forthright manner) sharing the latest impact to date (including outputs and outcomes achieved, as well as money raised and invested in critical programs and operations) on their websites so that donors and prospective donors can see the progress achieved at any point in time.
  o Sending out quarterly or biannual updates via email, mobile message or mail to proactively let donors know of the current impact created through their collective donations, as well as share lessons learned and future priorities.
  o While the entire sector continues to struggle to find good ways to measure social impact, nonprofits could publicly share the metrics they use to track program success on their websites or in relevant publications. This would invite others to further discuss and develop these metrics and share best practices across the industry. As growing numbers of nonprofits do this, more donors are expecting nonprofits to readily provide feedback on social impact. This may result in more frequent and open discussions of how philanthropic dollars are being used and what impact they are making. Donors' increasing demand for nonprofit transparency could shift both donors' and nonprofits' expectations.
These actions would also provide greater transparency to donors and build donor confidence in the nonprofit's commitment to continuous improvement and accountability.

Teaching Approach

The Sand Hill Foundation case study is appropriate for a 45-minute teaching module including both a lecture and a discussion. Key themes for discussion include:
• Evaluation
• Measuring impact
• Grantor-grantee relationships
• Mutual accountability
• Professionalization of the philanthropic sector

Please see the Giving 2.0 website (giving2.com) for the complete portfolio of philanthropy case studies, teaching notes, frameworks and learning resources that Stanford Graduate School of Business Lecturer Laura Arrillaga-Andreessen has created since 2000.