
Making Impact Assessment Meaningful and Manageable: Insights from the MIT Scaling Development Ventures Conference

Wednesday, April 22, 2015
Elizabeth Hoffecker Moreno

The following is one of a series of blog posts about presentations, workshops, and panels that were part of the 2015 MIT Scaling Development Ventures conference. Elizabeth Hoffecker Moreno is IDIN's Research Coordinator.

A persistent challenge facing social enterprises and mission-oriented ventures of all sizes is developing ways to effectively measure and communicate impact. Defining the desired impact, developing and implementing reliable methods for assessing whether it is occurring, and reporting results transparently to stakeholders take considerable effort and can feel daunting to start-up enterprises and established organizations alike. As a result, impact assessment tends to become overly complex, burdensome, and costly, or to remain under-prioritized within even relatively successful organizations.

As part of the recent MIT Scaling Development Ventures Conference, my D-Lab colleagues Kendra Leith, Laura Budzyna, and I organized a panel discussion to gather insights from organizations that are approaching impact assessment in innovative yet practical ways. We were interested in learning which metrics they had found most useful, whether they had developed their own custom metrics or were using industry-wide frameworks such as the Global Impact Investing Rating System (GIIRS) or the Impact Reporting and Investment Standards (IRIS), and what advice they had for organizations looking to increase the usefulness and efficiency of their impact assessment efforts. We wanted to hear from donors and investors who use impact metrics to decide whom to invest in and what work to continue supporting, as well as from organizations that perform internal impact assessments to improve operations and report to stakeholders.

To that end, we invited four dynamic panelists to MIT on Saturday, April 10 to join us for a breakout session on “Measuring Outcomes and Impacts on the Path to Scale.” The Mulago Foundation, represented by its director Kevin Starr, uses impact metrics as a cornerstone of its investment approach and carefully assesses the metrics of development ventures before deciding whether or not to fund them. “We don’t invest in anyone who doesn’t measure their impact,” said Kevin. “If they don’t do it, we can’t, and if they’re not, they’re flying blind.” Unlike most philanthropic donors or social investors, Mulago does not request formal funding proposals or impact reports designed specifically for them: “The last thing we want is to waste the time and energy of those who are trying to save the world. We ask for annual milestones and their impact methodology; beyond that, we rely on documents they should already have on hand.”

Echoing this move to make impact assessment more meaningful and less burdensome, Kasia Stochniol from Acumen shared the organization’s most recent initiative in impact measurement, Lean Data. By relying on low-touch channels such as mobile networks and existing interactions between entrepreneurs and customers, Acumen is experimenting with ways of gathering impact data that are more relevant to the enterprises it invests in, at a fraction of the cost of traditional surveys. Acumen is partnering in this initiative with Root Capital, a smaller social investment nonprofit represented on the panel by Mike McCreless, Director of Strategy and Impact. Over the past several years, Root Capital has rethought its approach to assessing the impact of its lending and programs, adopting a “client-centric” approach. By integrating the client “as an equal stakeholder in the data gathering process, [Root Capital] can do impact research while gathering data that is useful to the producer enterprises” served by its loans.

To balance the investor perspective, we invited a social entrepreneur to join the panel. Bilikiss Adebiyi-Abiola is the CEO and Co-Founder of Wecyclers, a mission-driven start-up in Lagos, Nigeria that offers low-income households a chance to capture value from their household waste through redeemable points, while providing a reliable source of recycled materials to local industry and an environmentally friendly response to the waste management crisis facing this city of 18 million, which generates some 10,000 tons of waste per day. Wecyclers has developed an innovative and low-cost method of integrating the collection of its most valuable impact metrics into the same software platform it uses to weigh trash upon pick-up and award SMS-based points to households that hand over recyclables.

Four Insights from the Panel
The conversation among these four panelists was fascinating and instructive. Below are four insights I gleaned from their combined experiences, which can be used by organizations of all sizes to strengthen and streamline impact assessment efforts:

1. Focus on results at three levels.
Mulago recommends a simple, eight-word mission statement (what problem you’re tackling, for whom, and what the desired change will be). Then, focus on obtaining accurate data that can answer three questions: 1) Are we delivering our service/product/approach well? 2) Is behavior changing as a result (is there some effect)? 3) Is that behavior change producing the desired social/environmental/development impact? It is important to have verifiable data at each of these levels, as well as convincing evidence that your intervention has caused the observed behavior change and that the behavior change has caused the observed impact. Attribution can be demonstrated through a compelling narrative (clear logic and methodology) as well as through studies with matched controls or formal experiments such as randomized controlled trials.

2. Gather the data you need to perform your best.
Instead of designing impact assessment around funders’ needs, focus first on developing a system for obtaining the numbers that will tell you clearly if you’re accomplishing what you set out to do. “When it comes to data collection, we’re selfish,” said Bilikiss. “We focus on what is going to make us better.” In the case of Wecyclers, that’s data on how much trash they’re collecting, the types and sources of that trash, and the demographics of who is recycling (they found that 80 percent of their recyclers are women and that many recyclers use Wecyclers as an extra income source, not just the household waste management strategy the program originally intended). “We ask: what can we do to get as much trash as possible from households and recycle it? That drives our metrics efforts.” And those efforts, in turn, are producing data that are valuable not only to Wecyclers, but also to multinational corporations and the municipal government of Lagos.

3. Make data collection relevant to everyone engaged in it.
Would-be recyclers have an incentive to give data to Wecyclers because that same data is used to reward them with points. Similarly, managers of farm-based enterprises have an incentive to participate in Root Capital’s impact studies, because these studies include questions the managers themselves have created and yield data that helps these entrepreneurs make more informed business decisions. “So much of the conversation has been driven by what development professionals see as impact,” said Kasia from Acumen. Lean approaches to data gathering focus on collecting data that is relevant to the users of services and feeding it directly to those who can make strategic decisions based on it. “Lean Data helps Acumen measure the impact of our investments, for sure, but it also gives the companies we lend to valuable consumer insights, so they are incentivized to collect that data.”

4. Keep impact assessment as simple as possible. 
When creating an approach to measuring impact, Kevin Starr from Mulago advises picking a method that is as simple as possible and iterative. “We’d rather have a simple movie that evolves over time than an expensive snapshot that is taken once every three years.” The data gathered does not need to be extensive, but it should directly inform learning and changes in operations. “We want to see iteration from one cycle of data gathering to the next,” said Kevin, a message echoed by Bilikiss: “you have to listen [to those you serve] and evolve your metrics as needed.”

Building a Lean Research community
The consistent theme emerging from the panel discussion was that impact data should be gathered often, but in ways that are streamlined, cost-effective, and embedded in existing activities with stakeholders, or designed in a way that incentivizes stakeholders to contribute meaningfully to the effort. Results should be easily digestible and turned into learning that can inform the actions of the enterprise and its stakeholders, including the people from whom the data was gathered. Data should be gathered only for a few high-priority and strategic impact metrics, and those metrics should relate directly to the purpose of the organization and be allowed to evolve over time in response to changing conditions and needs in the environment in which the organization operates. 

These key messages resonate with an initiative we launched at D-Lab over the past year to build a community of researchers, practitioners, and donors committed to developing leaner methods of conducting impact research, as well as other forms of field research, in development contexts. Like lean data and client-centric evaluation, Lean Research is about making the research process more relevant and meaningful, and less burdensome and extractive, for the people and communities involved. Launched in August 2014 as a collaboration between researchers at the Tufts University Fletcher School of Law and Diplomacy and MIT D-Lab, the initiative has grown to include close collaborations with the Feinstein International Center at Tufts, Root Capital, Sustainable Food Lab, and the Global Development Lab at USAID.

We are still in the early stages of developing this approach and are very interested in engaging with organizations that are experimenting with leaner approaches to both impact assessment and related types of field research, such as baseline studies and needs assessments. We are particularly interested in identifying and documenting examples of successful applications of lean research and lean impact assessment. If you are already applying Lean Research principles in your organization’s approach to metrics, have seen good examples of their application by others, or are interested in using this approach in your work, we invite you to share your experiences and connect with us.

Join the conversation on Lean Research and impact assessment here or on Twitter at @Lean_Research, and learn more about the initiative and approach at d-lab.mit.edu/lean-research.