Want to make grants evaluation easier?
I talk with a lot of grantmakers, and evaluation is one of the most pressing issues they raise.
I'm often asked how to put a better evaluation program in place when one is short on time. How to align all the pieces? How to support community groups so that they can actually do proper evaluation? How to make evaluation less burdensome and more effective?
Evaluation is a dynamic improvement process. That means it may take a couple of cycles to get the full framework into place.
It is also important to remember the old saying: "horses for courses". A $500 grant gets less evaluation than a $20,000 grant.
Finally, think about how the information gleaned from the evaluation will be used. It not only measures the impact your program is having in the community; it also helps you identify and manage risks and design better programs for the future.
With so much at stake, the secret to creating a good evaluation process is striking a balance. Too little evaluation and you won't know the impact of your programs. Too much can become overly cumbersome.
While it's true that evaluation is not easy, you can make it simpler. Here are seven key steps to keep in mind:
Step 1: Review your process and make a plan
It all starts with planning. Knowing your purpose allows you to understand what you're trying to achieve in terms of concrete goals. With those goals in mind, you select the best grantees, who go on to execute the most successful projects.
If evaluation is tacked on as an afterthought, then you run the risk of losing sight of your original purpose - so build in your evaluation framework up front.
Ask yourself - "What is the purpose of your program?"
Is it aligned with your organizational objectives? What does success look like? Knowing what you're hoping to see helps you set the framework for the rest of the program.
Step 2: Clarify your goals
Clarifying your goals is more than just writing a few things down on paper. Work out a clear vision: What are the effective interventions that have real potential to make a difference?
You might want to consider using a standardised classification system such as those provided by the Foundation Center and the Australian Institute of Grants Management (AIGM). Once you've got goals, objectives and measures down, you can build a strategic framework that drives your program forward.
Step 3: Get your application right
It may be obvious, but you need to ask the right questions in order to get the right answers.
Get your grant applicants thinking about evaluation. Prompt them with the question even if they call about another matter, and remind them that the program is looking for results.
Give these questions priority up front so that you get the answers you want (or need) later on.
Step 4: Assess, select and manage the program
On the surface that sounds quite simple, but you need to be pragmatic.
Treat the different categories differently based on the dollars allocated, the risk to successful evaluation and the significance of the project benefits. Doing so gives you a better chance of actually collecting the data effectively.
Clarity about what you need to collect allows you to guide your grantees as to which evaluation tools to use - observations, surveys, interviews, focus groups etc.
A good time to do this is when you award the contracts. This is the time to set expectations and build in the evaluation process in order to build skills and capacity.
Step 5: Make acquittals work for you
The big burning questions at acquittal can only be answered if you ask them.
Ask what activities were completed, whether there were any project changes, what the actual outcomes were, how successful they were, and what lessons were learned.
Most importantly, you need to capture the data in an organised way. That means you have to be assessing each acquittal in the same systematic way that you assessed the application forms.
Step 6: Find out if the program made a difference
The only way you can have a chance of answering that question is by looking at each project and working upwards.
Identify the results each project made and then compile and analyse that information to see the overall impact of your program.
Step 7: Learn, tell, review and build into the next cycle
Finally, you learn, you tell and you review because you may not have gotten everything right the first time.
Take the lessons learned and build them into the next round. Make adjustments each time you start a new program. That way you're constantly improving your organisation, your programs and the impact you have on the community.
Kate Caldecott is a self-professed "grants geek" and loves the good that grants can do. She helps grantmakers solve problems with their grant programs so that they run more smoothly and effectively.
Kate is the former Executive Director, Australian Institute of Grants Management (AIGM) and was part of the founding team for SmartyGrants, Australia's most widely used grants management system.
She and Social Compass' Dr John Prince will be running an AIGM evaluation workshop in Sydney in October. They will work with grantmakers on designing their evaluation frameworks and working through their evaluation problems based on the seven steps above.
More information on the workshop can be found here at the AIGM website.