It’s the beginning of a new year and you may be wondering how you can afford to conduct that important evaluation of your program, initiative or activity. You won’t be alone. But if you’ve ever felt you shouldn’t proceed with commissioning an evaluation because of budget or resource constraints, think again. I’ve worked with many clients in this situation, drawing on practices outlined by respected American evaluator Michael Bamberger. He calls this a shoestring approach to evaluation.
Bamberger (2004) identified three common constraints when commissioning an evaluation — time, budget and data quality — and ways you may be able to overcome them without compromising on quality or validity.
1. Time constraints
Problem: Your program has been running for quite a while. You’re not sure you’ll be able to factor in evaluation at this late stage, thinking it may be too late to catch up on early activities.
Solution: Talk to your evaluator about simplifying your evaluation design. You can use an alternative design without compromising the quality of the evaluation findings if your evaluator approaches the evaluation with clarity and simplicity. Consider reducing your sample size. If necessary, reduce the number of surveys or interviews. This may result in a lower level of precision, but it may not compromise the evaluation findings. Consider using alternative sampling approaches such as stratified sampling or cluster sampling.
2. Budget constraints
Problem: You don’t have the budget to conduct the evaluation as extensively and thoroughly as you’d hoped, particularly when it comes to data collection.
Solution: Talk to your evaluator and have them consider any of the following. Reduce the cost of data collection: have your evaluator spend considerable time up front designing a detailed evaluation framework and how-to manual to enable you to conduct the data collection yourself. Use self-administered questionnaires. Reduce the length and complexity of surveys. Data may be collected by volunteers within the organisation (if fully trained and briefed), university students, or local community residents. Direct observation may be able to replace the need for larger sample sizes. Work with key informants such as well-connected community members or key stakeholders. Use multi-method approaches: independent estimates of key variables may make it possible to reduce sample sizes while increasing reliability and validity. If practicable, consider conducting focus groups rather than in-depth interviews.
3. Data quality constraints
Problem: When the evaluation does not start until late in the project cycle, there is usually little or no baseline data available on the condition of the community before the start of the initiative. Project records may be available, but they may be incomplete or may have been poorly recorded or kept. This makes it difficult for the evaluator to explore what life for the communities of interest was like before the program.
Solution: Talk with your evaluator early in the process so they can ascertain the type of data you will need. This sort of discussion can reduce the need to collect data that is not absolutely essential for the evaluation. Use key informants. If no baseline data is available, it may be possible to source equivalent information through interviews with key informants such as stakeholders or well-informed members of the community. These individuals would ideally have a personal history with the community or the project and may be able to provide reliable anecdotal information.
Ambitions for a new year of evaluation activity don’t have to be hampered by budget constraints. Of course, you do need an evaluation budget. But it can be possible to commission a useful and robust evaluation on a modest one. Adopting a shoestring approach has the added benefit of forcing you to focus on what really matters: asking only the questions you really need answers to. This is good evaluation practice even without any constraints!
Bamberger, M. (2004). Shoestring evaluation: Designing impact evaluations under budget, time and data constraints. American Journal of Evaluation, 25(1), 5–37.