If you’ve ever wished you had more resources to conduct an evaluation, you’re not alone. But if you’ve ever felt you shouldn’t proceed with commissioning an evaluation because of constraints on resources, think again. Renowned American evaluator Michael Bamberger has written extensively about how to conduct evaluations with limited resources. He calls it a Shoestring approach to evaluation.
Bamberger (2004) identified three common constraints when commissioning an evaluation, and ways you may be able to overcome them as a commissioner, so that the evaluation is not compromised in quality, validity or reliability.
1. Time constraints
Time constraints exist when the evaluator is not called in until the project is well advanced and the evaluation needs to be conducted in a shorter than ideal time period.
Simplify the evaluation design. It may be possible to use less robust designs without compromising the quality of the evaluation findings if your evaluator approaches the evaluation with clarity and simplicity.
Reduce the sample size. If necessary, reduce the number of surveys or interviews. This may result in a lower level of precision, but it may not compromise the evaluation findings. Consider using alternative sampling approaches such as stratified sample designs or cluster sampling.
2. Budget constraints
Budget constraints can arise when funds for the evaluation were not included in the original project budget. With a limited budget, it may not be possible to conduct the evaluation as extensively and thoroughly as you’d hoped, particularly when it comes to data collection.
Reduce the cost of data collection. Have your evaluator spend considerable time up front designing a detailed evaluation framework and how-to manual to enable you to conduct the data collection yourself. I am currently working this way with Four Winds in an evaluation of The Bermagui Project, an arts-based program on the far south coast of NSW.
Use self-administered questionnaires. Reduce the length and complexity of surveys. Data may be collected by volunteers within the organisation (if fully trained and briefed), university students, or local community residents. Direct observation may be able to replace the need for larger sample sizes. Work with key informants such as well-connected community members or key stakeholders. Use multi-method approaches so that independent estimates of key variables may make it possible to reduce sample sizes, while at the same time increasing reliability and validity. If practicable, consider conducting focus groups rather than in-depth interviews.
Simplify and speed up data input and analysis. Strategies include entering data directly through notebook computers or other handheld devices, or reorganising project monitoring records and data collection forms so that the information on the forms can be more easily and rapidly analysed.
3. Data quality constraints
When the evaluation does not start until late in the project cycle, there is usually little or no baseline data available on the condition of the community before the start of the initiative. Project records may be available, but they may be incomplete or may have been poorly recorded or kept. This makes it difficult for the evaluator to explore what life for the communities of interest was like before the program.
Talk with your evaluator early in the process so they can ascertain the type of data you will need. This sort of discussion can reduce the need for the collection of data that is not absolutely essential for the evaluation.
Use key informants. If no baseline data is available, it may be possible to source equivalent information through interviews with key informants such as stakeholders or well-informed members of the community. These individuals would ideally have a personal history with the community or the project and may be able to provide reliable anecdotal information.
Even with time, budget or data constraints, it can be possible to commission a useful and robust evaluation without threatening the validity or reliability of the evaluation findings. Adopting a shoestring approach as outlined by Bamberger (2004) has the added benefit of forcing focus on what really matters in your inquiry: asking only the questions that you really need answers to, and working efficiently. This is good evaluation practice even without any constraints!
Bamberger, M. (2004). Shoestring evaluation: Designing impact evaluations under budget, time and data constraints. American Journal of Evaluation, 25(1), 5–37.