In the search for new, more rigorous and more appropriate methods for development evaluation, one key task is to understand the strengths and weaknesses of a broad range of methods. This report contributes to that effort by focusing on the potential and pitfalls of Qualitative Comparative Analysis (QCA).
The report aims to provide a self-contained eight-step how-to guide to QCA, built on real-world cases. It also discusses issues relevant to commissioners of evaluations using QCA, in particular how to quality-assure such evaluations.
The report was presented at the seminar Impact Evaluation using Qualitative Comparative Analysis (QCA).
- QCA can drastically shorten the distance between qualitative and quantitative methods. It can be used to analyse both small sets of cases (as few as three) and larger ones.
- QCA is a useful tool for theory development and can be relatively inexpensive, since it often makes the most of existing resources and data.
- QCA can synthesise case-based findings and assess the extent to which they can be generalised.
- QCA allows an understanding of what works best for different groups, under different circumstances, and in different contexts.
- QCA is ideally suited to capturing causal asymmetry: causal factors that, although possibly strongly and consistently associated with an outcome, are only necessary but not sufficient for it, or only sufficient but not necessary.
- A Quality Assurance checklist is recommended to ensure that the opportunities offered by the method are seized and its pitfalls avoided.
- The report draws attention to several pitfalls, challenges and limitations: for example, the need for consistently available data across comparable cases; the need for technical skills in the evaluation team; the relative unpredictability of the number of iterations needed to reach meaningful findings; and, finally, the need to make sense of the synthesis output, which can be done in many ways, including by drawing on other evaluation approaches such as Contribution Analysis, Realist Evaluation and Process Tracing.
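To illustrate the causal-asymmetry point above: in crisp-set QCA, sufficiency and necessity are asymmetric set relations. With binary data, a condition X is sufficient for outcome Y when the X-cases are a subset of the Y-cases, and necessary when the Y-cases are a subset of the X-cases. The minimal sketch below uses invented data (not drawn from the report) and the standard set-theoretic consistency measures.

```python
# Hypothetical illustration with invented data: consistency measures
# for sufficiency and necessity in crisp-set QCA.

def consistency_sufficiency(x, y):
    """Share of X-cases that also show the outcome: |X and Y| / |X|."""
    with_x = [i for i in range(len(x)) if x[i] == 1]
    if not with_x:
        return None
    return sum(1 for i in with_x if y[i] == 1) / len(with_x)

def consistency_necessity(x, y):
    """Share of outcome cases that also show X: |X and Y| / |Y|."""
    with_y = [i for i in range(len(y)) if y[i] == 1]
    if not with_y:
        return None
    return sum(1 for i in with_y if x[i] == 1) / len(with_y)

# Six invented cases: the condition is present in cases 0-2,
# the outcome in cases 0-4.
x = [1, 1, 1, 0, 0, 0]
y = [1, 1, 1, 1, 1, 0]

print(consistency_sufficiency(x, y))  # 1.0 -> X is sufficient for Y
print(consistency_necessity(x, y))    # 0.6 -> X is not necessary for Y
```

Here every case with the condition shows the outcome (sufficiency), yet two outcome cases lack the condition (no necessity): the two relations come apart, which is exactly the asymmetry a correlational measure would miss.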
Barbara Befani, Researcher/Consultant