Carrots and sticks: Results and evidence artefacts and their effects
We are inviting case studies as evidence about the politics of evidence, for sharing at next April’s conference. Please send us your experiences. If you prefer your name not to be published, we will help you edit the story to protect the anonymity of people and organisations. Go to the end of this post for details.
The discourses of ‘results’ and ‘evidence’ are becoming increasingly prominent in the planning, appraisal, implementation, and monitoring and evaluation of development projects and programmes. ‘Evidence’ is about proof of ‘what works’ (or doesn’t work) in tackling a problem. Evidence leads to action – intervention or treatment of the problem – that delivers ‘results’, which are reported upon and possibly also evaluated. The two discourses share a particular understanding of causality, efficiency and accountability, originating in – and still most prevalent in – countries with an Anglo-Saxon empiricist tradition. However, through the dominance of English-language-based global institutions such as the World Bank, the terminology and related practices are spreading widely within the international development sector and into aid-receiving countries.
Results and evidence discourses shape our working practices through what we are calling ‘artefacts’. Artefacts are the protocols and procedures created to implement the ideas embedded in ‘results’ and ‘evidence’ discourses. A well-known artefact is logical framework analysis. Artefacts become powerful when used as incentives (carrots) and mandatory requirements (sticks). Just as the boundary between the results and evidence discourses is fuzzy and shifting, so the use of these artefacts can change in relation to time and context. By and large, however, results artefacts are used today to plan, implement, monitor and report for accountability purposes, e.g.:
- Results reports
- Performance measurement indicators
- Logical framework analysis
- Theory of Change
- Base-line data
- Progress reviews.
Evidence artefacts are used to choose what to do in relation to ‘best practice’, to make value-for-money decisions and to evaluate impact, e.g.:
- Randomized control trials
- Systematic reviews
- Cost-effectiveness analysis
- Option appraisal
- Social return on investment
- Business case
- Impact evaluation.
We are often so accustomed to using one or more of these artefacts that no external control is required to ensure our compliance. It is also common for an organisation to voluntarily adopt one of these artefacts in the absence of any mandatory donor requirement. Even when their use is mandatory, a grant-receiving organisation may be more demanding and controlling in how they are used than was ever envisaged by the people who conceived of them. For example, some would argue that the logframe was meant to stimulate thinking, whereas today it is mainly used as a strict accountability framework.
Thus artefacts can take on a life of their own, independent of the authority that initially required their use. Whether we find an artefact useful in our endeavours will influence how we feel about it. Our personality, experience and kind of job – including for example if we are independent consultants working for different organisations – may also influence our response. But the emotional and power effects of such artefacts also depend on the kind of organisation we work for, not only its position in the aid chain but also on its institutional culture and leadership. For example, different recipients of a similar grant from the same funding agency may differ widely in their attitudes and response to identical mandatory requirements.
Results and evidence artefacts influence our work experience in the international development sector: what we are trying to achieve, how we spend our time, and how we relate with our colleagues, partners, grantees, donors and the wider public. What has been your experience? Please send us your case studies of the effects of any one of the artefacts listed above – or of another results or evidence artefact of which you have personal experience.
We are looking for…
- A case of a specific artefact’s effects in practice. (Note that we are not looking for a commentary on an artefact’s theoretical strengths and limitations, but rather for stories of the effects you have observed.)
- Between 400 and 700 words would be ideal.
- Firsthand experience (not hearsay) from the international development sector.
We need your email address, but we won’t share it with anyone else if you tell us you want to stay anonymous. If anonymity is needed, please give the organisations and individuals described in your story pseudonyms, and change some of the context if necessary. Consult us if you would like help in doing this.