Designing and implementing an evaluation requires both subject-matter knowledge and a solid methodology. However, training opportunities in evaluation methodology are scarce and often cover only a few of the many aspects involved. Especs is therefore developing training modules which, together, will form a comprehensive and thorough basis for the design and implementation of evaluations.
Characteristics of Especs training in evaluation
Our course programme on evaluation techniques and skills consists of modules, some of which are being developed in cooperation with other institutes. The main contents of these modules and the intended trainees are presented below. The findings of an evaluation should meet the RRV criteria (Relevant, Reliable, Valid); meeting these criteria is, or should be, the aim of any serious training effort on evaluation techniques. In addition, the Especs approach is characterised by our efforts to:
- maintain a practical orientation
- identify and include the European dimension and priorities
- emphasise consultative processes
- adapt the training for the specific needs of certain clients
- pay attention to political, institutional and other aspects of evaluation that are important yet often poorly understood, for example when evaluating sector-wide approaches or PRSPs.
Students in social research often learn much about how to design research but little about Terms of Reference (ToR). Yet it is on the basis of ToR that most evaluations are commissioned. Similarly, many scientists know much about the technicalities of their specialisation but very little about team-building and working within multidisciplinary teams, budgeting, involving stakeholders, etc. Such practical aspects often have a decisive influence on the evaluation process, its outcome and, not least, its cost. It is difficult to indicate here in detail how this practical approach is integrated in the modules, as it in fact pervades all of them: from the ToR, to sampling, the selection of members of the evaluation team, how (and how not) to identify and involve stakeholders, working with indicators, reporting, etc.
European policies and priorities
Especs explicitly includes the priorities and policies of the EC, and the wealth of experience of European bilateral, NGO and other major development agencies, which have long and unique historical ties with the developing world. This inclusion is all the more relevant given the paradigm shift from project to programme approach, and towards Poverty Reduction Strategies and Sector-wide Approaches supported by donor consortia. We propose an evaluation methodology that ensures that the different policy orientations become integral, not subsidiary, elements in a multi-sector, multi-donor programme or PRS evaluation exercise. Our methodology and approach aim at:
- clarifying, acknowledging and integrating different policies and capitalising on the respective strengths and comparative advantages of the various donors and stakeholders, so as to arrive at a more balanced evaluation exercise, more effective aid, and stronger ownership by all stakeholders. Integration, not uniformity, of the various evaluation priorities is to draw on Europe's diversity of policies and experiences.
- offering practical and process-related suggestions, to ensure that European stakeholders act more effectively as members of a mixed loans/grants multi-donor consortium evaluation team. We expect to contribute to a Euro-American cross-fertilisation process through our network of evaluation stakeholders, with useful, practical and field-tested evaluation ideas and approaches.
Emphasising consultative processes
Although the involvement of important stakeholders in projects and programmes is widely proclaimed as important, in many evaluations such involvement is in fact a pro-forma exercise, and very little time is made available for such consultations.
- Our first contribution is to equip evaluators with the understanding and the skills to assess the quality and effects of (the absence of) processes of inclusion and consultation.
- In addition, to make sure that the conclusions and recommendations of an evaluation exercise are owned by the stakeholders, consultation and active participation during the exercise are necessary. To realise this kind of involvement it is not sufficient to mention it in the ToR: the design and organisation of the evaluation should be such that the evaluation exercise itself has a high content of participation, self-evaluation and feedback.
Adapting the training to the specific needs of clients
There is a widespread need to strengthen the capacity of development organisations, especially those in developing countries, to prepare and conduct evaluations themselves. For such capacity strengthening, a training can be devised by combining the modules most relevant to these organisations and adapting course materials, including case studies, as required. The build-up of evaluation capacity at the national level, and the emergence of Country-Led Evaluations (CLE), will also be given due attention in the Especs modules. It may be useful to combine such training with a properly commissioned, ongoing evaluation exercise, so that participants have a direct opportunity to practise what they have learned (learning by doing).
Political, institutional and other elements in the evaluation process
Evaluation is, by definition, not a value-free undertaking. It is therefore important to have insight into all major factors affecting evaluation processes, in order to design and organise an evaluation in accordance with the expectations formulated by the major stakeholders. For example, when analysing the procedures, processes and outcomes of PRSPs (Poverty Reduction Strategy Papers) as stimulated by the World Bank, the voices of both the intended beneficiaries and the bilateral co-financing organisations are often found to have little effect on the course of events: the design of the PRSP itself, the underlying bureaucratic and financial procedures, and political and institutional factors make it difficult to contribute more than in cosmetic terms. Insight into these contextual factors should help evaluators design, organise or contribute to evaluations in such a way that their expectations and contributions are more realistic. These insights come to the fore in a number of modules; in addition, we are developing a special module on the evaluation of sector-wide approaches and PRSPs.