Evaluation reports should include relevant and comprehensive information, structured in a manner that facilitates its use, while also providing transparency about the methods used and the evidence obtained to substantiate the conclusions and recommendations.
Evaluation, by definition, answers evaluative questions, that is, questions about ‘quality’ (how good something is) and ‘value’ (how good it is given the specific situation, such as the resources used to produce the results and the needs it was meant to address). Evaluative reasoning is required to synthesize dimensions of quality and value into defensible (i.e., well-reasoned and well-evidenced) answers to the evaluative questions.
The structure of an evaluation report can do a great deal to encourage the succinct reporting of direct answers to evaluative questions, backed up by enough detail about the evaluative reasoning and methodology to allow the reader to follow the logic and clearly see the evidence base.
The following recommendations will help to set clear expectations for evaluation reports that are strong on evaluative reasoning:
A hallmark of great evaluative reasoning is how succinctly and clearly key points can be conveyed without glossing over important details.
[Source: Davidson, E.J. (2014). Evaluative Reasoning. Methodological Briefs: Impact Evaluation 4. Florence: UNICEF Office of Research.]
The following items are potential outputs from this step. Where possible, it may also be useful to look into other deliverables that have proven effective.
The IDRC evaluation manager is responsible for the tasks described below.
IDRC staff members, partners, interns, or consultants doing evaluation work for IDRC should use the guideline Formatting Evaluation Reports at IDRC to structure the main evaluation report, which should include the following sections:
Executive summary: a brief 1–2 page description of the main findings, methodological approach, and recommendations or conclusions of the evaluation.
Introduction: this should detail the intended user(s) and use(s) of the evaluation process and/or product; what led to the evaluation (e.g., a need or purpose); the specific evaluation questions or issues addressed; the values and principles guiding the evaluation process; and any capacity-building intentions.
Methodology: this should include an analysis of the strengths and weaknesses of the research design, tools and methods used, the process followed, data sources, and people interviewed. It should describe how the project/program stakeholders and the intended user(s) of the evaluation participated in the process. It should also comment on the validity of the evidence and any ethical considerations.
Findings: this section should be formulated according to the evaluation plan and the terms of reference (TOR) of the evaluation study.
The IDRC guideline on data visualization (PDF, 774KB) provides useful tips for making data easier to understand and use.
The quality of the evaluation report is judged by IDRC’s Evaluation Unit against four internationally recognized standards: utility, feasibility, accuracy, and propriety. A copy of “Quality Assessment of IDRC Evaluation Reports” should be given to the evaluator(s) so they understand how the report will be assessed.
Identify the primary intended stakeholders and determine their reporting needs, including their decision-making timelines. Develop a communication plan.