II-b: Designing Evaluations

Part of
Session Two
July 21 - 22
Recommended for
Commissioners, Evaluators, Management, Practitioners


This workshop will take participants through the journey of designing an evaluation. The journey begins with building a solid understanding of the intervention being evaluated: identifying a strong program, gathering evidence that supports the intervention, and developing a logic model that clearly communicates the intervention's central model. Participants will then examine the intervention's implementation process to ensure fidelity to the central program model, evaluate program quality and efficiency, and establish continuous process improvement protocols. The final phase of the journey will allow participants to identify results (outputs and outcomes) by obtaining evidence of positive intervention outcomes and examining the linkages between intervention activities and outcomes. The highest level of evidence allows an intervention to claim to be evidence-based by attaining strong evidence of positive intervention outcomes. Ultimately, this journey will enable participants to validate whether or not the intervention has established a causal linkage between program activities and intended outcomes/impacts.
The workshop will build on the theoretical background participants have in evaluation, i.e. planning evaluations, commissioning evaluations and participating in evaluations. It will provide participants with a real learning opportunity by exposing them to actual evaluations conducted by real organizations for different types of development interventions. Moreover, the workshop will introduce the different evaluation methodologies adopted to evaluate a wide range of interventions.


The workshop aims to provide participants with hands-on training on how to design evaluations for different types of interventions: projects, programs and policies. It will present an overview of the basic steps in designing an evaluation and will enable participants to discuss evaluation plans and how the evaluation steps apply to their own projects/programs. The workshop will include specific information and guidance on evaluation design and methodology, including individual data collection techniques and tools, sampling strategy, data analysis and triangulation.

Participants will be able to:

  • Define the basic steps for conducting an evaluation and articulate the Logical Framework and the Theory of Change.
  • Plan for an evaluation and identify its scope and the evaluability of interventions.
  • Identify the key components of an evaluation plan and the key research questions (main evaluation questions and sub-questions).
  • Build the theoretical approach, select the evaluation design, and collect and analyze data.
  • Prepare the Evaluation Design Matrix.
  • Identify the most relevant strategies to report and communicate findings.
  • Utilize evaluation findings and incorporate lessons learned for improvement and future planning.

Recommended for

This workshop is designed for evaluators, evaluation commissioners, program managers, donors, government officials and others who have limited experience in performing practical ex-post evaluations and who have participated in the core course or similar training. It is of interest to professionals in national and international organizations, the public sector, and development organizations who are responsible for implementing development interventions or have the mandate to evaluate such interventions.


This workshop suits beginner and intermediate-level practitioners who have a theoretical background and knowledge in evaluation and have taken part in planning, commissioning and managing evaluations.


Knowledge of the basics of evaluation (e.g. as provided by the IPDET core course or similar training) is highly desirable.