Workshops (Week 2)

Two levels of workshop cater to participants' needs and interests, spanning the fundamentals of evaluation as well as advanced, cutting-edge topics, and covering both theoretical issues and practical application. Each workshop lasts 2 or 2½ days. Participants may attend the whole week or register for single workshops.

Workshop requirements and expectations

The workshops take place in the second week of the On-Site program and comprise two sessions. Each session offers six parallel workshops, and participants choose one workshop per session:

  • Session one: July 22 – 24, 2024
  • Session two: July 25 – 26, 2024

The workshop week consists of two levels:

Consolidation level: Solidify and deepen conceptual knowledge and enhance practical skills in evaluation basics as a follow-up to the core course. Please see the topics in the blue boxes.

Specialized level: Explore methodological and thematic topics at introductory, intermediate, and advanced levels, including cutting-edge and current development issues. Please see the topics in the yellow and orange boxes.

Please click the boxes below for each workshop's detailed description, objectives, and requirements; read them carefully to find the workshops that best match your needs.

It will not be possible to change the workshop selection during the program!

Certificate

Participants will be awarded an official IPDET workshop certificate of attendance during graduation night at the end of each week.

Session 1
in Week 2 | July 22 - 24 | 2.5 days long
I-a: Quantitative Data Collection: Planning and Implementing Surveys
How do you plan and implement a survey? What are the challenges and pitfalls, and how can you overcome them? In this workshop, you will enter the world of quantitative data collection. You will learn how to set up a survey, design a questionnaire, and yield valid and reliable results. You will be provided with basic methodological knowledge, real-world examples, and hands-on practical group work. So, if you plan to conduct a survey anytime soon, be sure to check out this workshop!
Instructor(s): Stefan Silvestrini
Recommended for: Evaluators, Commissioners, Policy Makers
Level: Intermediate
I-b: Designing Evaluations
An evaluation is an assessment of a policy, program or project and serves the purposes of learning and accountability. To achieve these purposes, evaluations need to be purposely designed to enhance their potential utility while at the same time ensuring rigor. This workshop will discuss the foundations of evaluation design. Participants will develop a sound understanding of the building blocks of evaluation design on the basis of structured discussions and exercises around real-world evaluation examples.
Instructor(s): Jos Vaessen
Recommended for: Evaluators, Commissioners, Management, Practitioners
Level: Introductory to intermediate
I-c: Quantitative Impact Evaluation
Are experimental methods the gold standard of evaluation? Are there other valid and useful methods to answer questions about impact? What does that mean and why should we care? If you are interested in impact evaluation, but too afraid (of math and statistical formulae) to ask, this is the workshop for you.
Instructor(s): Claudia Maldonado
Recommended for: Evaluators, Commissioners, Policy Makers, Practitioners, Activists
Level: Intermediate
I-d: Theory-based Causal Analysis: the Basics of Process Tracing and QCA
This workshop covers the following topics: review of various causal theories and their affiliated theory-based evaluation methods; fundamentals of designing a theory-based case-based causal analysis; applications using within case causal analysis such as Process Tracing; and applications to enhance generalizability of causal claims through cross-case causal analysis (including brief introduction of QCA).
Instructor(s): Estelle Raimondo
Recommended for: Evaluators, Researchers
Level: Intermediate to advanced
I-e: From Data to Decisions: Integrating Machine Learning in Evaluation
The workshop introduces the fundamentals of integrating big data science and machine learning algorithms into your evaluation approach. You'll learn about Bayesian theory, predictive and prescriptive analytics, and how to address selection and algorithmic bias. You will be guided through an interactive step-by-step process of building evaluation models with primary and secondary datasets using an open-source, no-cost, no-code visual-based analytics platform.
Instructor(s): Peter York
Recommended for: Evaluators, Practitioners, Policy Makers
Level: Introductory
I-f: Evaluation at the Nexus of Environment and Development
For evaluation to remain relevant in today’s world, it must recognize the interconnectedness of the human and natural systems. As evaluators, we must see the larger picture in which the interventions we evaluate operate. We can no longer evaluate projects and programs in isolation, based only on their internal logic, as if they existed in a vacuum.
Instructor(s): Juha Uitto, Geeta Batra, Anupam Anand
Recommended for: Evaluators, Commissioners, Parliamentarians, Policy Makers, Practitioners, Activists
Level: Intermediate
Session 2
in Week 2 | July 25 - 26 | 2 days long
II-a: Introduction to Quantitative Data Analysis
This hands-on workshop will allow you to present your evaluation findings in a professional, convincing, and pleasing manner using tools such as descriptive statistics, bivariate analysis, ANOVA, regression analysis, and data visualization. It will introduce you to evidence-based decision-making tools that allow you to confidently demonstrate whether the program you evaluated works.
Instructor(s): Bassirou Chitou
Recommended for: Evaluators, Commissioners, Policy Makers
Level: Introductory to intermediate
II-b: From Insights to Influence – Communication Strategies to Amplify Evaluation Utility
Are you tirelessly crafting evaluation reports, only to see them gather dust without making a real impact? Too often, such reports lack the narrative and creative spark that ignites action and influence. Here, you'll learn to turn your data into compelling stories. We'll equip you with innovative storytelling techniques and communication strategies tailored to various stakeholders. Our goal? To ensure your evaluation findings don't just inform but are utilised to inspire action and meaningful change.
Instructor(s): Ann Murray Brown
Recommended for: Evaluators, Management, Policy Makers, Practitioners, Researchers, Commissioners
Level: Introductory to intermediate
II-c: Developing Monitoring and Evaluation Systems in Organizations
An M&E system within an organization aims to ensure the quality of the organization's work and to increase its overall steering capability. The workshop covers the necessary requirements, such as an M&E policy, a structural anchoring of M&E, a regulatory framework, quality assurance, a budget, qualified personnel, and defined mechanisms for stakeholder participation and for the use of evaluation results.
Instructor(s): Reinhard Stockmann, Wolfgang Meyer
Recommended for: Management, Practitioners, Commissioners, Evaluators
Level: Introductory
II-d: Evaluating Humanitarian Action: Steps, Challenges, and Real-time Learning
What sets apart evaluations of humanitarian action? This workshop offers an overview of the critical steps and challenges in evaluating humanitarian action and how to address them; a deep dive into real-time learning-oriented evaluation, using case studies from Ukraine and elsewhere; practitioner insights into the ethical aspects of humanitarian evaluation; and cross-cutting considerations for evaluations, including accountability to affected persons and localization.
Instructor(s): Dmytro Kondratenko, Hana Abul Husn, Margie Buchanan-Smith
Recommended for: Evaluators, Management, Commissioners, Practitioners
Level: Introductory to intermediate
II-e: Designing and Managing Projects For Results and Evaluability
What if we could build an “evidence eco-system” around our projects and programs to make them more effective, and our evaluations more useful and insightful? Evidence to support evaluations should be identified as early as the project design stage and evolve throughout the project life cycle. In this workshop, participants will come to understand how an integrated approach to building an evidence-based and results-oriented design is critical for project success and more effective evaluations.
Instructor(s): Stephen Pirozzi, Kamal Siblini
Recommended for: Practitioners, Evaluators, Commissioners
Level: Introductory to intermediate
II-f: Evaluation of Climate Change and Development
Based on current concepts, frameworks, practical examples, and interactive group work, this workshop covers methods and tools for evaluating climate finance and development impact, including evidence syntheses, advanced case study analysis, vulnerability assessments, and geospatial evaluation.
Instructor(s): Sven Harten, Martin Noltze, Alexandra Köngeter
Recommended for: Evaluators, Commissioners, Activists
Level: Intermediate