
Tentative Workshop List 2021

Session 1
in week 2 | 2.5 days long
I-a: Qualitative Research Methods for Evaluation
What are the ‘rules of the game’ in drawing on qualitative research methods for evaluation? The workshop covers a range of qualitative methods, including in-depth interviews, focus group discussions, participant observation, participatory tools and Photovoice, and discusses how these methods can be used to improve the impact, quality and accountability of development programmes.
Instructor(s): Morten Skovdal
Recommended for: Evaluators, Policy Makers, Practitioners
Level: Consolidation
I-b: Designing and building results-based Monitoring and Evaluation systems
Participants review and apply the key ‘ten steps’ in building results-based M&E systems. Group work involves planning for a readiness assessment, determining indicators and targets, and preparing a results framework (based on a theory of change) for management and decision-making. Examples of M&E systems and platforms are shared. A short illustrative sketch of how an indicator might be tracked follows this entry.
Instructor(s): Ray C. Rist, Maurya West Meiers
Recommended for: Evaluators, Management, Policy Makers
Level: Specialized
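A minimal sketch, not part of the workshop material, of how a single results-framework indicator with a baseline and target might be represented and tracked; the Indicator class, its fields and the example values are illustrative assumptions rather than a prescribed format.

```python
# Illustrative sketch only: one way a results-framework indicator could be
# represented when building an M&E system. All names and values are invented.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str          # e.g. an outcome-level indicator
    baseline: float    # value at the start of the intervention
    target: float      # value the programme commits to reach
    actual: float      # latest measured value

    def progress(self) -> float:
        """Share of the baseline-to-target distance covered so far."""
        span = self.target - self.baseline
        return (self.actual - self.baseline) / span if span else 0.0

# Example: an outcome indicator tracked against its target
literacy = Indicator("Adult literacy rate (%)", baseline=62.0, target=75.0, actual=68.5)
print(f"{literacy.name}: {literacy.progress():.0%} of the way to target")
```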
I-c: Commissioning Evaluations
The workshop positions evaluation within New Public Management and compares M&E with related concepts such as audit, controlling and quality management. Participants understand the role of evaluations in decision-making and how to take account of power relations within their organization. They are guided through each step of commissioning and managing an evaluation and of promoting its use.
Instructor(s): Reinhard Stockmann, Stefanie Krapp
Recommended for: Commissioners, Management, Policy Makers
Level: Specialized
I-d: Gender-responsive Evaluation: Enhancing empowerment and sustainable development
Participants become familiar with the constitutive elements of gender-responsive evaluations. Gender-responsive data collection tools and their practical implementation are introduced. Topics include: gender, diversity, intersectionality and empowerment; inclusive and participatory approaches; sustainable development and hybrid evaluation models; stakeholder and context analysis; gender indicators and sex-disaggregated data; qualitative participatory data collection methods; gender-responsive reporting.
Instructor(s): Marianne Meier
Recommended for: Evaluators, Commissioners, Policy Makers, Parliamentarians
Level: Specialized-advanced
I-e: Monitoring and evaluating the SDGs: Challenges and proposed solutions
In light of the 2030 Agenda, the development of a coherent M&E system for the SDGs, linked to national development priorities and plans, has become a priority. The workshop introduces key elements of evaluating sustainable development, based on (complex) systems and transformative thinking, and shows how these can be implemented in monitoring and evaluation systems. It offers some first solutions and discusses their practicability in various contexts.
Instructor(s): Wolfgang Meyer
Recommended for: Evaluators, Practitioners
Level: Specialized
I-f: Impact Evaluation in practice with a special focus on the humanitarian sector, environment and climate change
Do you want to know how and when to measure impact credibly in humanitarian, environment and climate change programs? How can you deal with measurement challenges in impact evaluations? If yes, come to this hands-on workshop to get a flavor!
Instructor(s): Jyotsna Puri
Recommended for: Evaluators
Level: Specialized
Session 2
in week 2 | 2 days long
II-a: Collecting quantitative Data
This workshop offers a thorough, step-by-step introduction to basic concepts of measurement theory and quantitative survey design. Participants learn what data to collect, how to design good survey instruments, why and how to pilot them, and which sampling protocols and strategies to use, along with best practices for each. A short illustrative sketch of a standard sample-size calculation follows this entry.
Instructor(s): Urvashi Wattal
Recommended for: Evaluators, Commissioners, Policy Makers
Level: Consolidation
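As a small illustration of the sampling side of this workshop, here is a sketch of Cochran's standard sample-size formula for estimating a proportion, with an optional finite-population correction; the function name and default values are assumptions for the example, not workshop material.

```python
# Illustrative sketch only: sample size for estimating a proportion.
import math
from typing import Optional

def sample_size(p: float = 0.5, margin: float = 0.05, z: float = 1.96,
                population: Optional[int] = None) -> int:
    """Cochran's sample-size formula with an optional finite-population correction."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2          # infinite-population size
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n0)

# Example: 95% confidence (z = 1.96), +/-5 percentage points, 2,000 households
print(sample_size(p=0.5, margin=0.05, population=2000))  # -> 323
```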
II-b: Designing Evaluations
Participants walk through the basic steps in designing evaluations for different kinds of interventions and apply them to their own projects/programs: articulate the logical framework, plan for an evaluation, identify scope, approach and key questions, select the evaluation design and methodology, prepare the evaluation design matrix, identify the most relevant strategies to report and communicate findings, and utilize evaluation findings.
Instructor(s): Amjad Hussein Al-Attar
Recommended for: Evaluators, Commissioners, Management, Practitioners
Level: Consolidation
II-c: Developing Monitoring and Evaluation Systems in Organizations
An M&E system within an organization aims to ensure the quality of the organization's work and to increase its overall steering capability. The workshop covers the necessary requirements, such as an M&E policy, the structural anchoring of M&E, a regulatory framework, quality assurance, a budget, qualified personnel, and defined mechanisms for stakeholder participation and for the use of evaluation results.
Instructor(s): Reinhard Stockmann, Wolfgang Meyer
Recommended for: Management, Practitioners, Commissioners, Evaluators
Level: Specialized
II-d: Using case-studies and case-based design in Evaluation
Participants understand the value and limitations, types and uses of case studies in evaluation. They know how to design a case-based approach, to apply case study methods, and to judge the quality of a case study. Topics include: choosing the right level of analysis, determining the boundaries of the case, developing robust selection criteria, developing comparative and nested approaches when relevant, elaborating a case protocol, and analyzing case study data.
Instructor(s): Linda Morra Imas
Recommended for: Evaluators, Management, Commissioners
Level: Specialized
II-e: Essentials of Private Sector Development Evaluations and the Case for Impact Investment
Participants learn about the specificity and dynamics of private sector development evaluation, and about the methodological approaches and evaluation practices used by Multilateral Development Banks and Development Finance Institutions at the institutional and project level. Case studies illustrate the essentials: evaluating financial intermediaries, equity investments and impact investing.
Instructor(s): Raghavan Narayanan
Recommended for: Evaluators, Commissioners, Management, Policy Makers, Parliamentarians
Level: Specialized
II-f: 101 on Meta-Evaluation, or: how to evaluate Evaluations – Approaches, Methods and Findings
Participants understand the concept and added value of meta-evaluation for strengthening evaluation systems in implementing organizations. They learn how to adequately design and conduct a meta-evaluation and which criteria to apply in doing so. Topics include: designing a meta-evaluation, relevant assessment criteria, and the practical implementation of a meta-evaluation.
Instructor(s): Stefan Silvestrini
Recommended for: Commissioners, Evaluators
Level: Specialized
Session 3
in week 3 | 2.5 days long
III-a: Introduction to Quantitative Data Analysis
This workshop introduces participants to the basics of descriptive as well as inferential statistics. The discussion includes measurement scales, descriptive statistics, official statistics, benchmarking, bivariate statistics, contingency tables, ANOVA, linear regression, and cluster and factor analysis. A short illustrative sketch follows this entry.
Instructor(s): Bassirou Chitou
Recommended for: Evaluators, Commissioners, Policy Makers
Level: Consolidation
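Purely as an illustration of a few of the listed topics (descriptive statistics, a bivariate measure, simple linear regression), the following sketch uses Python's standard library with made-up data; it is not workshop material, and the statistics.correlation and statistics.linear_regression calls require Python 3.10 or later.

```python
# Illustrative sketch only: descriptive, bivariate and regression basics
# on an invented mini-dataset, using the standard library.
import statistics as st

household_size = [3, 4, 2, 5, 4, 6, 3, 4]
monthly_spend  = [210, 260, 150, 310, 280, 400, 220, 270]

# Descriptive statistics
print("mean size:", st.mean(household_size))
print("std dev  :", round(st.stdev(household_size), 2))

# Bivariate statistics: Pearson correlation (Python 3.10+)
print("correlation:", round(st.correlation(household_size, monthly_spend), 2))

# Simple linear regression (Python 3.10+): spend as a function of household size
fit = st.linear_regression(household_size, monthly_spend)
print(f"spend ~ {fit.intercept:.1f} + {fit.slope:.1f} * size")
```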
III-b: Essentials of Theory-Based Evaluation
Program theories guide the design and implementation of policy interventions and constitute an important basis for evaluation. Participants learn about the essentials of using program theory in evaluation. Topics include: what we mean by program theory; why it is important in evaluation; what its sources are; how to develop a program theory; how to use it as a framework for evaluation; and what the main challenges are.
Instructor(s): Jos Vaessen
Recommended for: Evaluators, Commissioners, Policy Makers
Level: Consolidation
III-c: Making Evaluation useful and actually used
Participants learn the factors and processes that enhance evaluation use based on 40 years of research on utilization.
Instructor(s): Michael Patton
Recommended for: Evaluators, Management, Policy Makers, Practitioners, Commissioners
Level: Consolidation
III-d: Fundamentals of Rigorous Impact Evaluation: Everything you wanted to know but were afraid to ask
This workshop provides answers to ‘everything you wanted to know but were afraid to ask’ about rigorous quantitative (experimental and quasi-experimental) impact evaluation: concepts, methods, scope and limitations.
Instructor(s): Claudia Maldonado
Recommended for: Evaluators, Commissioners, Policy Makers, Practitioners
Level: Specialized
III-e: National Evaluation Systems in the public service: Institutionalizing evidence informed policymaking
What is the institutionalisation of evaluation in the public service, why is it important in public policy, and how can evaluation be institutionalised within the public service? The workshop explores evaluation policy, approaches to building the capacity to supply and demand evaluations, evaluation standards, and approaches to ensuring the use of evaluation. Participants learn from case studies of different countries and explore ways to implement these approaches in their own countries.
Instructor(s): Matodzi Michelle Amisi, Ian Goldman
Recommended for: Evaluators, Management, Policy Makers, Practitioners
Level: Specialized
III-f: Emerging technology and its use in Monitoring, Evaluation and Learning
Are you wondering how new and emerging technologies can be applied in your monitoring, evaluation and learning practice? We introduce a framework for evaluating the utility of different types of technology and for understanding when each might (and might not) be most effective to use. We review the pros and cons of different types of technologies and the human and financial costs of deploying them.
Instructor(s): Kerry Bruce
Recommended for: Practitioners, Evaluators, Commissioners
Level: Specialized
Session 4
in week 2 | 2 days long
IV-a: Qualitative Data Analysis
Participants learn to organize qualitative data more efficiently for evaluations using computer-aided qualitative data analysis. The basics of qualitative data analysis are introduced using first-cycle and second-cycle coding methods. We use widely available commercial software to organize, code, and analyze data according to evaluation questions. We also cover how to interpret the results and present the findings in ways that are easily understood and visually appealing.
Instructor(s): Kerry Bruce, Arunjana Das
Recommended for: Commissioners, Evaluators
Level: Consolidation
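A minimal sketch of the coding logic described in this entry: first-cycle codes attached to transcript segments are grouped into second-cycle themes and tallied by theme. The quotes, codes and themes are invented for the example; dedicated qualitative analysis software handles this at scale.

```python
# Illustrative sketch only: tallying coded interview segments by theme.
from collections import Counter

# First-cycle coding: short descriptive codes attached to transcript segments
coded_segments = [
    ("The clinic is too far to walk to.",          "distance"),
    ("We cannot afford the bus fare every week.",  "cost"),
    ("Staff were friendly but the wait was long.", "waiting time"),
    ("Medicines are often out of stock.",          "stock-outs"),
    ("I skip visits when there is no money.",      "cost"),
]

# Second-cycle coding: first-cycle codes grouped into broader themes
themes = {
    "distance": "access barriers",
    "cost": "access barriers",
    "waiting time": "service quality",
    "stock-outs": "service quality",
}

theme_counts = Counter(themes[code] for _, code in coded_segments)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} segments")
```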
IV-b: Communication and Reporting – How to increase the impact and utility of your Evaluation
Participants learn about the essential elements that underpin effective evaluation communication, dissemination and reporting, using real examples and case studies. The workshop focuses on strategies and tactics that increase the likelihood that key stakeholders will absorb and act on evaluation findings.
Instructor(s): Daniel Musiitwa
Recommended for: Evaluators, Management, Policy Makers, Practitioners
Level: Consolidation
IV-c: Culturally and contextually responsive Evaluation
Participants develop an awareness of how cultural differences may affect the design and implementation of an evaluation, as well as the dissemination of evaluation results. They learn how to apply context- and culturally responsive approaches to the design and evaluation of projects and how to review projects for cultural and contextual responsiveness.
Instructor(s): Bagele Chilisa
Recommended for: Evaluators, Commissioners, Parliamentarians, Policy Makers, Practitioners
Level: Consolidation
IV-d: Theory-based causal Analysis: Process Tracing, QCA and Pattern-Matching in practice
This workshop covers the following topics: review of various causal theories and their affiliated theory-based evaluation methods; fundamentals of designing a theory-based causal analysis; applications using three case-based approaches: (1) process tracing; (2) pattern-matching; (3) Qualitative Comparative Analysis.
Instructor(s): Estelle Raimondo
Recommended for: Evaluators
Level: Specialized-advanced
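To illustrate the truth-table logic behind crisp-set Qualitative Comparative Analysis mentioned in this entry, here is a minimal sketch with invented cases scored 0/1 on two conditions and an outcome; the case names, conditions and consistency calculation are assumptions for the example, not the workshop's own material.

```python
# Illustrative sketch only: a simple crisp-set QCA truth table from invented data.
from collections import defaultdict

cases = {
    #           ownership  funding  outcome
    "Case A":  (1,         1,       1),
    "Case B":  (1,         1,       1),
    "Case C":  (1,         0,       0),
    "Case D":  (0,         1,       0),
    "Case E":  (0,         0,       0),
}

# Group cases by configuration of conditions and record their outcomes
truth_table = defaultdict(list)
for name, (own, fund, outcome) in cases.items():
    truth_table[(own, fund)].append(outcome)

print("ownership funding  consistency  n")
for (own, fund), outcomes in sorted(truth_table.items(), reverse=True):
    consistency = sum(outcomes) / len(outcomes)  # share of cases showing the outcome
    print(f"{own:^9} {fund:^7}  {consistency:^11.2f}  {len(outcomes)}")
```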
IV-e: Evaluation in the UN
The workshop uses real examples and practical exercises to demonstrate how the largest evaluation office in the UN system approaches evaluation: design and methods, analytical tools, the use of theories of change, dealing with real-world challenges, the SDGs, gender equality and human rights perspectives, the quality of evidence and the practical aspects of presenting findings, conclusions and recommendations.
Instructor(s): Ana Rosa Soares, Richard Jones
Recommended for: Evaluators, Practitioners
Level: Specialized
IV-f: Digital Analytics for Monitoring and Evaluation
How does one make sense of new digital data, applying data science methods while adhering to rigorous evaluation protocols? This workshop provides a thorough grounding in developing a robust M&E approach for digital projects and in revising existing workflows in response to the unique digital environment. Participants practice data analysis for large digital datasets and discuss how to incorporate these techniques in programmatic evaluations. A short illustrative sketch of such an aggregation follows this entry.
Instructor(s): Kecia Bertermann, Claudia Abreu Lopes, Calum Handforth
Recommended for: Evaluators, Practitioners
Level: Specialized
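A minimal sketch, assuming pandas is installed, of the kind of aggregation a much larger digital usage export might undergo for monitoring purposes; the column names and values are invented for the example and do not reflect any particular platform.

```python
# Illustrative sketch only: summarising a tiny, invented digital usage log
# into indicator-style figures per region.
import pandas as pd

log = pd.DataFrame({
    "user_id":  [101, 101, 102, 103, 103, 103, 104],
    "region":   ["North", "North", "South", "South", "South", "South", "North"],
    "sessions": [3, 2, 5, 1, 4, 2, 6],
})

# Indicator-style summary: active users and total sessions per region
summary = log.groupby("region").agg(
    active_users=("user_id", "nunique"),
    total_sessions=("sessions", "sum"),
)
print(summary)
```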