Research is the nucleus of our activities. Our contributions to policy dialogue, stakeholder engagement, and advocacy are informed by the findings of our research. Policy recommendations are not made in a vacuum: the policies we recommend and the advocacy we support are grounded in the scientific evidence emanating from our research activities. Our areas of competence include:

  • Health economics: resource allocation in health care; demand-side interventions such as pay-for-performance; universal health insurance; health expenditure and outcomes; and economic evaluation, including health technology assessment, cost-effectiveness analysis, and budgetary analysis
  • Mixed-methods impact evaluation: We employ a mixed-methods approach to impact evaluation, combining the quantitative and qualitative aspects of an evaluation. Impact evaluation seeks to answer the questions of ‘what works’, ‘why’, and ‘under what circumstances’ in development programmes, policies, or interventions.

The quantitative aspect attempts to answer the ‘what works’ question using three main classes of methods: experimental (randomised controlled trials); quasi-experimental (e.g. regression discontinuity design, propensity score matching); and non-experimental (e.g. instrumental variables, panel data methods, interrupted time series, endogenous switching regression). The qualitative aspect focuses on the ‘why’ and ‘under what circumstances’ questions, using stakeholder engagement techniques such as focus group discussions. The mixed-methods approach allows an in-depth understanding of policy impact by exploring the views and perceptions of key stakeholders, including policy makers, beneficiaries, and non-beneficiaries.
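As a minimal illustration of the ‘what works’ question, the sketch below computes a difference-in-differences estimate, one common quasi-experimental technique, on a small synthetic dataset. All numbers and names here are hypothetical, not drawn from our studies:

```python
# Difference-in-differences on synthetic data (illustrative only).
# Each record: (treated?, post-intervention?, outcome).
records = [
    (True,  False, 10.0), (True,  False, 12.0),   # treated group, baseline
    (True,  True,  18.0), (True,  True,  20.0),   # treated group, endline
    (False, False, 11.0), (False, False, 13.0),   # comparison group, baseline
    (False, True,  14.0), (False, True,  16.0),   # comparison group, endline
]

def mean(xs):
    return sum(xs) / len(xs)

def did_estimate(rows):
    """Change over time in the treated group minus the change
    in the comparison group (parallel-trends assumption)."""
    grp = lambda t, p: [y for (tr, po, y) in rows if tr == t and po == p]
    treated_change = mean(grp(True, True)) - mean(grp(True, False))
    control_change = mean(grp(False, True)) - mean(grp(False, False))
    return treated_change - control_change

print(did_estimate(records))  # treated +8, comparison +3 → estimate 5.0
```

The estimate nets out common time trends, which is why a comparable non-beneficiary group is collected alongside beneficiaries.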

Thematic areas of our competence in impact evaluation include:

  • Social protection
  • Early childhood education
  • Private sector development
  • Informal sector development
  • Health and Nutrition
  • Water and Sanitation
  • Weather and climate change adaptation

We always welcome potential collaboration in these thematic areas.

Completed research and methods used

We have completed research in the following areas:
  • Non‐contributory and unconditional cash transfer for the elderly (Old age pensions)
  • Childhood education and labour market outcomes
  • Complementarities in microcredit markets
  • Microcredit, household vulnerability and women empowerment
  • Quality of institutions and women empowerment
  • Empirical industrial organisation
  • Population ageing and social security

Impact evaluation design

The procedure for impact evaluation design at IEBDEM includes:
  • Clarifying programme objectives with the client
  • Developing the theory of change
  • Determining the evaluation design
  • Designing the sample
  • Designing and pre-testing data collection instruments
  • Setting the timeline
  • Addressing ethical considerations
  • Clarifying potential issues with the client
  • Developing the evaluation protocol
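The sampling design step typically rests on a power calculation. A minimal sketch for a two-arm comparison of means under a normal approximation follows; the effect size and defaults are illustrative, not a recommendation for any particular study:

```python
# Sample size per arm for detecting a mean difference `delta` between
# two equal-sized arms, given outcome standard deviation `sigma`:
#   n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2
import math
from statistics import NormalDist

def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
    """Required sample size per arm (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    n = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
    return math.ceil(n)

print(n_per_arm(delta=0.5, sigma=1.0))  # 63 per arm for a 0.5 SD effect
```

In practice this baseline figure is inflated for anticipated attrition and, in clustered designs, for the design effect.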

Our qualitative design methods include:

  • Focus group discussions
  • Participant interviews
  • Direct observation

Field survey and data collection

A field survey for data collection is an essential part of an impact evaluation design. Our activities include:

  • Questionnaire design and pre‐test
  • Enumerator training, including use of field survey software
  • Field monitoring to ensure compliance with protocol
  • Scoping survey
  • Data entry