What we’ve learnt doing mixed-methods behavioural research in Uganda

Written by Samuel Sharp

Explainer

Understanding post-conflict settings often requires a mixed-methods approach to research.

Over the last few months I’ve been part of a project in Northern Uganda with the Secure Livelihoods Research Consortium (SLRC), the Busara Center for Behavioural Economics and Narrate. Its research design is unusual in combining qualitative, quantitative and experimental methods.

All of this explores questions of behaviour: how do past experiences of conflict affect behaviour in peacetime? And are these behaviours likely to aid or hinder post-conflict recovery?

Our research uses four different methodologies:

  1. Our team asked 700 people to narrate a story of importance to them. For half, the story was about the past conflict in Northern Uganda; for the other half, about something that happened recently. They then answered a series of questions exploring how they make sense of and interpret their own stories. This is rich data in itself, but the process also serves as a prime for…
  2. Behavioural games. Does recalling conflict make people more altruistic, fair, patient, or likely to take risks? Using real money, participants played a series of behavioural economics games with options to risk, pool and share cash.
  3. Parallel to this, Mareike and I, along with our Ugandan colleagues, conducted qualitative interviews with everyone from Rastafaris to local music sensations to explore people’s own perceptions, definitions and experiences of supposed ‘post-conflict recovery’.
  4. Finally, a three-wave panel survey set the broader scene, capturing wider trends in Northern Uganda.

In the last month, our multi-disciplinary team has puzzled over bringing this data together. Here are my reflections on this process of mixed-methods research:

1. Focus, but leave room to explore themes that emerge unexpectedly

The vast amount of data is an embarrassment of riches, but also a lifetime of work. One of our first challenges was deciding where to look.

A starting point, as standard for experimental research, was our pre-analysis plan. This set out our hypotheses and provided important rigour and confidence in the research.

However, this should not be overly constraining. Themes often emerge unexpectedly when the different methodologies and their findings are brought into conversation.

For example, the behavioural games show how people who recall the conflict tend to behave more patiently. In the interviews we regularly heard how life in Northern Uganda is full of waiting – for fulfilment of promises; inclusion in programmes; political change and more. Together we can tell a rich story of not just how conflict affects time preferences, but also what the real-life experience of these time preferences is. To focus solely on questions that we had pre-specified would be a limitation.

2. Methods can, and do, talk to each other

Our methods speak different languages – ‘heterogeneous effects’, ‘self-significations’ and ‘principal component analysis’, to name a few – but they do talk to each other.

Every theme that emerged can be interrogated from multiple angles. For every ‘what we see’, the different methodologies offer several routes into ‘why we (might) see it’.

For example, we can start with a behavioural finding that conflict recall encourages altruism, then interrogate the qualitative data and participants’ own interpretations for insights as to why.

Of course, this only works because we are a team open to multi-disciplinary ways of explaining the world. (And if we weren’t all on the same page at the start of the process, after several days of the seven of us sitting and working together in one room, we certainly were!)

Mixed methods can also help with the constant challenge of translating research findings to operational relevance. Approaching a phenomenon from multiple angles helps avoid misleadingly linear conclusions, highlights connections between issues, and broadens the entry points for policy beyond what you might see if looking at one issue with one lens.

3. Mixed methods allow scope for self-reflection

We could use one method to critically reflect on others. For example, following a pilot of the behavioural games, Mareike and I held interviews with some of the participants. We discussed how they perceived the unusual appearance of a ‘lab-in-the-field’ offering money to play games. Was the process understandable; was it seen as research or an NGO programme; what were the ethical considerations?

Being able to ask such questions enabled us to modify the research and will be useful in informing future research design and a forthcoming methodology paper.

What comes next

The key test now is in the presentation of the data. The value of a multi-disciplinary approach is obvious in our discussions, but may be harder to put down on paper. Is there a way to be 50% quantitative and 50% qualitative that doesn’t leave the audience 50% unhappy?

The team certainly felt that incentives are still lacking for academics to use multi-method approaches. How well we navigate this challenge remains to be seen.

This is an exciting research project that is very much ongoing; do watch this space for findings to come. In the meantime, if you’d like to find out more email me at [email protected].