How choosing the right methods can improve the real-world relevance of mental health research
Too often, research produces interventions that are difficult to scale. This is frustrating for researchers, funders and, critically, the people poised to benefit from these interventions. So how can we bridge the implementation gap?
At the heart of this question is the challenge of how to connect research evidence to real-world settings – that is, how to produce actionable evidence. We’ve been exploring this in depth through a series of online symposia inspired by The Lancet Psychiatry Commission on Transforming Mental Health Implementation Research.
We’ve already looked at how systems thinking can transform implementation. This time, we explored what we mean by evidence in mental health research.
Key takeaways
- The ‘best available data’ should not be the only evidence that guides decision-making – we need to integrate expertise and judgement, values and preferences, and context to make good decisions.
- A hierarchy of evidence is not helpful when it pits one form of evidence against another – it is useful only for describing the type of evidence needed to answer a particular research question. It’s good to be rigorous, but not to be myopic.
- ‘Evidence’ too often means ‘quantitative data generated by experts’, invalidating important sources (e.g. people with lived experience) and types of evidence (e.g. systematic qualitative approaches).
In an ideal world, we would start with the research question and then select the right combination of methods. This is easy to say but harder to do. Our panel had some ideas:
- We need to keep evolving how we give credit to evidence generators – away from traditional incentives and towards the team science behaviours that we know drive impact.
- Funders could take a leading role in dismantling assumptions about what counts as evidence by requiring different sorts of evidence, developing funding tools that can respond to novel approaches to evidence generation, and assessing whether the methods proposed are the best approach. There are good examples of doing this for lived experience evidence – and we should build on them.
- Triangulating data takes a lot of time and investment, so we should focus efforts on the questions that matter most. Setting research missions that centre lived experience and policy considerations can provide the scaffolding for researchers to collaborate. Is there an opportunity to do more?
- Relationship building matters – and is underpinned by trust and respect. Targeted support for transdisciplinary training from early career stages would help – but this needs to be complemented by a grassroots approach, so do get started on these conversations yourself!
Read more on our LinkedIn page
Watch the recording