Hi Justin
There are so many points of interest in your email that I'm going to choose just one, which is: what is the current utility of realist evaluation and review? I'm asking this question because realist evaluation and review aim to inform policy, so one place to start is to assess the real-life impact of what has been produced. Where an evaluation or review influenced decisions, we need to ask why – using a transdisciplinary lens. I would argue that the projects with the most influence went beyond a transdisciplinary community of researchers and evaluators to include the disciplines that will be putting the knowledge into practice.
Does anyone have any stories of real-life impact?
Best wishes
Janet
> On 11 Jun 2016, at 17:32, Jagosh, Justin <[log in to unmask]> wrote:
>
> Dear RAMESES forum members:
>
> I am hoping to stimulate a discussion on the forum about people’s experience using realist methodology and suggestions for how advancements in the methodological field should be directed. This work will feed into an ESRC grant proposal I’ll be submitting later this year.
>
> For the grant, I am seeking to assemble an interdisciplinary, international cohort of collaborators – people who are interested in methodological questions and would like to devote some of their time and headspace to advancing complexity-sensitive approaches to methodology. So if you would like to share your thoughts about methodology on the forum – these will be read with interest. If you are interested in dedicating your time more substantively to methodology – specifically, to be affiliated with the grant in some capacity – please write to me personally. I'm not sure as yet what the configuration of collaborators will be, and I can't guarantee your involvement, but please do introduce yourself, your interests in terms of methodology, your ideas, experience, expertise, and expectations. The more detail you provide the better, but it doesn't have to be polished. Firsthand experience with methodological successes and failures will be read with interest. I will respond, but it may take some time. If you are going to be in touch, please do so by the end of June, and send your email to me directly at [log in to unmask] rather than answering on the forum.
>
> I would like the grant structure and design to develop through participatory dialogue with interested parties, and there may be various options for involvement in terms of co-investigator, institutional partner, advisory board member, Delphi panel member and so on. I am keen to involve junior researchers and PhD students in the grant in some capacity – fresh minds have a good chance of contributing to emerging complexity science and of structuring their mental faculties for the demands of pragmatic complexity thinking.
>
> We know that the uptake of realist methodologies is on the rise, in tandem with the development of other kinds of complexity-sensitive approaches. As realist evaluation becomes mainstreamed, it is rubbing up against other paradigms, and we are now seeing questions arise about whether the boundaries of realist evaluation should be kept clear and protected – so that we can distinguish between genuine (standard) and non-genuine (sub-standard) forms – or whether the logic underpinning realist methodology should be applied to inspire complexity-sensitivity across all stages of research, including open-system experiments that rely on counterfactual comparisons. It's a worthy debate.
>
> From my purview, I think that a trans-disciplinary forum on understanding causal logic in research is lacking and needed. Focussed study of causal logic is needed because causal assumptions are at the core of most research methodologies and/or policies, whether made explicit or not. Even when a research field or discipline does not engage with causation, the corresponding policy equivalent will. Take a field such as descriptive epidemiology, in which rates and states of morbidity are tracked, measured and published but no causal claims are made. Causation still exists at the policy level in that subject area, because groups of people will be set the task of designing efforts to change the status quo, supported by the findings from descriptive statistics. That change effort will necessarily involve causation – because you can't design a programme without an implicit or explicit assumption that X will change Y, via Z. So causation is a fundamental element, whether in the research process itself or in the translation of research to practice.
>
> Studying causation can help to address the differences (and differences of opinion) with regard to the acceptability of methodologies for different areas of inquiry. It is likely that more than a few people on this forum would say that RCTs are suitable for drug trials – in which effect size calculations are needed to inform drug policies – but not so suitable for complex social interventions in which social contingency is the linchpin. Currently, the MRC guidance for complex interventions suggests that, whenever possible, randomisation is desirable. What is missing, in my opinion, is a fit-for-purpose analysis in which we collectively agree that certain causal logics are suitable for certain types of research questions and research areas. This is not to create blanket statements, but one cannot disregard the difference in nature between, for example, an experiment to test the pharmacological efficacy of a drug and an intervention to prevent schoolyard bullying.
>
> And between these two hypothetical ends, there is a spectrum of interventions that have, one could argue, varying degrees of durability in their core objects. What is under-developed is the promotion and adoption of clearly established taxonomies for categorising interventions – taxonomies that would enable a fit-for-purpose framework, leading to the customisation of methodologies to areas of inquiry. If people disagree with this, I would be keen to hear opposing viewpoints.
>
> Central to advancing the methodological field is dialogue, debate and disputation amongst a trans-disciplinary community of researchers and evaluators, and this forum is a good starting point. We are facing incredibly challenging problems that do not simply require more research – multi-morbidity, chronic diseases, health inequalities, climate change adaptation, etc. More suitable research – in which the foundations are realistic, pragmatic and complexity-sensitive, and in which researchers and end-users develop their analytic capacities to deal with such complex problems – this is key, I think. So with that I'm opening up the discussion – either online on this forum, or offline via personal email correspondence. What would you propose for research to advance the methodological field?
>
> Many thanks,
> And looking forward to your responses.
>
> Justin
>
> Justin Jagosh, Ph.D
> Senior Research Fellow
> Director, Centre for Advancement in Realist Evaluation and Synthesis (CARES)
> University of Liverpool, UK
> www.liv.ac.uk/cares