JiscMail: Email discussion lists for the UK Education and Research communities

RAMESES Archives, RAMESES@JISCMAIL.AC.UK, July 2014

Subject: Re: Systematic re-review
From: Nick Emmel <[log in to unmask]>
Reply-To: "Realist and Meta-narrative Evidence Synthesis: Evolving Standards" <[log in to unmask]>, Nick Emmel <[log in to unmask]>
Date: Tue, 1 Jul 2014 17:07:12 +0100


Dear All,

This has been a most interesting debate about a rather interesting paper.
My take is that this paper is as realist as any realist review can be at the stage at which it has been done. It takes reports from research that clearly does not have a realist intent; most ignore complexity or treat it as un-complex. Most do not theorise causality: note, for instance, the authors' observation that 'Tonglet et al. are among the few authors who draw out the implications of evidence for alternative impact pathways …' (8). All of the 27 papers included in the review apprehend human agency, but only 18.5 per cent of them presented sufficient evidence that the authors did not need to seek evidence out elsewhere. At the other end of the scale, 11 (40.7%) of the papers presented no evidence of 'actions by individuals, households or communities [to substantially influence the benefits and harms experienced]' (Table 1, pg. 6).

Loevinsohn and colleagues conduct a realist methodology with empirical accounts very nicely, in my view: they bring ideas into relation with evidence, and through this they produce their eight 'impact pathways'. I'm not keen on their language (it reminds me of the Research Excellence Framework we have just endured in UK universities), and I agree that allocating elements of these pathways to context, mechanism, and outcome is always going to be a fraught (and pretty unproductive) exercise. Generative mechanisms are, after all, liabilities, dispositions, and powers that fire in particular contexts to shape regularities and produce particular outcomes. Disentangling one from the rest might be an interesting typological conundrum, but it leaves the realist explanation wanting and anaemic.

These observations aside, what this paper does rather well is take rather thin accounts of empirical experience, tease out programme theories where they can be found directly, and propose them through casing to arrive at statements to be tested as theories of the middle range.
In that way Loevinsohn and colleagues build fragile and embryonic theory which, as Janet observes, 'indicates some possible pathways for building middle range theory, and flags up directions for future primary research.' In short, I argue this paper does the purposive work all realist reviews must do. The authors note several times that their impact pathways are 'tentative'. Indeed, they produce impact pathways to be tested with communities, neighbourhoods, and households with poor sanitation and inadequate water supply.
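A quick back-of-envelope check of the counts quoted above (my own arithmetic against the 27 included papers; not figures reproduced from the paper itself):

```python
# Sanity-check the percentages quoted from the review against its
# 27 included papers (my own arithmetic, not the paper's figures).
total_papers = 27

# 'only 18.5 per cent ... presented sufficient evidence'
sufficient = round(0.185 * total_papers)
print(sufficient)  # -> 5 papers

# '11 (40.7%) of the papers presented no evidence'
no_evidence_pct = round(100 * 11 / total_papers, 1)
print(no_evidence_pct)  # -> 40.7
```

Both figures are internally consistent: 5 of 27 and 11 of 27 included papers.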
Dr Nick Emmel
Research Director
School of Sociology and Social Policy
University of Leeds
Leeds
LS2 9JT
+44 (0) 113 343 6958

Emmel ND (2013) Sampling and Choosing Cases in Qualitative Research: A Realist Approach. London: Sage. http://goo.gl/YOpct0
________________________________________
From: Realist and Meta-narrative Evidence Synthesis: Evolving Standards [[log in to unmask]] On Behalf Of James Thomas [[log in to unmask]]
Sent: 01 July 2014 13:55
To: [log in to unmask]
Subject: Re: Systematic re-review

Hi Geoff and Hugh,

With regards to the knowledge silo paper, I think the main problem in understanding its main messages is that it aims to test whether a greater practical understanding can be gained when the interventions are examined jointly from health and development perspectives, but then tests this by changing the perspective of those doing the review, the method of doing the review, and the research questions asked. With no proper counterfactual, I agree with Justin that you can't disentangle the impact of the change in perspective from the change in questions and method. If they are saying that the change in perspective entails the change in method and research question, then maybe there was no real need to do the empirical part of the paper, as it's no surprise that you find new information when you ask a different question. (I think they make a valid point, of course, about the dangers of reviews with overly narrow research questions, methods and theory, but they picked a difficult example to demonstrate it, in that Hugh and colleagues did a pretty good job of grounding their review theoretically.)

Geoff, I’m with you on the ‘mechanisms’ – they don’t look like mechanisms to me. (e.g. one mechanism is described as “Staff modify intervention in response to local circumstances”)

Finally, re Hugh’s question about methods, I would be interested to know what the list thinks about using QCA in a realist review? We’ve recently published a paper on this (though not in a realist review), and I wonder whether it would help in terms of organising the synthesis. We have inserted a little about its use in realist review in the discussion, but the paper was already very long, so we couldn’t write more than this. (http://www.systematicreviewsjournal.com/content/3/1/67/abstract) I’ve a feeling Ray Pawson wouldn’t like this amount of formalisation, but do wonder whether it would help in some situations. Hugh, is this the kind of thing you were meaning, or was it more about the warrant and synthesis of summative evidence?
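For list members unfamiliar with QCA, a minimal sketch of the crisp-set truth table it is built around may help. The cases, conditions, and outcome values below are entirely hypothetical, not drawn from any of the papers or reviews discussed:

```python
# Minimal crisp-set QCA truth-table sketch (hypothetical data): each case
# scores binary "conditions" and a binary outcome; cases sharing a
# configuration are grouped, and a configuration's consistency is the
# share of its cases that show the outcome.
from collections import defaultdict

# case -> (conditions, outcome); condition order: (training, supervision)
cases = {
    "A": ((1, 1), 1),
    "B": ((1, 1), 1),
    "C": ((1, 0), 0),
    "D": ((0, 1), 1),
    "E": ((0, 1), 0),
}

rows = defaultdict(list)
for config, outcome in cases.values():
    rows[config].append(outcome)

truth_table = {
    config: sum(outcomes) / len(outcomes)  # consistency with the outcome
    for config, outcomes in rows.items()
}
print(truth_table)  # e.g. (1, 1) is fully consistent, (0, 1) only half
```

Each row of the resulting table is a configuration of conditions; QCA software then minimises the high-consistency rows into simpler causal recipes, which is the kind of formalisation of synthesis being asked about.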

Best wishes, James.



From: Realist and Meta-narrative Evidence Synthesis: Evolving Standards [mailto:[log in to unmask]] On Behalf Of Hugh Waddington
Sent: 01 July 2014 06:23
To: [log in to unmask]
Subject: Re: Systematic re-review

Dear Geoff - you are absolutely right - Howard's 'funnel of attrition' aims to visualise the difference between theory and practice along the full causal chain from programme design and targeting through implementation and outcomes. Thank you for looking at the paper, and pulling out probably the most useful aspect.
I was hoping to stimulate debate on the value of, and methods to evaluate and critically appraise, summative evidence, in the context of RR.
Best wishes
Hugh

On Mon, Jun 30, 2014 at 6:08 PM, Geoff Wong <[log in to unmask]> wrote:
This is an interesting discussion on a number of fronts.
1) The knowledge silo paper
I peer reviewed this manuscript for the journal and mainly focussed on the realist review aspect of the manuscript, not its content.
Like some of the commenters so far, I wonder if this review could legitimately be considered a realist review.
My main concern was how mechanisms had been conceptualised and the application of a realist logic of analysis.
I'd be interested in hearing others' views on this.
2) "It would be interesting to see realist reviews engaging more thoroughly with summative evidence"
This comment was made by Hugh, and I wonder if it needs more unpacking.
It could mean a number of things.
Any thoughts?
3) The FFS funnel
In the report Hugh provided a link to, Figure 13, the FFS funnel looks to me like a great way to show the 'gap' between an intended programme theory and reality on the ground.
Or have I misunderstood the purpose of this diagram?
Geoff





On 27 June 2014 21:11, Jagosh, Justin <[log in to unmask]> wrote:
Hi Trish,
Interesting article. I agree with what Sandy and Janet have said.
Although I too agree with the authors on the need for multidisciplinary perspective in analysis, they conflate a few points which muddy their message – I’m talking about the conflation of disciplinary vs. methodological fit for the research question.
What I mean by that is that the authors identify the ‘knowledge silo’ problem as the problem of applying a ‘purely’ health perspective (whatever that may mean) to complex issues of health and hygiene without the development perspective. The assumption being that the health perspective may be blind to some critical factors having to do with ‘agency of beneficiaries’…and other broader contextual features. That is fair enough.
However, if the real issue was a lack of interdisciplinary perspective, then they should have conducted a systematic review (meta-regression) using a multidisciplinary team for a comparison of interpretation. Instead, they re-reviewed using realist methodology, which created different results. It is bound to do so, regardless of the interdisciplinary lens used in the examination of evidence. I would go so far as to say that using realist review methodology, even with the so-called health perspective, is going to open up the blinders so that we can begin to see much more in terms of the intervention pathways and all the resultant intended, unintended, expected, and sometimes surprising outcomes and impacts. The bigger issue as I see it is RR vs. SR, and the lesser issue is health lens vs. health + development lens. This lack of clarity is a weakness of the article, in my opinion.
Another issue is that the authors criticize the Waddington review for their scope – in that the Waddington review limited outcomes to the health question when in fact clean water leads to much more than reduced diarrhoea morbidity. I do not think the authors can object based on the scope, especially given that this was a systematic review and such reviews are designed mainly to study intended intervention outcomes. Pawson’s writing further lends credibility to the idea of defining your scope clearly, and then pulling the pertinent evidence from studies even if there is more that can be said. Again, I feel that this article is more-or-less a criticism of systematic review methods, which is not explicitly stated as such.
Finally, although the authors used realist review as their methodology, I am not convinced that the review embodies the inherent logic of realism, one of the tenets of which is that all knowledge is tentative. In other words, all intervention activity and evidence of effect produced should be seen in terms of theoretical building blocks that need to be re-examined in a continual fashion. Context is always changing. The authors emphasize that their findings using CMO configuration are tentative because of the limited dataset used. In what circumstances would such an analysis not be tentative? 100 papers? 1000 papers? In our socialization as researchers, we undermine the power that comes with treating knowledge as tentative. But that's for another discussion.
 Justin




Justin Jagosh, Ph.D
Senior Research Fellow
Centre for Advancement in Realist Evaluation and Synthesis (CARES)
University of Liverpool, United Kingdom

Phone:
(in Canada)
00-1-604-822-3814 (w)
00-1-778-846-4589 (m)
________________________________
From: Realist and Meta-narrative Evidence Synthesis: Evolving Standards [[log in to unmask]] on behalf of Hugh Waddington [[log in to unmask]]
Sent: June 26, 2014 23:09
To: [log in to unmask]
Subject: Re: Systematic re-review
Dear Trish and all

I am not going to comment on the re-review here, as I will hope to publish a response in HPP. Our original study (http://www.3ieimpact.org/media/filer_public/2012/05/07/17.pdf) was itself partially a re-review of various meta-analyses. A few points for those who don't have time to read our paper:
- As Sandy mentions, our original review was set up to examine effects of WASH programmes on health outcomes only (child diarrhoea). We did report additional outcomes collected in included papers (in particular time use by women) but the scope was restricted given the breadth of interventions covered. We are currently updating the review for the Campbell Collaboration International Development Coordinating Group, incorporating broader development outcomes.
- We took inspiration from, and referred explicitly to, Pawson and Greenhalgh et al. 2005 and Pawson 2006 on the limitations of 'bare bones' systematic reviews of effects.
- The review was not registered with Cochrane, although we did do the meta-analysis. However, the meta-analysis (and policy implications) focus on examining heterogeneity of effects across contexts and time. For example, when you look at sustainability, our results suggest the impacts of water treatment are no greater than those of water supply, in contrast with previous reviews.
- We used theory of change analysis, examining intermediate (e.g. access, adherence) and final outcomes, drawing on Rogers' Theory of Diffusion to explain the lack of sustained impacts.
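For readers who have not seen a meta-analysis of this kind, heterogeneity of effects is usually summarised with Cochran's Q and the I-squared statistic. Here is a minimal inverse-variance sketch using made-up effect sizes and standard errors, not the review's data:

```python
# Heterogeneity statistics for a fixed-effect meta-analysis sketch
# (Cochran's Q and I-squared), using made-up study effects and
# standard errors -- purely illustrative, not the review's data.
effects = [-0.30, -0.10, -0.45, -0.05]  # hypothetical log relative risks
ses     = [0.10, 0.12, 0.15, 0.08]      # hypothetical standard errors

weights = [1 / se**2 for se in ses]     # inverse-variance weights
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Cochran's Q: weighted squared deviations from the pooled effect
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
df = len(effects) - 1
i_squared = max(0.0, 100 * (q - df) / q)  # % of variation beyond chance

print(round(pooled, 3), round(q, 2), round(i_squared, 1))
```

A large I-squared signals that study effects vary more than sampling error alone would explain, which is what motivates examining effects across contexts and time rather than reporting only a single pooled number.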

The approach of 3ie's Synthesis and Reviews Programme more generally is to integrate the summative analysis of effects with theory-based analysis which engages with the full range of evidence. We have recently produced a review of farmer field schools evidence which does this: http://www.3ieimpact.org/media/filer_public/2014/04/07/srs1_ffs_final_web_1.pdf. It would be interesting to see realist reviews engaging more thoroughly with summative evidence.

Best wishes
Hugh

--
Hugh Waddington
Senior Evaluation Specialist, 3ie
Co-Chair and Editor, Campbell Collaboration International Development Coordinating Group
www.3ieimpact.org

ADB-3ie evaluation conference <http://impactevaluation2014.org/>: Call for presentations and workshop proposals; deadline 3 July

New 3ie RFQ <http://www.3ieimpact.org/en/funding/thematic-window/jordan-and-lebanon-humanitarian-assistance-thematic-window/?preview>: Proposal preparation grant for impact evaluation of the cash versus e-vouchers programme in Jordan and Lebanon; deadline 27 June.

New 3ie RFQ <http://www.3ieimpact.org/en/funding/thematic-window/humanitarian-assistance-thematic-window/>: Proposal preparation grants for impact evaluation of humanitarian assistance interventions in DRC
New 3ie RFP <http://www.3ieimpact.org/en/funding/thematic-window/thematic-window-hiv/>: Proposals for implementation and impact evaluation of pilot interventions on HIV oral self-tests in Kenya

Download Journal of Development Effectiveness special issue on Systematic Reviews <http://www.tandfonline.com/toc/rjde20/4/.U4chvfkaZXY>.





________________________________
The Institute of Education: Number 1 worldwide for Education, 2014 QS World University Rankings



