*apologies for cross posting*

Dear all,

Here are the collated responses to a query I sent to several mailing lists in August. I requested advice on software to manage and write a review of outcome measures where the searches were likely to retrieve 100,000 references.

14 people responded to my request - thank you!

EndNote
– no-one reported it wouldn't cope with 100K refs.
- 1 person had used EndNote X5 for a 90K-ref review and was able to run duplicate checks (though they were slow). They then split the library into 3 for the screening and marking stages, and used EPPI Reviewer for analysis.
- 1 person reported it worked fine for 100K records as long as PDFs were not attached.
- 1 person (plus myself) has processed a library of 40,000 records. It was slow but coped.
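De-duplication speed came up for several tools, so it may be worth noting why it slows down at this scale: pairwise comparison of 100K records is roughly 5 billion comparisons, whereas grouping records on a normalised key is a single linear pass. A minimal sketch in Python (purely illustrative; it matches on a normalised title only, whereas real reference managers also compare authors, year, DOI, etc.):

```python
# Illustrative only - not any tool's actual algorithm. Shows linear-time
# de-duplication by grouping references on a normalised title key.
import re
from collections import defaultdict

def normalise(title: str) -> str:
    """Lower-case and strip non-alphanumerics so near-identical titles match."""
    return re.sub(r"[^a-z0-9]", "", title.lower())

def dedupe(refs):
    """Keep the first reference in each group sharing a normalised title."""
    groups = defaultdict(list)
    for ref in refs:
        groups[normalise(ref["title"])].append(ref)
    return [group[0] for group in groups.values()]

refs = [
    {"title": "Outcome Measures in Trials"},
    {"title": "Outcome measures in trials."},  # duplicate after normalisation
    {"title": "A Different Study"},
]
print(len(dedupe(refs)))  # 2
```

Even at 100K records this approach runs in seconds, which is why key-based matching scales where exhaustive pairwise checking does not.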

EPPI Reviewer  www.eppi.ioe.ac.uk
- recommended by 5, saying it does the job (3 independent, 2 from the IOE)
- the support given is very good (from a person who bought the support package)
- recommended as it does the whole process
- one person said de-duplicating was slow
- subscription fee payable
- one person gave detailed feedback from using it for a 90K review:
I think it should have the flexibility to handle the variety of data you're likely to be collecting, and it is easily used simultaneously by multiple reviewers as it is accessed by signing in online. I think the best thing about EPPI Reviewer was probably the flexibility of the tables it can generate. Being able to look at the data and present it in different ways is good and relatively straightforward (once you know how!). I think this is definitely one of the challenges when producing a data extraction, particularly one with lots of studies and data in it.
However, there are still a number of frustrations with using EPPI Reviewer:
- the handbook isn't very user-friendly or detailed, so really it's all about learning it by trial and error (although they have some decent YouTube videos)
- setting up the data extraction form can take quite some time, and only after you've used EPPI Reviewer are you aware of some of the quirks of how it will present the data. This can be really frustrating in a big review and leads to having to make various changes along the way.
- it's still very open to human error: if a particular box or subcategory isn't ticked, the data extracted on that variable won't show up in tables. It gives no warning, and it's hard to spot when you've made this mistake, particularly if you're having to click dozens of boxes. This is especially problematic if you're trying to make statements like "this many studies looked at this behaviour" or "this many studies used this type of intervention", as it's hard to be confident the numbers you're getting are accurate.
- also, though it claims to have functionality to compare data extraction between two reviewers extracting in duplicate, I found this really unhelpful; it was much easier to compare reviewers' data extraction manually
- it's quite poor at dealing with studies where there are multiple publications for one dataset; it provides little help in disentangling this
It's hard to come up with a good software package for data extraction; most options are lacking in one way or another. I think I would just about recommend EPPI Reviewer, but only because there's nothing massively better available elsewhere.


EROS  http://www.eros-systematic-review.org/index.php?cambiar_idioma=ing
- suggested by 1 person; they didn't have experience of using it. This was the topic of a workshop at the Oct Cochrane Colloquium (EROS dialogues with RevMan: data extraction, quality assessment and more).

SRDR – Systematic Review Data Repository http://srdr.ahrq.gov/ (free)
- recommended by 1. You can create your own abstraction form with an instant validation check, but it doesn't analyse data.

DistillerSR http://systematic-review.net/
- recommended by 2, one of whom had used it for a review of 170K refs with no problems.
- it's the most costly: multi-reviewer packages range from $329 to $1,249 per month.

SUMARI (System for the Unified Management, Assessment and Review of Information) http://www.joannabriggs.org/SUMARI
- recommended by 1 (from Joanna Briggs) who sent the following info:
SUMARI is the Joanna Briggs Institute's own software for the systematic review of literature. It is designed to assist researchers and practitioners in fields such as health, social sciences and humanities to appraise and synthesise quantitative and qualitative evidence of feasibility, appropriateness, meaningfulness and effectiveness, and to conduct economic evaluations of activities and interventions. SUMARI encompasses four sub-modules:
QARI (Qualitative Assessment and Review Instrument), which is designed to facilitate critical appraisal, data extraction and meta-aggregation of the findings of qualitative studies;
MAStARI (Meta Analysis of Statistics Assessment and Review Instrument), which is designed to conduct the meta-analysis of the results of comparable cohort, time series and descriptive studies using a number of statistical approaches;
ACTUARI (Analysis of Cost, Technology and Utilisation Assessment and Review Instrument), which is designed to facilitate critical appraisal, data extraction and synthesis of economic data;
and NOTARI (Narrative, Opinion and Text Assessment and Review Instrument), which is designed to facilitate critical appraisal, data extraction and synthesis of expert opinion texts and reports.

Survey Monkey www.surveymonkey.com
– recommended by 1 for data extraction. (However, our team ran into problems with this when we were unable to modify the forms later on, when we decided to capture data slightly differently.)

Mendeley www.mendeley.com
– recommended by 1 for group work capabilities (though from my experience I doubt this would be able to efficiently import and de-duplicate 100,000 records)

Reference Manager and Microsoft Access in combination
– used by 1 for a large review, but not recommended as 'messy using 2 systems'

There is also a list and brief comparison of systematic review software in the book:
Gough, David and Oliver, Sandy and Thomas, James (2012) An introduction to systematic reviews. Sage Publications, London. ISBN 9781849201803.
that also mentions:
ASSERT Automatic Summarisation for Systematic Reviews using Text Mining http://www.nactem.ac.uk/assert/
Comprehensive Meta-Analysis http://www.meta-analysis.com/index.php
MIX 2.0 Leon Bax http://www.meta-analysis-made-easy.com/
RevMan Cochrane Collaboration http://ims.cochrane.org/revman
These are not suitable for our review but I've included them for completeness.

Our team is still seeking funding for this review, so we are not yet at the point of choosing the software. If we get funding I hope to try out a few alternatives to see which may suit our needs best.
I hope you find this compilation useful.

Best wishes

Judy Wright

Judy Wright
Senior Information Specialist to LIHS and the NIHR Research Design Service Yorkshire & the Humber
Leeds Institute of Health Sciences
University of Leeds
Charles Thackrah Building
Leeds LS2 9LJ
UK
+44 (0)113 343 0876

AUHE Information Specialists http://medhealth.leeds.ac.uk/auhe/is
Research Methods Mini Masterclasses  http://minimasterclasses.wordpress.com/
Health Economics  https://www.facebook.com/HealthEconomicsLeeds/
NIHR Research Design Service http://www.rds-yh.nihr.ac.uk/
