NIHR Applied Research Collaboration (ARC) Yorkshire and Humber
https://arc-yh.nihr.ac.uk/
University of Leeds hosted PhD studentship
One studentship is on offer, to start in the academic year 2020/21. We welcome discussions with potential PhD students about either of the following topics:
https://phd.leeds.ac.uk/project/810-the-design-and-analysis-of-randomised-trials-in-implementation-laboratories
https://phd.leeds.ac.uk/project/809-mathematical-models-of-complex-healthcare-implementation-strategies
Proposed PhD projects
1. The design and analysis of randomised trials in implementation laboratories.
Supervisory team: Dr Rebecca Walwyn (Lead, CTRU), Prof Steven Gilmour (KCL), Beatriz Goulao (Aberdeen), Dr Sarah Alderson (LIHS), Prof Amanda Farrin (CTRU)
Researchers are increasingly encouraged to examine how they work, so as to reduce research waste and maximise efficiency. Implementation science studies ways of promoting the systematic uptake of research findings, and other evidence-based practices, into routine clinical practice. Research shows that common implementation strategies, such as audit and feedback, are effective but with substantial unexplained heterogeneity. Yet new trials often do not build on this evidence or address the key questions needed to advance the field, one of which is how to optimise effectiveness. Ivers and Grimshaw have argued for a shift in implementation research design to a new sequential comparative-effectiveness model, embedding programmes of trials within existing large-scale initiatives, including National Clinical Audits. They refer to the close collaboration between research teams and the health systems that deliver implementation strategies as "implementation laboratories", a kind of trial platform. Such laboratories create an opportunity to generate real-world evidence, at scale, about the individual and combined effects of the components of implementation strategies. They also offer a means of quantifying the effect of context on strategy effectiveness and of enabling health systems to refine their strategies iteratively.
Grimshaw et al suggest a series of "head-to-head" parallel-group trials, each taking forward the most effective strategy from the previous trial and comparing it to a refined alternative. Other options are available, however. For example, a multiphase optimisation strategy (MOST) [6] could be adopted, in which a series of factorial trials is conducted that retains the effective components from previous trials, continually refines them and screens other promising components. Alternatively, a multi-arm multi-stage (MAMS) approach could be considered, in which strategies are added or dropped at successive interim analyses. One challenge is that each trial will often be cluster-randomised, such that interventions (strategies or components of strategies) are allocated at one or more levels of a health system (clusters), while outcomes are collected at the patient level. So each trial, if conducted at scale, would involve some, if not all, of the same clusters. A further challenge arises from performing a combined analysis across trials.
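As a purely illustrative sketch of the kind of design described above, the following simulates a cluster-randomised 2x2 factorial trial of two strategy components and estimates their main effects from cluster means. All effect sizes, cluster counts, the additive model and the component labels are hypothetical assumptions for illustration, not values from the projects described here.

```python
import random
import statistics

random.seed(42)

# Hypothetical 2x2 factorial, cluster-randomised trial: two implementation
# strategy components (A = enhanced feedback, B = action planning; both
# invented for illustration) are allocated at cluster level (e.g. hospital),
# while outcomes are measured at patient level.
N_CLUSTERS_PER_ARM = 10
PATIENTS_PER_CLUSTER = 50
EFFECT_A, EFFECT_B = 0.10, 0.05          # assumed additive effects on uptake
BASELINE, CLUSTER_SD, PATIENT_SD = 0.50, 0.05, 0.15

def simulate_arm(a, b):
    """Return patient outcomes for all clusters allocated to (A=a, B=b)."""
    clusters = []
    for _ in range(N_CLUSTERS_PER_ARM):
        cluster_effect = random.gauss(0, CLUSTER_SD)  # between-cluster heterogeneity
        mean = BASELINE + a * EFFECT_A + b * EFFECT_B + cluster_effect
        clusters.append([random.gauss(mean, PATIENT_SD)
                         for _ in range(PATIENTS_PER_CLUSTER)])
    return clusters

# Simulate the four cells of the factorial design.
cells = {(a, b): simulate_arm(a, b) for a in (0, 1) for b in (0, 1)}

def cell_mean(a, b):
    # Mean of cluster means, respecting the clustering in the design.
    return statistics.mean(statistics.mean(c) for c in cells[(a, b)])

# Factorial main effects: each component is estimated using ALL clusters,
# which is the efficiency argument for factorial over head-to-head designs.
main_a = (cell_mean(1, 0) + cell_mean(1, 1) - cell_mean(0, 0) - cell_mean(0, 1)) / 2
main_b = (cell_mean(0, 1) + cell_mean(1, 1) - cell_mean(0, 0) - cell_mean(1, 0)) / 2
print(f"Estimated main effect of A: {main_a:.3f} (true {EFFECT_A})")
print(f"Estimated main effect of B: {main_b:.3f} (true {EFFECT_B})")
```

In practice such data would be analysed with a mixed model rather than raw cluster means, but the sketch shows why shared clusters across sequential trials complicate both design and combined analysis.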
Background of student: A BSc involving Mathematics and/or an MSc in Statistics or a related area, with an interest in medical statistics and/or design of experiments.
2. Mathematical models of complex healthcare implementation strategies.
Supervisory team: Dr Rebecca Walwyn (Lead, CTRU), Dr Benjamin Thorpe (Maths), Dr Fabiana Lorencatto (UCL), Dr Sarah Alderson (LIHS), Prof Amanda Farrin (CTRU)
The Medical Research Council (MRC) provides guidance on how to develop and evaluate complex interventions, such as surgery or psychotherapy, highlighting four features of intervention complexity: i) the number of interacting components, ii) the need to characterise delivery, iii) the degree of tailoring, and iv) the multiple potential levels at which interventions work. Many component combinations are therefore possible, which makes it important to identify the optimal combination to take forward. This is often done theoretically but could be confirmed empirically, building on Design of Experiments (DoE) methods. Empirical optimisation, through screening and refining experiments, allows theoretical (or "logic") models that specify how interventions work to be translated into statistical models, which accurately predict optimal combinations of components under a variety of scenarios. A gap exists, however, between the theoretical model and designing an experiment to optimise the complex intervention. One way of filling this gap would be to translate a logic model into a mathematical model that can be used to make predictions, allowing the logic model to be quantified, elaborated and refined based on theory and existing data prior to it being used to inform the design of an experiment. The MRC guidance highlights the value of modelling complex interventions but others recognise that it could go much further.
Implementation science can be defined as the scientific study of methods to promote the systematic uptake of research findings, and other evidence-based practices, into routine clinical practice, and hence to improve the quality and effectiveness of health services. The aim is usually to evaluate an implementation strategy (such as audit and feedback), typically directed at clinician behaviour and/or organisational change. Implementation laboratories have recently been proposed as a means of using existing "at scale" service implementation programmes (e.g. National Clinical Audits) to embed sequential experiments testing different ways of delivering implementation strategies; methodological research to improve the quality of National Clinical Audits is currently an MRC priority. However, despite the most recent Cochrane Review including 140 randomised trials, it remains difficult to recommend one strategy (or combination of components) over another on empirical grounds.
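For illustration only, the translation of a logic model into a mathematical model might look like the following toy sketch, in which the component names, weights, baseline and logistic link are all invented assumptions. Such a model can be used to predict uptake under different scenarios and to screen component combinations before designing an optimisation experiment.

```python
import math
from itertools import product

# Toy translation of a (hypothetical) logic model into a mathematical model:
# feedback components are assumed to raise the log-odds of clinicians
# performing the target behaviour additively. All weights below are
# illustrative assumptions, not estimates from real data.
COMPONENT_WEIGHTS = {
    "comparator_data": 0.40,   # assumed effect of adding peer comparisons
    "action_plan":     0.30,   # assumed effect of adding an action plan
    "repeat_feedback": 0.20,   # assumed effect of repeated delivery
}
BASELINE_LOGIT = -0.5          # assumed baseline log-odds of the behaviour

def predicted_uptake(components):
    """Predict behaviour probability for a set of strategy components."""
    logit = BASELINE_LOGIT + sum(COMPONENT_WEIGHTS[c] for c in components)
    return 1 / (1 + math.exp(-logit))   # inverse-logit link

# Screen all 2^3 component combinations, as a MOST-style screening
# experiment might, and rank them by predicted uptake.
names = list(COMPONENT_WEIGHTS)
combos = []
for include in product((0, 1), repeat=len(names)):
    chosen = [n for n, inc in zip(names, include) if inc]
    combos.append((predicted_uptake(chosen), chosen))

for uptake, chosen in sorted(combos, reverse=True):
    print(f"{uptake:.3f}  {chosen or ['none']}")
```

A quantified model of this kind lets the logic model be elaborated and refined against theory and existing data before it informs the choice of design, factors and levels in the subsequent experiment.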
Background of student: A BSc involving Mathematics and/or an MSc in Statistics or a related area, with an interest in medical statistics.
Host University and Department:
University of Leeds, Leeds Institute of Clinical Trials Research (LICTR), Methodology Research Group
Closing date:
Thursday 3rd December 2020
Anticipated start date:
January to February 2021
To explore these opportunities further or for any queries you may have, please contact:
Dr Rebecca Walwyn [[log in to unmask]]
Dr Rebecca Walwyn
Associate Professor of Clinical Trials Methodology
Lead for Student Education
Lead for Methodology Research
Co-Lead for PGR Studies
Clinical Trials Research Unit
Leeds Institute of Clinical Trials Research
University of Leeds
Leeds
LS2 9JT
Tel: 0113 343 5485
Fax: 0113 343 1471
Email: [log in to unmask]
Website: www.leeds.ac.uk/LICTR