1. Undertaking rapid evaluations during the COVID-19 pandemic: Lessons from evaluating COVID-19 remote home monitoring services in England
- Author
Holly Walton, Nadia E. Crellin, Manbinder S. Sidhu, Chris Sherlaw-Johnson, Lauren Herlitz, Ian Litchfield, Theo Georghiou, Sonila M. Tomini, Efthalia Massou, Jo Ellins, Jon Sussex, and Naomi J. Fulop
- Subjects
rapid evaluation, mixed methods, General Social Sciences, COVID-19, key lessons, reflections - Abstract
Peer reviewed: True. INTRODUCTION: Previous research suggests that rapid evaluations can offer evidence on innovations in health and social care that informs fast-moving policy and practice and supports their scale-up. However, there are few comprehensive accounts of how to plan and conduct large-scale rapid evaluations, ensure scientific rigour, and achieve stakeholder engagement within compressed timeframes. METHODS: Using a case study of a national mixed-methods rapid evaluation of COVID-19 remote home monitoring services in England, conducted during the COVID-19 pandemic, this manuscript examines the process of conducting a large-scale rapid evaluation from design to dissemination and impact, and reflects on key lessons for conducting future large-scale rapid evaluations. We describe each stage of the rapid evaluation: convening the team (study team and external collaborators), design and planning (scoping, designing protocols, study set-up), data collection and analysis, and dissemination. RESULTS: We reflect on why certain decisions were made and highlight facilitators and challenges. The manuscript concludes with 12 key lessons for conducting large-scale mixed-methods rapid evaluations of healthcare services.
We propose that rapid study teams need to: (1) find ways of quickly building trust with external stakeholders, including evidence-users; (2) consider the needs of the rapid evaluation and the resources required; (3) use scoping to ensure the study is highly focused; (4) carefully consider what cannot be completed within the designated timeframe; (5) use structured processes to ensure consistency and rigour; (6) be flexible and responsive to changing needs and circumstances; (7) consider the risks associated with new approaches to collecting quantitative data (and their usability); (8) consider whether it is possible to use aggregated quantitative data, and what that would mean when presenting results; (9) consider using structured processes and layered analysis approaches to rapidly synthesise qualitative findings; (10) consider the balance between speed and the size and skills of the team; (11) ensure all team members know their roles and responsibilities and can communicate quickly and clearly; and (12) consider how best to share findings, in discussion with evidence-users, for rapid understanding and use. CONCLUSION: These 12 lessons can inform the development and conduct of future rapid evaluations in a range of contexts and settings.
- Published
- 2023