An evaluation of the preprints produced at the beginning of the 2022 mpox public health emergency
- Authors
Melanie Sterian, Anmol Samra, Kusala Pussegoda, Tricia Corrin, Mavra Qamar, Austyn Baumeister, Izza Israr, and Lisa Waddell
- Subjects
Mpox, Monkeypox, Preprint, Peer-review, Quality, Risk of bias, General Works
- Abstract
Background: Preprints are scientific articles that have not undergone the peer-review process. They allow the latest evidence to be shared rapidly; however, it is unclear whether they can be confidently used for decision-making during a public health emergency. This study aimed to compare the data and quality of preprints released during the first four months of the 2022 mpox outbreak to their published versions.

Methods: Eligible preprints (n = 76) posted between May and August 2022 were identified through an established mpox literature database and followed to July 2024 for changes in publication status. The quality of preprints and published studies was assessed by two independent reviewers, using validated tools available for the study design (n = 33), to evaluate changes in quality. Tools included the Newcastle-Ottawa Scale, the Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2), and the JBI Critical Appraisal Checklists. The questions in each tool led to an overall quality assessment of high quality (no concerns with study design, conduct, and/or analysis), moderate quality (minor concerns), or low quality (several concerns). Changes in data (e.g. methods, outcomes, results) for preprint-published pairs (n = 60) were assessed by one reviewer and verified by a second.

Results: Preprints and published versions that could be evaluated for quality (n = 25 pairs) were mostly assessed as low quality. Minimal to no change in quality from preprint to published version was identified: all observational studies (10/10), most case series (6/7), and all surveillance data analyses (3/3) had no change in overall quality, while some diagnostic test accuracy studies (3/5) improved or worsened their quality assessment scores. Among all pairs (n = 60), outcomes were often added in the published version (58%) and less commonly removed (18%). Numerical results changed from preprint to published version in 53% of studies; however, most of these studies (22/32) had only minor changes that did not affect the main conclusions.

Conclusions: This study suggests that the minimal changes in quality, results, and main conclusions from preprint to published versions support the use of preprints, and the use of the same critical evaluation tools on preprints as applied to published studies, in decision-making during a public health emergency.
- Published
2024