Improving reproducibility of geospatial conference papers: lessons learned from a first implementation of reproducibility reviews
- Author
Alexander Kmoch, Carlos Granell, Daniel Nüst, Frank O. Ostermann (Department of Geo-Information Processing, Faculty of Geo-Information Science and Earth Observation, University of Twente)
- Subjects
Reproducibility, Reproducible research, Geospatial analysis, Open science, Workflow, Documentation
- Abstract
In an attempt to increase the reproducibility of contributions to a long-running and established geospatial conference series, the 23rd AGILE Conference on Geographic Information Science 2020 (https://agile-online.org/conference-2020) for the first time provided guidelines on preparing reproducible papers (Nüst et al., 2020) and appointed a reproducibility committee to evaluate the computational workflows of accepted papers (https://www.agile-giscience-series.net/review_process.html). Here, the committee’s members report on the lessons learned from reviewing 23 accepted full papers and outline future plans for the conference series. In summary, six submissions were partially reproduced by reproducibility reviewers, whose reports are published openly on OSF (https://osf.io/6k5fh/). These papers are promoted with badges on the proceedings’ website (https://agile-giss.copernicus.org/articles/1/index.html). Compared to previous years’ submissions (cf. Nüst et al., 2018), the guidelines and increased community awareness markedly improved reproducibility. However, the reproduction attempts also revealed problems, most importantly insufficient documentation. This was partly mitigated by the non-blind reproducibility review, conducted after paper acceptance, in which interaction between reviewers and authors can provide the input and attention needed to increase reproducibility. At the same time, the reviews showed that anonymisation and public repositories, when properly documented, can enable a successful reproduction without any interaction, as was the case for one manuscript. Individual and organisational challenges caused by the COVID-19 pandemic and the conference’s eventual cancellation exacerbated the teething problems. Nevertheless, even under normal circumstances, future iterations will have to reduce the reviewers’ effort to be sustainable, ideally through more readily executable workflows and a larger reproducibility committee. Furthermore, we discuss changes to the reproducibility review process and their challenges: reproducibility reports could be made available to “regular” reviewers, or the reports could be weighed equally in acceptance/rejection decisions; insufficient information or invalid arguments for not disclosing material could then lead to a submission being rejected or not being sent out for peer review. Further organisational improvements include publishing reviewers’ activities in public databases, making the guidelines mandatory, and collecting data on the tools and repositories used, the effort spent, and the communications exchanged. Finally, we summarise the revision of the guidelines, including their new section for reproducibility reviewers, and the status of the initiative “Reproducible Publications at AGILE Conferences” (https://reproducible-agile.github.io/initiative/), which we connect to related undertakings such as CODECHECK (Eglen et al., 2019). The AGILE Conference’s experiences may help other communities transition towards more open and reproducible research publications.
- Published
2020