Evaluating FAIR maturity through a scalable, automated, community-governed framework
- Authors
Wilkinson, Mark D.; Dumontier, Michel; Sansone, Susanna-Assunta; Bonino da Silva Santos, Luiz Olavo; Prieto, Mario; Batista, Dominique; McQuilton, Peter; Kuhn, Tobias; Rocca-Serra, Philippe; Crosas, Mercè; Schultes, Erik
- Subjects
Statistics and Probability; Technology; Computer Science; Publication Characteristics; Biology; Library and Information Sciences; Education; Digital Resources; Web Applications; Data Science; Research Data; Open Source; Scalability; Computer Science Applications; Statistics, Probability and Uncertainty; Information Systems
- Abstract
Transparent evaluations of FAIRness are increasingly required by a wide range of stakeholders, from scientists to publishers, funding agencies and policy makers. We propose a scalable, automatable framework to evaluate digital resources that encompasses measurable indicators, open source tools, and participation guidelines, which come together to accommodate domain relevant community-defined FAIR assessments. The components of the framework are: (1) Maturity Indicators - community-authored specifications that delimit a specific automatically-measurable FAIR behavior; (2) Compliance Tests - small Web apps that test digital resources against individual Maturity Indicators; and (3) the Evaluator, a Web application that registers, assembles, and applies community-relevant sets of Compliance Tests against a digital resource, and provides a detailed report about what a machine “sees” when it visits that resource. We discuss the technical and social considerations of FAIR assessments, and how this translates to our community-driven infrastructure. We then illustrate how the output of the Evaluator tool can serve as a roadmap to assist data stewards to incrementally and realistically improve the FAIRness of their resources.
- Published
2019