Subjective vs. Objective Evaluation of Ontological Statements with Crowdsourcing.
- Source :
- Proceedings of the Association for Information Science & Technology; 2015, Vol. 52 Issue 1, p1-4, 4p
- Publication Year :
- 2015
-
Abstract
- In this paper we propose and test a methodology for evaluating the statements of a multi-viewpoint ontology via crowdsourcing. The workers' task was to assess each given statement as a true statement, a controversial viewpoint statement, or an error. Typically, in crowdsourcing experiments the workers are asked for their personal opinions on a given subject. In our case, however, their ability to objectively assess others' opinions is examined as well. We conducted two large-scale crowdsourcing experiments with about 750 ontological statements originating from diverse single-viewpoint ontologies. Our results show substantially higher accuracy in evaluation for the objective assessment approach than for the experiment based on personal opinions. [ABSTRACT FROM AUTHOR]
- Subjects :
- ONTOLOGY
- SENTIMENT analysis
- CROWDSOURCING
- SUBJECTIVITY
- OBJECTIVITY
Details
- Language :
- English
- ISSN :
- 2373-9231
- Volume :
- 52
- Issue :
- 1
- Database :
- Complementary Index
- Journal :
- Proceedings of the Association for Information Science & Technology
- Publication Type :
- Conference
- Accession number :
- 115251587
- Full Text :
- https://doi.org/10.1002/pra2.2015.145052010068