
Automating assessment of design exams: A case study of novelty evaluation

Authors :
Pradeep Yammiyavar
Debayan Dhar
Nandita Bhanja Chaudhuri
Source :
Expert Systems with Applications. 189:116108
Publication Year :
2022
Publisher :
Elsevier BV, 2022.

Abstract

An inherent criterion of evaluation in Design education is novelty. Novelty is a measure of the newness of a solution, evaluated through relative comparison with its frame of reference. Evaluating novelty is subjective and generally depends on experts' referential metrics, shaped by their knowledge and persuasion. Pedagogues compare and contrast solutions across cohorts of students in mass examinations for admission to Design schools. Because large numbers of students participate in such examinations, examiners face multiple challenges in subjective evaluation: 1) errors arising from stipulated timelines, 2) errors arising from prolonged working hours, and 3) errors arising from the stress of performing a repeated task at large scale. Pedagogues remain ever-inquisitive and vigilant about keeping the evaluation process consistent and accurate despite the monotony of the repeated task. To mitigate these challenges, a computational model is proposed for automating the evaluation of novelty in image-based solutions. The model is developed through mixed-method research: features for evaluating novelty are first identified through a survey study, and these features are then used to evaluate novelty and generate scores for image-based solutions using Computer Vision (CV) and Deep Learning (DL) techniques. The measured performance of the model reveals a negligible difference between expert scores and the scores of the proposed model. This comparative analysis of the proposed model against human experts confirms the competence of the devised model and would go a long way toward establishing pedagogues' trust by reducing error and stress during the evaluation process.
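The abstract does not specify the paper's scoring pipeline, but the core idea it states, novelty as relative comparison of a solution against its frame of reference, can be illustrated with a minimal sketch. Here the feature vectors, the distance-based formulation, and the normalisation are all illustrative assumptions, not the authors' actual CV/DL model:

```python
import numpy as np

def novelty_score(solution_vec, reference_vecs):
    """Score a solution's novelty as its mean distance from a
    reference set of prior solutions, scaled to [0, 1].

    Hypothetical formulation for illustration only; the paper's
    model derives its features via CV/DL and a survey study.
    """
    reference_vecs = np.asarray(reference_vecs, dtype=float)
    # Distance of this solution from each reference solution.
    dists = np.linalg.norm(reference_vecs - np.asarray(solution_vec, dtype=float), axis=1)
    # Normalise by the largest per-feature spread of the reference
    # set so scores remain comparable across cohorts.
    spread = max(np.ptp(reference_vecs, axis=0).max(), 1e-9)
    return float(np.clip(dists.mean() / spread, 0.0, 1.0))

# Toy check: a solution far from the cluster of reference
# solutions scores higher than one sitting inside it.
refs = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1]]
print(novelty_score([1.0, 1.0], refs) > novelty_score([0.05, 0.05], refs))
```

In a real pipeline the vectors would come from image embeddings rather than hand-picked coordinates; the relative-comparison structure is what mirrors the abstract's definition of novelty.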

Details

ISSN :
0957-4174
Volume :
189
Database :
OpenAIRE
Journal :
Expert Systems with Applications
Accession number :
edsair.doi...........c98ae27be3ef421ecb2c90d0120b6843
Full Text :
https://doi.org/10.1016/j.eswa.2021.116108