Improving Automatic English Writing Assessment Using Regression Trees and Error-Weighting
- Source :
- IEICE Transactions on Information and Systems, pp. 2281-2290
- Publication Year :
- 2010
- Publisher :
- Institute of Electronics, Information and Communications Engineers (IEICE), 2010.
Abstract
- The proposed automated scoring system for English writing tests provides an assessment result, including a score and diagnostic feedback, to test-takers without human effort. The system analyzes an input sentence and detects errors related to spelling, syntax, and content similarity. The scoring model adopts a statistical approach, a regression tree, and in general calculates a score from the count and types of automatically detected errors. Accordingly, a system that detects errors more accurately also scores tests more accurately. The accuracy of error detection, however, cannot be fully guaranteed, for reasons such as parsing failures, incomplete knowledge bases, and the inherent ambiguity of natural language. In this paper, we introduce an error-weighting technique, similar to the term-weighting widely used in information retrieval, which is applied to judge the reliability of the errors detected by the system. The score calculated with this technique is shown to be more accurate than the score without it.
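The abstract's analogy between error-weighting and term-weighting can be sketched as follows. This is a minimal illustration, not the paper's actual method: the error types, the toy corpus, and the IDF-style weighting formula are all assumptions chosen to show the idea that error types detected less reliably or less frequently across a reference set can be weighted differently when computing a penalty.

```python
import math

# Hypothetical detected-error counts per essay ({error_type: count}).
# Both the error types and this tiny reference corpus are illustrative.
corpus = [
    {"spelling": 3, "syntax": 1},
    {"spelling": 1, "content": 2},
    {"syntax": 2, "content": 1},
    {"spelling": 2, "syntax": 1, "content": 1},
]

def error_weights(essays):
    """IDF-style weight per error type, by analogy with term-weighting
    in information retrieval: error types that occur in fewer essays
    of the reference set receive a higher weight."""
    n = len(essays)
    types = {t for essay in essays for t in essay}
    return {
        t: math.log(n / sum(1 for essay in essays if t in essay)) + 1.0
        for t in types
    }

def weighted_error_score(essay, weights):
    """Total penalty: each detected-error count scaled by its type weight."""
    return sum(count * weights.get(t, 1.0) for t, count in essay.items())

weights = error_weights(corpus)
penalty = weighted_error_score({"spelling": 1, "syntax": 2}, weights)
```

In the paper the weighted error features feed a regression tree that maps them to a score; here the weighted sum merely stands in for those features.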
- Subjects :
- Parsing
Syntax (programming languages)
Writing assessment
Computer science
Decision tree
Machine learning
Spelling
Weighting
Artificial Intelligence
Hardware and Architecture
Computer Vision and Pattern Recognition
Electrical and Electronic Engineering
Software
Natural language
Reliability (statistics)
Sentence
Details
- ISSN :
- 1745-1361 and 0916-8532
- Database :
- OpenAIRE
- Journal :
- IEICE Transactions on Information and Systems
- Accession number :
- edsair.doi...........524282ad1e3c054fe3d2e3e8d49e77ca