Maximum entropy techniques for exploiting syntactic, semantic and collocational dependencies in language modeling
- Source :
- Computer Speech & Language, 14:355-372
- Publication Year :
- 2000
- Publisher :
- Elsevier BV, 2000.
Abstract
- A new statistical language model is presented which combines collocational dependencies with two important sources of long-range statistical dependence: the syntactic structure and the topic of a sentence. These dependencies, or constraints, are integrated using the maximum entropy technique. Substantial improvements over a trigram model are demonstrated in both perplexity and speech recognition accuracy on the Switchboard task. A detailed analysis of the performance of this language model is provided in order to characterize the manner in which it performs better than a standard N-gram model. It is shown that topic dependencies are most useful in predicting words which are semantically related by the subject matter of the conversation. Syntactic dependencies, on the other hand, are found to be most helpful in positions where the best predictors of the following word are not within N-gram range due to an intervening phrase or clause. It is also shown that these two methods individually enhance an N-gram model in complementary ways, and the overall improvement from their combination is nearly additive.
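The abstract describes a log-linear (maximum entropy) model in which each knowledge source contributes feature functions, and the combined distribution takes the form P(w | h) ∝ exp(Σᵢ λᵢ fᵢ(h, w)). The Python sketch below illustrates that combination; it is not the authors' implementation, and the vocabulary, feature templates, history fields, and weights are toy assumptions (in the paper the weights would be trained, e.g. by iterative scaling, to match empirical feature expectations).

```python
import math

# Minimal sketch of a conditional maximum entropy language model:
#   P(w | h) = exp(sum_i lambda_i * f_i(h, w)) / Z(h)
# The history h bundles the N-gram context, the conversation topic, and
# an exposed syntactic head, mirroring the three knowledge sources named
# in the abstract. All names and values below are illustrative.

VOCAB = ["the", "stock", "market", "fell", "sharply"]

def features(h, w):
    """Active binary feature functions f_i(h, w) for candidate word w."""
    return [
        ("bigram", h["prev"], w),                # collocational
        ("trigram", h["prev2"], h["prev"], w),   # collocational
        ("topic", h["topic"], w),                # semantic / topic
        ("head", h["head"], w),                  # long-range syntactic
    ]

def prob(w, h, weights):
    """P(w | h) under the log-linear model with weight dict `weights`."""
    def score(v):
        return math.exp(sum(weights.get(f, 0.0) for f in features(h, v)))
    z = sum(score(v) for v in VOCAB)             # normalizer Z(h)
    return score(w) / z

# Toy weights; in practice each lambda is estimated so the model matches
# the empirical expectation of its feature on training data.
weights = {
    ("bigram", "stock", "market"): 1.2,
    ("topic", "finance", "market"): 0.8,
    ("head", "fell", "sharply"): 0.5,
}
h = {"prev2": "the", "prev": "stock", "topic": "finance", "head": "fell"}
print(round(prob("market", h, weights), 3))      # boosted by bigram + topic
```

Because all feature families share one normalizer Z(h), topic and syntactic features can raise a word's probability precisely when the N-gram context is uninformative, which is the complementary, nearly additive behavior the abstract reports.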
- Subjects :
- Phrase
Perplexity
Computer science
Speech recognition
Principle of maximum entropy
Theoretical Computer Science
Human-Computer Interaction
Trigram
Language model
Artificial intelligence
Software
Sentence
Natural language processing
Details
- ISSN :
- 0885-2308
- Volume :
- 14
- Database :
- OpenAIRE
- Journal :
- Computer Speech & Language
- Accession number :
- edsair.doi...........e550368a38749db6b70860a036b02136
- Full Text :
- https://doi.org/10.1006/csla.2000.0149