A Neural Relation Extraction Model for Distant Supervision in Counter-Terrorism Scenario
- Author
Jiaqi Hou, Xin Li, Zeyu Wei, Rongchen Zhu, Chongqiang Zhu, and Chao Zhang
- Subjects
relation extraction, distant supervision, BERT, BERT entity encoding, selective attention mechanism, intelligence analysis, big data, feature extraction, data modeling, text mining, data mining
- Abstract
Natural language processing (NLP) is well suited to the extensive, unstructured, complex, and diverse network big data that counter-terrorism work must handle. Within text analysis, quickly extracting the relationships between relevant entity pairs in terrorism-related text is the foundational and most critical step. Relation extraction lays the groundwork for constructing a terrorism knowledge graph (KG) and provides technical support for intelligence analysis and prediction. This paper takes distantly supervised relation extraction as its starting point, breaking the limitation of manual data annotation. Combining the Bidirectional Encoder Representations from Transformers (BERT) pre-training model with sentence-level attention over multiple instances, we propose a relation extraction model named BERT-att. Experiments show that our model is more efficient than the current leading baseline models and outperforms them on every evaluation metric. The model can be applied to the construction of a counter-terrorism knowledge graph and used in regional security risk assessment, terrorist event prediction, and other scenarios.
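As a concrete illustration of the mechanism the abstract describes, the sketch below shows sentence-level selective attention over a bag of instances, the standard way distant supervision down-weights noisy sentences that mention an entity pair but do not express the relation. This is not the authors' BERT-att implementation; all class names, shapes, and hyperparameters are illustrative assumptions, and the sentence encodings (which a BERT encoder would produce in practice) are stand-in tensors.

```python
# Minimal sketch of sentence-level selective attention for distantly
# supervised relation extraction. Assumed setup: each "bag" holds the
# encodings of every sentence mentioning the same entity pair.
import torch
import torch.nn as nn


class SelectiveAttention(nn.Module):
    """Weights the sentences in an entity-pair bag by how well each one
    matches a learned query vector for the candidate relation."""

    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        # One query vector per relation: the bag representation becomes a
        # relation-specific weighted average of its sentence encodings.
        self.relation_queries = nn.Parameter(torch.randn(num_relations, hidden_dim))
        self.classifier = nn.Linear(hidden_dim, num_relations)

    def forward(self, bag: torch.Tensor, relation_id: int) -> torch.Tensor:
        # bag: (num_sentences, hidden_dim) -- e.g. BERT encodings of all
        # sentences that mention the same entity pair.
        query = self.relation_queries[relation_id]   # (hidden_dim,)
        scores = bag @ query                         # (num_sentences,)
        alpha = torch.softmax(scores, dim=0)         # attention weights
        bag_repr = alpha @ bag                       # (hidden_dim,)
        return self.classifier(bag_repr)             # relation logits


# Usage: a bag of 5 sentence encodings of dimension 768 (BERT-base size).
model = SelectiveAttention(hidden_dim=768, num_relations=12)
logits = model(torch.randn(5, 768), relation_id=3)
print(logits.shape)  # torch.Size([12])
```

The design point is that noisy instances receive small attention weights, so a single mislabeled sentence in the bag contributes little to the bag representation used for classification.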
- Published
2020