HAT4RD: Hierarchical Adversarial Training for Rumor Detection in Social Media.
- Source :
- Sensors (Basel, Switzerland) [Sensors (Basel)] 2022 Sep 02; Vol. 22 (17). Date of Electronic Publication: 2022 Sep 02.
- Publication Year :
- 2022
Abstract
- With the development of social media, social communication has changed. While this facilitates people's communication and access to information, it also provides an ideal platform for spreading rumors. In normal or critical situations, rumors can affect people's judgment and even endanger social security. However, natural language is high-dimensional and sparse, and the same rumor may be expressed in hundreds of ways on social media. As such, the robustness and generalization of current rumor detection models are in question. We propose a novel hierarchical adversarial training method for rumor detection (HAT4RD) on social media. Specifically, HAT4RD uses gradient ascent to add adversarial perturbations to the embedding layers of the post-level and event-level modules in order to deceive the detector. At the same time, the detector uses stochastic gradient descent to minimize the adversarial risk and learn a more robust model. In this way, the post-level and event-level sample spaces are enhanced, and we verified the robustness of our model under a variety of adversarial attacks. Moreover, visualization experiments indicate that the proposed model drifts into an area with a flat loss landscape, thereby leading to better generalization. We evaluate our proposed method on three public rumor datasets from two commonly used social platforms (Twitter and Weibo). Our experimental results demonstrate that our model achieves better results than state-of-the-art methods.
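- The abstract describes an embedding-level adversarial training loop: perturb the embeddings in the gradient-ascent direction, then update the detector to also minimize the loss on the perturbed inputs. The following is a minimal, hedged sketch of that general recipe (an FGM-style perturbation on a single embedding layer), not the authors' released HAT4RD code; the toy model, the epsilon value, and the use of one embedding table instead of separate post-level and event-level modules are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TinyDetector(nn.Module):
    """Illustrative stand-in for a rumor detector with an embedding layer."""
    def __init__(self, vocab_size=1000, emb_dim=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, 64, batch_first=True)
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, token_ids):
        emb = self.embedding(token_ids)
        _, h = self.encoder(emb)
        return self.classifier(h[-1])

def adversarial_step(model, optimizer, token_ids, labels, epsilon=1.0):
    """One training step: clean loss + loss on adversarially perturbed embeddings.
    HAT4RD applies this idea at both post-level and event-level embeddings;
    here a single embedding table is perturbed for brevity (assumption)."""
    loss_fn = nn.CrossEntropyLoss()

    # 1) Clean forward/backward to obtain gradients on the embedding weights.
    optimizer.zero_grad()
    loss = loss_fn(model(token_ids), labels)
    loss.backward()

    emb = model.embedding.weight
    grad_norm = emb.grad.norm()
    if grad_norm > 0:
        # 2) Gradient-ascent perturbation of the embeddings (worst-case direction).
        delta = epsilon * emb.grad / grad_norm
        emb.data.add_(delta)

        # 3) Adversarial forward/backward: accumulate gradients so the parameter
        #    update also minimizes the loss on the perturbed embeddings.
        adv_loss = loss_fn(model(token_ids), labels)
        adv_loss.backward()

        # 4) Restore the original embeddings before applying the update.
        emb.data.sub_(delta)

    # 5) Stochastic gradient step minimizing clean + adversarial risk.
    optimizer.step()
    return loss.item()
```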
- Subjects :
- Communication
Humans
Social Media
Details
- Language :
- English
- ISSN :
- 1424-8220
- Volume :
- 22
- Issue :
- 17
- Database :
- MEDLINE
- Journal :
- Sensors (Basel, Switzerland)
- Publication Type :
- Academic Journal
- Accession number :
- 36081111
- Full Text :
- https://doi.org/10.3390/s22176652