Multiview learning with twin parametric margin SVM.
- Source : Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2024 Aug 07; Vol. 180, pp. 106598. Date of Electronic Publication: 2024 Aug 07.
- Publication Year : 2024
- Publisher : Ahead of Print
Abstract
- Multiview learning (MVL) seeks to leverage the benefits of diverse perspectives to complement each other, effectively extracting and utilizing the latent information within the dataset. Several twin support vector machine-based MVL (MvTSVM) models have been introduced and demonstrated outstanding performance in various learning tasks. However, MvTSVM-based models face significant challenges in the form of computational complexity due to four matrix inversions, the need to reformulate optimization problems in order to employ kernel-generated surfaces for handling non-linear cases, and the constraint of uniform noise assumption in the training data. Particularly in cases where the data possesses a heteroscedastic error structure, these challenges become even more pronounced. In view of the aforementioned challenges, we propose multiview twin parametric margin support vector machine (MvTPMSVM). MvTPMSVM constructs parametric margin hyperplanes corresponding to both classes, aiming to regulate and manage the impact of the heteroscedastic noise structure existing within the data. The proposed MvTPMSVM model avoids the explicit computation of matrix inversions in the dual formulation, leading to enhanced computational efficiency. We perform an extensive assessment of the MvTPMSVM model using benchmark datasets such as UCI, KEEL, synthetic, and Animals with Attributes (AwA). Our experimental results, coupled with rigorous statistical analyses, confirm the superior generalization capabilities of the proposed MvTPMSVM model compared to the baseline models. The source code of the proposed MvTPMSVM model is available at https://github.com/mtanveer1/MvTPMSVM.
- Competing Interests: Declaration of competing interest The authors of this work have no competing interests.
- (Copyright © 2024 Elsevier Ltd. All rights reserved.)
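To make the core idea concrete: a twin parametric-margin SVM fits one parametric-margin hyperplane per class instead of a single separating hyperplane, which is what lets it adapt to heteroscedastic noise. The following is a minimal, didactic numpy sketch of that single-view idea only: it trains the two hyperplanes by plain subgradient descent on a simplified primal objective, not via the paper's dual formulation, and it omits the multiview coupling entirely. All function names, hyperparameters, and the toy data are illustrative assumptions, not part of the MvTPMSVM implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: well-separated Gaussian blobs.
X_pos = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))
X_neg = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(50, 2))

def fit_margin_plane(X_own, X_other, sign, nu=0.1, c=1.0, lr=0.05, iters=500):
    """Fit one parametric-margin hyperplane f(x) = w.x + b by subgradient descent.

    sign=+1: the plane's own class should satisfy f(x) >= 0 (hinge penalty
    if not), while the other class's projections are pushed negative via a
    linear term weighted by nu. sign=-1 mirrors this for the other plane.
    """
    w = np.zeros(X_own.shape[1])
    b = 0.0
    for _ in range(iters):
        f_own = X_own @ w + b
        viol = (sign * f_own) <= 0.0          # own-class points violating the margin
        gw = (w + sign * nu * X_other.mean(axis=0)
              - sign * c * X_own[viol].sum(axis=0) / len(X_own))
        gb = sign * nu - sign * c * viol.mean()
        w -= lr * gw
        b -= lr * gb
    return w, b

w1, b1 = fit_margin_plane(X_pos, X_neg, sign=+1)   # positive-class margin plane
w2, b2 = fit_margin_plane(X_neg, X_pos, sign=-1)   # negative-class margin plane

def predict(X):
    # Combine the normalized signed distances to both parametric margins.
    f = (X @ w1 + b1) / np.linalg.norm(w1) + (X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(f >= 0, 1, -1)

X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(50), -np.ones(50)])
accuracy = (predict(X) == y).mean()
```

Because each class gets its own margin plane, the two planes can sit at different distances from the data, which is the mechanism the abstract invokes for handling class-dependent (heteroscedastic) noise; the paper's actual model additionally couples two views and solves the problem in the dual without matrix inversions.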
Details
- Language : English
- ISSN : 1879-2782
- Volume : 180
- Database : MEDLINE
- Journal : Neural networks : the official journal of the International Neural Network Society
- Publication Type : Academic Journal
- Accession number : 39173204
- Full Text : https://doi.org/10.1016/j.neunet.2024.106598