1. Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture
- Authors
Shijin Duan, Yejia Liu, Gaowen Liu, Ramana Rao Kompella, Shaolei Ren, and Xiaolin Xu
- Subjects
Computer Science - Machine Learning
- Abstract
Vector Symbolic Architecture (VSA) is emerging in machine learning due to its efficiency, but it is hindered by issues of hyperdimensionality and accuracy. As a promising mitigation, the Low-Dimensional Computing (LDC) method reduces the vector dimension by ~100 times while maintaining accuracy, by employing gradient-based optimization. Despite its potential, LDC optimization for VSA is still underexplored. Our investigation into vector updates underscores the importance of stable, adaptive dynamics in LDC training. We also reveal the overlooked yet critical roles of batch normalization (BN) and knowledge distillation (KD) in standard approaches: besides boosting accuracy, BN adds no computational overhead during inference, and KD significantly enhances inference confidence. Through extensive experiments and ablation studies across multiple benchmarks, we provide a thorough evaluation of our approach and extend the interpretation to binary neural network (BNN) optimization, which is similar to LDC and previously unaddressed in the BNN literature.
- Comment
10 pages, 2 figures. Accepted at CPAL 2025
- Published
2025
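
The abstract's claim that BN adds no computational overhead during inference follows from the standard BN-folding identity: at inference, BN's per-channel scale and shift can be absorbed into the preceding linear layer. Below is a minimal NumPy sketch of that folding; all names, shapes, and values are illustrative assumptions, not the paper's code.

```python
# Minimal sketch (assumption: a linear layer followed by BN, as is
# typical; this is NOT the paper's implementation).
import numpy as np

def fold_bn_into_linear(W, b, gamma, beta, mean, var, eps=1e-5):
    """Return (W', b') such that BN(x @ W.T + b) == x @ W'.T + b'."""
    scale = gamma / np.sqrt(var + eps)    # per-output-channel BN scale
    W_folded = W * scale[:, None]         # scale each output row of W
    b_folded = (b - mean) * scale + beta  # absorb BN shift into the bias
    return W_folded, b_folded

rng = np.random.default_rng(0)
d_in, d_out = 64, 10                      # illustrative low-dimensional sizes
W = rng.normal(size=(d_out, d_in)); b = rng.normal(size=d_out)
gamma = rng.uniform(0.5, 1.5, d_out); beta = rng.normal(size=d_out)
mean = rng.normal(size=d_out); var = rng.uniform(0.5, 1.5, d_out)

x = rng.normal(size=(4, d_in))
y_bn = (x @ W.T + b - mean) / np.sqrt(var + 1e-5) * gamma + beta
Wf, bf = fold_bn_into_linear(W, b, gamma, beta, mean, var)
assert np.allclose(y_bn, x @ Wf.T + bf)   # identical output, zero BN cost
```

The assertion confirms the folded layer reproduces the BN output exactly, so the deployed model pays no extra cost for BN.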
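For the KD claim, a common formulation is the Hinton-style distillation loss, which mixes a temperature-softened KL term against a teacher with the usual cross-entropy; the sketch below shows that generic loss. The temperature `T` and weight `alpha` are illustrative hyperparameters, and nothing here is taken from the paper's training setup.

```python
# Minimal sketch of a generic knowledge-distillation loss (assumed
# Hinton-style formulation, not necessarily the paper's variant).
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """alpha * T^2 * KL(teacher_T || student_T) + (1 - alpha) * CE(labels)."""
    p_t = softmax(teacher_logits / T)                    # softened teacher
    log_p_s = np.log(softmax(student_logits / T) + 1e-12)
    kl = (p_t * (np.log(p_t + 1e-12) - log_p_s)).sum(axis=-1).mean()
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels]
                 + 1e-12).mean()                          # hard-label term
    return alpha * (T ** 2) * kl + (1 - alpha) * ce
```

Matching the softened teacher distribution pushes the student toward better-calibrated, higher-margin outputs, which is consistent with the abstract's observation that KD enhances inference confidence.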