
Computing $k$-means in mixed precision

Authors: Carson, Erin; Chen, Xinye; Liu, Xiaobo
Publication Year: 2024

Abstract

The $k$-means algorithm is one of the most popular and critical techniques in data mining and machine learning, and it has achieved significant success in numerous science and engineering domains. Computing $k$-means to a global optimum is NP-hard in Euclidean space, yet a variety of efficient heuristic algorithms, such as Lloyd's algorithm, converge to a local optimum, albeit with superpolynomial worst-case complexity. Motivated by the emergence and prominence of mixed precision capabilities in hardware, a current trend is to develop low- and mixed-precision variants of algorithms in order to reduce runtime and energy consumption. In this paper we study the numerical stability of Lloyd's $k$-means algorithm and, in particular, confirm the stability of the widely used distance computation formula. We propose a mixed-precision framework for $k$-means computation and investigate the effects of low-precision distance computation within the framework. Through extensive simulations on various data clustering and image segmentation tasks, we verify the applicability and robustness of the mixed-precision $k$-means method. We find that, in $k$-means computation, normalized data is more tolerant of reduced precision in the distance computation, whereas for nonnormalized data more care is needed when using reduced precision, mainly to avoid overflow. Our study demonstrates the potential of mixed precision to accelerate $k$-means computation and offers insights into other distance-based machine learning methods.
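
The overflow caveat for nonnormalized data is easy to reproduce. Below is a minimal NumPy sketch, not the authors' framework: it evaluates the widely used expansion $\|x - c\|^2 = \|x\|^2 - 2x^\top c + \|c\|^2$ entirely in half precision (fp16), whose largest finite value is about $65504$. The helper name pairwise_sq_dists, the data scale, and the fp16 choice are illustrative assumptions.

import numpy as np

def pairwise_sq_dists(X, C, dtype=np.float16):
    # Squared Euclidean distances via the widely used expansion
    #   ||x - c||^2 = ||x||^2 - 2 x.c + ||c||^2,
    # with all arithmetic carried out in the reduced precision `dtype`.
    Xl, Cl = X.astype(dtype), C.astype(dtype)
    x2 = np.sum(Xl * Xl, axis=1, dtype=dtype)   # ||x||^2, one per point
    c2 = np.sum(Cl * Cl, axis=1, dtype=dtype)   # ||c||^2, one per center
    cross = Xl @ Cl.T                           # inner products x.c
    # Rounding can make entries slightly negative; clamp at zero.
    return np.maximum(x2[:, None] - 2.0 * cross + c2[None, :], 0)

rng = np.random.default_rng(0)
X = rng.normal(scale=200.0, size=(1000, 8))     # nonnormalized: ||x||^2 far exceeds the fp16 range
C = X[rng.choice(len(X), size=5, replace=False)]

# On the raw data the fp16 intermediates overflow ...
d_raw = pairwise_sq_dists(X, C)
print("non-finite on raw data:", (~np.isfinite(d_raw)).any())

# ... while normalizing the data first keeps everything in range.
Xn = (X - X.mean(axis=0)) / X.std(axis=0)
Cn = Xn[rng.choice(len(Xn), size=5, replace=False)]
print("non-finite after normalization:", (~np.isfinite(pairwise_sq_dists(Xn, Cn))).any())

Running this prints non-finite distances for the raw data but none after normalization, consistent with the abstract's finding that normalized data is more tolerant of reduced precision.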

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2407.12208
Document Type: Working Paper