Detecting and understanding real-world differential performance bugs in machine learning libraries
- Source :
- ISSTA
- Publication Year :
- 2020
- Publisher :
- ACM, 2020.
Abstract
- Programming errors that degrade the performance of systems are widespread, yet there is little tool support for analyzing these bugs. We present a method based on differential performance analysis: we find inputs for which the performance varies widely despite having the same size. To ensure that the differences in performance are robust (i.e., that they also hold for large inputs), we compare the performance not of single inputs but of classes of inputs, where each class contains similar inputs parameterized by their size. Each class is thus represented by a performance function from input size to performance. Importantly, we also provide an explanation for why the performance differs, in a form that can readily be used to fix a performance bug. The two main phases of our method are discovery with fuzzing and explanation with decision tree classifiers, each supported by clustering. First, we propose an evolutionary fuzzing algorithm to generate inputs. The unique challenge of this fuzzing task is that we need not only the input class with the worst performance but a set of classes exhibiting differential performance. We use clustering to merge similar input classes, which significantly improves the efficiency of our fuzzer. Second, we explain the differential performance in terms of program inputs and internals. We adapt discriminant learning approaches with clustering and decision trees to localize suspicious code regions. We applied our techniques to a set of applications. On a set of micro-benchmarks, we show that our approach outperforms state-of-the-art fuzzers in finding inputs that characterize differential performance. On a set of case studies, we discover and explain multiple performance bugs in popular machine learning frameworks. Four of these bugs, reported first in this paper, have since been fixed by the developers.
  To appear in ISSTA '20; 11 pages, 8 figures.
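The abstract represents each input class by a performance function from input size to performance and uses clustering to merge similar classes. A minimal sketch of that idea, assuming linear performance functions and a simple slope-based merge (all names, data, and the tolerance are illustrative, not the paper's actual algorithm):

```python
# Sketch: fit a linear performance function (runtime vs. input size)
# per input class, then greedily merge classes with similar slopes.
# Classes left in separate clusters exhibit differential performance.

def fit_slope(sizes, runtimes):
    """Least-squares slope of runtime as a function of input size."""
    n = len(sizes)
    mx = sum(sizes) / n
    my = sum(runtimes) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(sizes, runtimes))
    var = sum((x - mx) ** 2 for x in sizes)
    return cov / var

def cluster_by_slope(class_slopes, tol=0.5):
    """Greedily group classes whose slopes differ by less than tol."""
    clusters = []
    for name, slope in class_slopes.items():
        for cluster in clusters:
            if abs(cluster["slope"] - slope) < tol:
                cluster["members"].append(name)
                break
        else:
            clusters.append({"slope": slope, "members": [name]})
    return clusters

# Two classes with near-identical linear cost, one much steeper:
sizes = [10, 20, 30, 40]
class_slopes = {
    "sorted_input": fit_slope(sizes, [10, 20, 30, 40]),
    "random_input": fit_slope(sizes, [11, 21, 31, 41]),
    "adversarial":  fit_slope(sizes, [100, 400, 900, 1600]),
}
clusters = cluster_by_slope(class_slopes)
# The two cheap classes merge into one cluster; the adversarial
# class remains separate with a far larger slope, flagging a
# same-size, very-different-performance pair of input classes.
```

This mirrors only the clustering step; the paper's fuzzer additionally evolves the inputs themselves to maximize the performance spread between classes.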
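The explanation phase uses decision trees to learn which input properties discriminate slow runs from fast ones. A hedged sketch of that idea with a single-split tree (a stump); the features, data, and threshold search are invented for illustration and are much simpler than the paper's tool:

```python
# Sketch: learn one decision-tree split that best separates inputs
# labeled "slow" from those labeled "fast". The chosen feature points
# the developer toward the input property driving the slowdown.

def best_split(rows, labels):
    """Return (feature_index, threshold, errors) minimizing
    misclassifications when predicting "slow" iff value >= threshold."""
    n_features = len(rows[0])
    best = (None, None, len(labels) + 1)
    for f in range(n_features):
        for threshold in sorted({r[f] for r in rows}):
            errors = sum(
                (r[f] >= threshold) != (lab == "slow")
                for r, lab in zip(rows, labels)
            )
            if errors < best[2]:
                best = (f, threshold, errors)
    return best

# Hypothetical features per input: (num_duplicate_keys, input_size).
# All inputs have the same size, yet two are slow.
rows = [(0, 100), (1, 100), (90, 100), (95, 100)]
labels = ["fast", "fast", "slow", "slow"]
feature, threshold, errors = best_split(rows, labels)
# The stump selects the duplicate-key count, not the size, as the
# discriminating feature -- an explanation a developer can act on.
```

A real decision-tree learner would recurse on both sides of the split and use an impurity measure such as Gini; the stump above keeps only the part that matters for the explanation idea.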
- Subjects :
- FOS: Computer and information sciences
  Computer Science - Machine Learning (cs.LG)
  Computer Science - Software Engineering (cs.SE)
  Computer Science - Performance (cs.PF)
  ACM classification: D.2.5
  Machine learning
  Decision tree
  Parameterized complexity
  Fuzz testing
  Cluster analysis
  Debugging
  Artificial intelligence
Details
- Database :
- OpenAIRE
- Journal :
- Proceedings of the 29th ACM SIGSOFT International Symposium on Software Testing and Analysis
- Accession number :
- edsair.doi.dedup.....175e59d64c3569ce6a9db7108f876366