Compensating Data Shortages in Manufacturing with Monotonicity Knowledge
- Authors
Ingo Schmidt, Torsten Kraft, Lukas Morand, Martin von Kurnatowski, Jochen Schmid, Jan Schwientek, Anke Stoll, Patrick Link, and Rebekka Zache
- Subjects
FOS: Computer and information sciences, Computer Science - Machine Learning (cs.LG), FOS: Mathematics, Mathematics - Optimization and Control (math.OC), MSC: 68T30, 90C34, informed machine learning, monotonic regression, isotonic regression, shape constraints, expert knowledge, semi-infinite optimization, manufacturing
- Abstract
Optimization in engineering requires appropriate models. This article presents a regression method that enhances the predictive power of a model by exploiting expert knowledge in the form of shape constraints, more specifically, monotonicity constraints. Incorporating such information is particularly useful when the available data sets are small or do not cover the entire input space, as is often the case in manufacturing applications. The regression subject to the considered monotonicity constraints is set up as a semi-infinite optimization problem, and an adaptive solution algorithm is proposed. The method is applicable in multiple dimensions and can be extended to more general shape constraints. It is tested and validated on two real-world manufacturing processes, namely laser glass bending and press hardening of sheet metal. The resulting models both comply well with the expert's monotonicity knowledge and predict the training data accurately. For the sparse data sets considered in this work, the suggested approach yields lower root-mean-squared errors than comparative methods from the literature.
- Comment
22 pages, 6 figures
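The core idea described in the abstract, replacing the semi-infinite monotonicity constraint ("the model's derivative is nonnegative for *all* inputs in a region") by finitely many constraints on a discretization grid, can be sketched in a few lines. The following is a minimal illustrative example, not the authors' algorithm: it fits a quadratic model to a small, hypothetical data set under a nonnegativity constraint on the derivative, discretized on a fixed grid (the paper's adaptive method would instead refine the constraint points where violations occur).

```python
import numpy as np
from scipy.optimize import minimize

# Sparse, noisy observations from a process known by experts to be
# monotonically increasing (hypothetical values, for illustration only).
x_train = np.array([0.0, 0.15, 0.4, 0.9])
y_train = np.array([0.1, 0.3, 0.35, 0.8])

def model(w, x):
    """Quadratic regression model f(x) = w0 + w1*x + w2*x^2."""
    return w[0] + w[1] * x + w[2] * x ** 2

def dmodel_dx(w, x):
    """Derivative f'(x) = w1 + 2*w2*x, used in the monotonicity constraint."""
    return w[1] + 2.0 * w[2] * x

def loss(w):
    """Least-squares data-fit term."""
    return np.sum((model(w, x_train) - y_train) ** 2)

# Semi-infinite constraint f'(x) >= 0 for all x in [0, 1], approximated by
# finitely many inequality constraints on a discretization grid.
x_grid = np.linspace(0.0, 1.0, 21)
constraints = [{"type": "ineq", "fun": lambda w, xi=xi: dmodel_dx(w, xi)}
               for xi in x_grid]

result = minimize(loss, x0=np.zeros(3), constraints=constraints, method="SLSQP")
w_fit = result.x
```

With a quadratic model the derivative is linear in `x`, so constraints at the interval endpoints would already suffice; the grid formulation is kept here because it carries over unchanged to richer model classes, where an adaptive refinement of the constraint points (as proposed in the article) becomes important.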
- Published
- 2021