Enhancing interpretability of tree-based models for downstream salinity prediction: Decomposing feature importance using the Shapley additive explanation approach
- Source :
- Results in Engineering, Vol 23, Art. no. 102373 (2024)
- Publication Year :
- 2024
- Publisher :
- Elsevier, 2024.
Abstract
- To improve the interpretability of estimation processes in machine learning, we applied the Shapley additive explanation (SHAP) method to six tree-based models for predicting downstream salinity. Decision-tree-based adaptive boosting (AdaBoost) demonstrated the highest performance, followed by extreme gradient boosting (XGBoost). Global post-decomposition in XGBoost identified water level, a major cause of salinity intrusion in actual environments, as the main contributor to salinity intrusion, whereas an interaction effect between water temperature and sea level pressure was identified as the main contributor in AdaBoost. Local post-decomposition in XGBoost enabled visual interpretation of most of the salinity behaviors. However, the interaction effect between water level and sea level pressure identified as the main contributor by XGBoost was unlikely to drive the actual increase in salinity immediately after a heavy rainfall event. We conclude that highly accurate estimations are not necessarily interpretable or reliable, and that local post-decomposition can detect inconsistencies in estimation processes.
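The abstract's decomposition idea rests on Shapley values: each feature's contribution to a single prediction is its weighted average marginal contribution over all feature subsets, and pairwise interaction effects can then be separated from main effects. The sketch below is not the authors' pipeline (which applies TreeSHAP to fitted AdaBoost/XGBoost models); it is a minimal, from-scratch exact-Shapley computation on a hypothetical toy function with an explicit interaction term, just to illustrate the quantity being decomposed.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values by subset enumeration (feasible only for small n).
    v(S) evaluates f with features in S taken from x and the rest from baseline."""
    n = len(x)
    def v(S):
        return f([x[i] if i in S else baseline[i] for i in range(n)])
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Shapley kernel weight: |S|! (n-|S|-1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[i] += w * (v(set(S) | {i}) - v(set(S)))
    return phi

# Hypothetical toy model: two main effects plus a pairwise interaction term.
f = lambda z: 2 * z[0] + 3 * z[1] + z[0] * z[1]
phi = shapley_values(f, x=[1, 1], baseline=[0, 0])
print(phi)  # [2.5, 3.5]: the interaction contribution (1.0) is split evenly
```

The values sum to f(x) − f(baseline) = 6, the local-accuracy property that makes per-prediction ("local") decompositions like those in the abstract additive. SHAP interaction values go one step further and split each φᵢ into a main effect plus per-pair interaction terms, which is how an "interaction effect between water temperature and sea level pressure" can be reported separately from the features' main effects.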
- Subjects :
- XGBoost
AdaBoost
Net SHAP
Main effect
Interaction effect
Salinity intrusion
Technology
Details
- Language :
- English
- ISSN :
- 25901230
- Volume :
- 23
- Issue :
- 102373
- Database :
- Directory of Open Access Journals
- Journal :
- Results in Engineering
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.47c122031d564a8aa4475f54a6c68e23
- Document Type :
- article
- Full Text :
- https://doi.org/10.1016/j.rineng.2024.102373