Information technology for predicting the hysteresis behavior of shape memory alloys based on a stacking ensemble machine learning model

Dmytro Tymoshchuk
Oleh Yasniy

Abstract

Shape Memory Alloys (SMAs) are characterized by nonlinear hysteretic behavior on the stress–strain (σ–ε) diagram, where the loop area determines the amount of energy dissipated per cycle. In this work, an ensemble Stacking machine learning model was developed to predict the hysteresis behavior of SMAs under cyclic loading at different frequencies (0.5, 1, 3, and 5 Hz). The model was constructed using experimental data from 100–250 loading cycles. Random Forest, Gradient Boosting, Extra Trees, k-Nearest Neighbors (kNN), Support Vector Regression (SVR), and Multilayer Perceptron (MLP) were employed as base algorithms. The ElasticNet model was selected as the meta-learner and tuned using GridSearchCV with GroupKFold validation. This approach combined ensemble stability with adaptive selection of the most informative predictions from the base models. The obtained results showed high accuracy in reproducing the stress–strain relationship: R² > 0.995, MSE < 0.0007, MAE < 0.02, and MAPE < 1.3 % on the test data. Validation on independent cycles 251 and 300 confirmed the model's generalization ability, achieving R² > 0.974, MSE < 0.007, MAE < 0.06, and MAPE < 4.8 %. The interpretability of the model was provided by the SHAP method, which quantifies the contribution of each input feature to the prediction. It was found that Stress is the dominant factor influencing the prediction, while UpDown defines the loading–unloading phase and Cycle reflects the accumulation of cyclic effects. The developed ensemble Stacking model is an integral component of an information technology framework for predicting the hysteresis behavior of shape memory alloys using machine learning methods. The proposed approach provides not only high prediction accuracy but also physically grounded interpretability of the results.
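
To make the described pipeline concrete, the sketch below assembles a stacking regressor along the lines of the abstract: the six base learners, an ElasticNet meta-learner tuned by GridSearchCV with GroupKFold splits grouped by loading cycle, the four reported error metrics, and a SHAP attribution step. This is a minimal illustration built on scikit-learn and the shap package, not the authors' exact configuration; the file name sma_cycles.csv, the column names Stress, UpDown, Cycle and Strain, and all hyperparameter values are assumptions made for the example.

```python
# Minimal sketch of the stacking pipeline described in the abstract.
# Data layout, column names, and hyperparameter grids are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import (StackingRegressor, RandomForestRegressor,
                              GradientBoostingRegressor, ExtraTreesRegressor)
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import ElasticNet
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GroupKFold, GridSearchCV
from sklearn.metrics import (r2_score, mean_squared_error,
                             mean_absolute_error, mean_absolute_percentage_error)

# Hypothetical data layout: one row per sampled point of a sigma-epsilon loop.
df = pd.read_csv("sma_cycles.csv")          # assumed file name
X = df[["Stress", "UpDown", "Cycle"]]       # input features named in the abstract
y = df["Strain"]                            # predicted response
groups = df["Cycle"]                        # group by loading cycle for GroupKFold

# Base learners listed in the abstract; scale-sensitive models get a scaler.
base_estimators = [
    ("rf",  RandomForestRegressor(n_estimators=300, random_state=42)),
    ("gb",  GradientBoostingRegressor(random_state=42)),
    ("et",  ExtraTreesRegressor(n_estimators=300, random_state=42)),
    ("knn", make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5))),
    ("svr", make_pipeline(StandardScaler(), SVR(kernel="rbf"))),
    ("mlp", make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(64, 64),
                                       max_iter=2000, random_state=42))),
]

# ElasticNet meta-learner; alpha and l1_ratio are tuned with GridSearchCV,
# and GroupKFold keeps all points of one cycle inside a single fold.
stack = StackingRegressor(estimators=base_estimators,
                          final_estimator=ElasticNet(max_iter=10000),
                          cv=5)
param_grid = {
    "final_estimator__alpha":    [1e-4, 1e-3, 1e-2, 1e-1],
    "final_estimator__l1_ratio": [0.1, 0.5, 0.9],
}
search = GridSearchCV(stack, param_grid, cv=GroupKFold(n_splits=5),
                      scoring="r2", n_jobs=-1)
search.fit(X, y, groups=groups)

# The same error metrics reported in the paper (computed in-sample here for
# brevity; the paper evaluates a held-out test split and cycles 251 and 300).
y_pred = search.predict(X)
print("R2  :", r2_score(y, y_pred))
print("MSE :", mean_squared_error(y, y_pred))
print("MAE :", mean_absolute_error(y, y_pred))
print("MAPE:", mean_absolute_percentage_error(y, y_pred))

# SHAP attribution for the fitted stack. KernelExplainer is model-agnostic and
# therefore works with a stacked ensemble, but it is slow, so a small
# background sample and a subset of rows keep the example tractable.
import shap
background = shap.sample(X, 100, random_state=42)
explainer = shap.KernelExplainer(search.best_estimator_.predict, background)
shap_values = explainer.shap_values(X.iloc[:200])
shap.summary_plot(shap_values, X.iloc[:200])
```

Grouping the cross-validation folds by cycle, as in the abstract, prevents points from the same hysteresis loop from appearing in both the training and validation folds, which would otherwise inflate the apparent accuracy.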

References

1. Sharma, K., & Srinivas, G. (2020). Flying smart: Smart materials used in aviation industry. Materials Today: Proceedings, 27, 244–250. https://doi.org/10.1016/j.matpr.2019.10.115

2. Niu, X., Yao, X., & Dong, E. (2025). Design and control of bio-inspired joints for legged robots driven by shape memory alloy wires. Biomimetics, 10(6), 378. https://doi.org/10.3390/biomimetics10060378

3. Schmelter, T., Bade, L., & Kuhlenkötter, B. (2024). A two-finger gripper actuated by shape memory alloy for applications in automation technology with minimized installation space. Actuators, 13(10), 425. https://doi.org/10.3390/act13100425

4. Riccio, A., Sellitto, A., Ameduri, S., Concilio, A., & Arena, M. (2021). Shape memory alloys (SMA) for automotive applications and challenges. In Shape Memory Alloy Engineering (pp. 785–808). Elsevier. https://doi.org/10.1016/b978-0-12-819264-1.00024-8

5. Zhang, H., Zhao, L., Li, A., & Xu, S. (2024). Design and hysteretic performance analysis of a novel multi-layer self-centering damper with shape memory alloy. Buildings, 14(2), 483. https://doi.org/10.3390/buildings14020483

6. Iasnii, V., Krechkovska, H., Budz, V., Student, O., & Lapusta, Y. (2024). Frequency effect on low‐cycle fatigue behavior of pseudoelastic NiTi alloy. Fatigue & Fracture of Engineering Materials & Structures. https://doi.org/10.1111/ffe.14331

7. Tymoshchuk, D., Yasniy, O., Maruschak, P., Iasnii, V., & Didych, I. (2024). Loading Frequency Classification in Shape Memory Alloys: A Machine Learning Approach. Computers, 13(12), 339. https://doi.org/10.3390/computers13120339

8. IBM. (n.d.). What is Explainable AI (XAI)? Retrieved from https://www.ibm.com/think/topics/explainable-ai

9. Hmede, R., Chapelle, F., & Lapusta, Y. (2022). Review of neural network modeling of shape memory alloys. Sensors, 22(15), 5610. https://doi.org/10.3390/s22155610

10. He, S., Wang, Y., Zhang, Z., Xiao, F., Zuo, S., Zhou, Y., Cai, X., & Jin, X. (2023). Interpretable machine learning workflow for evaluation of the transformation temperatures of TiZrHfNiCoCu high entropy shape memory alloys. Materials & Design, 225, 111513. https://doi.org/10.1016/j.matdes.2022.111513

11. Sridharan, S., Velayutham, R., Behera, S., & Murugesan, J. (2025). Machine Learning-Based Temperature-Induced Phase Transformation Temperature Prediction of Ti-Based High-Temperature Shape Memory Alloy. Journal of Materials Engineering and Performance. https://doi.org/10.1007/s11665-025-11236-z

12. Thiercelin, L., Peltier, L., & Meraghni, F. (2024). Physics-informed machine learning prediction of the martensitic transformation temperature for the design of “NiTi-like” high entropy shape memory alloys. Computational Materials Science, 231, 112578. https://doi.org/10.1016/j.commatsci.2023.112578

13. Lam, T.-N., Jiang, J., Hsu, M.-C., Tsai, S.-R., Luo, M.-Y., Hsu, S.-T., Lee, W.-J., Chen, C.-H., & Huang, E.-W. (2024). Predictions of Lattice Parameters in NiTi High-Entropy Shape-Memory Alloys Using Different Machine Learning Models. Materials, 17(19), 4754. https://doi.org/10.3390/ma17194754

14. Liu, C., & Su, H. (2024). Machine learning aided prediction of martensite transformation temperature of NiTi-based shape memory alloy. Materials Today Communications, 41, 110720. https://doi.org/10.1016/j.mtcomm.2024.110720

15. Iasnii, V., Bykiv, N., Yasniy, O., & Budz, V. (2022). Methodology and some results of studying the influence of frequency on functional properties of pseudoelastic SMA. Scientific Journal of the Ternopil National Technical University, 107(3), 45–50. https://doi.org/10.33108/visnyk_tntu2022.03.045

16. StackingRegressor. (n.d.). Retrieved from https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.StackingRegressor.html

17. RandomForestRegressor. (n.d.). Retrieved from https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestRegressor.html

18. Clark, B., & Lee, F. (n.d.). What is Gradient Boosting? IBM. Retrieved from https://www.ibm.com/think/topics/gradient-boosting

19. ExtraTreesRegressor. (n.d.). Retrieved from https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.ExtraTreesRegressor.html

20. Nearest Neighbors. (n.d.). Retrieved from https://scikit-learn.org/stable/modules/neighbors.html

21. Support Vector Machines. (n.d.). Retrieved from https://scikit-learn.org/stable/modules/svm.html

22. Haykin, S. (2009). Neural networks and learning machines (3rd ed.). Hamilton, ON, Canada: Prentice Hall.

23. ElasticNet. (n.d.). Retrieved from https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.ElasticNet.html

24. Metrics and scoring: quantifying the quality of predictions. (n.d.). Retrieved from https://scikit-learn.org/stable/modules/model_evaluation.html#model-evaluation

25. shap/shap: A game theoretic approach to explain the output of any machine learning model. (n.d.). GitHub. Retrieved from https://github.com/shap/shap