Affiliations:
1. Harbin Normal University, Harbin, China
2. High-tech Institute, Qingzhou, China
Abstract
Assisted medical decision-making plays a key role in providing accurate and reliable medical advice. However, medical decision-making is often accompanied by various uncertainties. The belief rule base (BRB) has strong nonlinear modeling capability and can handle uncertainty well, but it suffers from combinatorial explosion and tends to lose explainability during optimization. Therefore, an interval belief rule base with explainability (IBRB-e) is explored in this paper. First, pre-processing with extreme gradient boosting (XGBoost) is performed to filter out features of low importance. Second, an explainability criterion is defined on the basis of the filtered features. Third, the evidential reasoning (ER) rule is chosen as the inference tool, and the projection covariance matrix adaptation evolution strategy (P-CMA-ES) algorithm with explainability constraints is chosen as the optimization algorithm. Finally, the model is validated on a breast cancer case. The experimental results show that IBRB-e maintains high accuracy while providing good explainability.
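As a rough illustration of the first step only, the sketch below uses XGBoost feature importances to filter out low-importance attributes before rule-base construction. It is not the authors' code: the scikit-learn Wisconsin breast cancer dataset, the hyperparameters, and the 0.02 importance threshold are assumptions for demonstration.

```python
# Sketch of XGBoost-based feature filtering (assumed setup, not the paper's code).
import numpy as np
from sklearn.datasets import load_breast_cancer
from xgboost import XGBClassifier

# Assumed dataset: the Wisconsin breast cancer data bundled with scikit-learn.
data = load_breast_cancer()
X, y, feature_names = data.data, data.target, data.feature_names

# Fit a small XGBoost classifier and read off its feature importances.
model = XGBClassifier(n_estimators=100, max_depth=3)
model.fit(X, y)
importances = model.feature_importances_

# Keep only features above an assumed importance threshold; the retained
# features would then serve as antecedent attributes of the rule base.
threshold = 0.02
kept = np.where(importances > threshold)[0]
print("retained features:", list(feature_names[kept]))
X_filtered = X[:, kept]
```

The threshold is a tunable design choice; the paper's actual cut-off for "lower importance" features is not stated in the abstract.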