
Fig. 5

From: Predicting postoperative pulmonary infection in elderly patients undergoing major surgery: a study based on logistic regression and machine learning models

Global feature importance of the RF model using four interpretability methods, ranked in descending order. (A) Ranking indicated by the feature importance plot, which displays the mean importance of each feature based on the reduction in impurity (information gain) within the forest. (B) Ranking indicated by the permutation importance plot, which shows the mean decrease in model performance when each feature is randomly permuted. (C) Ranking indicated by the SHAP summary plot; the mean SHAP value of each feature represents its contribution to the model's output. (D) Ranking indicated by the LIME plot; the mean LIME value of each feature is obtained by approximating the model's behavior locally with an interpretable surrogate model. RF, random forest; SHAP, Shapley additive explanations; LIME, local interpretable model-agnostic explanations
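For readers who want to reproduce rankings like those in panels A–D, the sketch below shows one way to compute all four global importance measures for a fitted random forest with scikit-learn, shap, and lime. This is a minimal illustration on synthetic data, not the authors' code: the dataset, feature names, hyperparameters, and the number of LIME-explained instances are all placeholder assumptions.

```python
import numpy as np
import shap
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the study's clinical dataset.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
feature_names = [f"feature_{i}" for i in range(X.shape[1])]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# (A) Impurity-based feature importance, built into the fitted forest.
impurity_imp = rf.feature_importances_

# (B) Permutation importance: mean drop in score when a feature is shuffled.
perm_imp = permutation_importance(
    rf, X_te, y_te, n_repeats=20, random_state=0
).importances_mean

# (C) Global SHAP importance: mean absolute SHAP value per feature.
sv = shap.TreeExplainer(rf).shap_values(X_te)
# Older shap versions return a list per class; newer ones a 3-D array.
sv = sv[1] if isinstance(sv, list) else sv[..., 1]
shap_imp = np.abs(sv).mean(axis=0)

# (D) Global LIME importance: average absolute local surrogate weights
# over a sample of test instances (50 here, an arbitrary choice).
lime_explainer = LimeTabularExplainer(
    X_tr, feature_names=feature_names, mode="classification"
)
weights = np.zeros(X.shape[1])
n_explain = 50
for x in X_te[:n_explain]:
    exp = lime_explainer.explain_instance(
        x, rf.predict_proba, num_features=X.shape[1]
    )
    for idx, w in exp.as_map()[1]:
        weights[idx] += abs(w)
lime_imp = weights / n_explain

# Rank features in descending order for each method, as in the figure.
for name, imp in [("A: impurity", impurity_imp), ("B: permutation", perm_imp),
                  ("C: SHAP", shap_imp), ("D: LIME", lime_imp)]:
    order = np.argsort(imp)[::-1]
    print(name, [feature_names[i] for i in order[:3]])
```

Note that the four methods measure different quantities (impurity reduction, performance drop, attribution magnitude, surrogate weight), so their rankings need not agree exactly, which is why the figure presents all four side by side.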
