LI Qiaoqiao, ZHANG Xiumin, HOU Ming, LU Shuang, REN Yi, WANG Jiaojiao, GUO Zifei, ZHANG Li. Study on diagnosis of sacrococcygeal intraoperative acquired pressure injury using ultrasound and convolutional neural networks[J]. Journal of Clinical Medicine in Practice, 2026, 30(1): 20-26, 34. DOI: 10.7619/jcmp.20255289

Study on diagnosis of sacrococcygeal intraoperative acquired pressure injury using ultrasound and convolutional neural networks

  • Objective To construct an artificial intelligence model for the rapid and efficient diagnosis of sacrococcygeal intraoperative acquired pressure injury (IAPI) by combining ultrasound images with a convolutional neural network (CNN).
    Methods A total of 3 182 ultrasound images of the sacrococcygeal region were collected from 893 patients who underwent surgery in the supine position at People's Hospital of Xinjiang Uygur Autonomous Region from December 2024 to June 2025. Two senior ultrasound experts independently interpreted the images, and 798 IAPI images and 887 non-IAPI images with consistent interpretations were included in this study. The images of the two groups were randomly divided into training (n=1 178), testing (n=255), and validation (n=252) sets in a ratio of 7∶1.5∶1.5. Deep learning models for the automated diagnosis of IAPI ultrasound images were constructed using the ResNet50, ResNet101, DenseNet121, and DenseNet161 neural network frameworks. The accuracy, sensitivity, precision, recall, and F1 score of the models' diagnoses were evaluated. The best-performing model was selected for validation, and its applicability was assessed. Additionally, the key regions used by the best model to discriminate and diagnose IAPI on ultrasound images were analyzed.
    Results Among the four models, DenseNet161, with its dense connections, exhibited the best performance, demonstrating excellent results on the independent testing set with an overall accuracy of 0.937. The precision, recall, and F1 score on the IAPI data were 0.957, 0.909, and 0.932, respectively. Further threshold-based evaluation of overall performance showed that the area under the receiver operating characteristic curve (AUC) and the area under the precision-recall curve were 0.974 and 0.976, respectively, indicating the model's good ability to distinguish between categories and its stability in accurately identifying positive samples. An in-depth analysis of the best model was conducted using four interpretability tools (Grad-CAM, Captum, LIME, and SHAP) and multiple semantic-feature clustering analyses. The results showed that the model could accurately capture the key regions related to IAPI in ultrasound images. When the best model was used for semantic-feature clustering of the entire dataset, UMAP yielded the best separation, with a silhouette coefficient of 0.784 between positive and negative samples.
    Conclusion This study successfully established a CNN-based model combined with ultrasound images for the automated and rapid diagnosis of sacrococcygeal IAPI, demonstrating the feasibility and broad prospects of applying deep learning to IAPI diagnosis and treatment.
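As an illustration of how the evaluation metrics reported above relate to one another, here is a minimal sketch (not the authors' code) computing accuracy, precision, recall, and F1 from a binary confusion matrix. The counts are hypothetical, chosen only to be numerically consistent with the reported test-set figures.

```python
# Sketch: standard binary-classification metrics from confusion-matrix counts.
# The counts below are illustrative assumptions, not the paper's actual data.
def binary_metrics(tp, fp, fn, tn):
    """Return (accuracy, precision, recall, f1) for a binary classifier."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)  # recall on the positive class = sensitivity
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Hypothetical counts for a 255-image test set (IAPI = positive class):
acc, prec, rec, f1 = binary_metrics(tp=110, fp=5, fn=11, tn=129)
# acc ≈ 0.937, prec ≈ 0.957, rec ≈ 0.909, f1 ≈ 0.932
```

Note that F1 is the harmonic mean of precision and recall, so a single number summarizes the trade-off between over-calling and missing IAPI cases.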
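The reported ROC AUC also has a simple rank-based reading: it equals the probability that a randomly drawn positive image receives a higher model score than a randomly drawn negative one (the Mann-Whitney U interpretation). A minimal sketch with toy, illustrative scores:

```python
# Sketch: ROC AUC as the pairwise probability that a positive sample
# outscores a negative one; ties count as half. Scores are toy values.
def roc_auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

auc = roc_auc([0.9, 0.8, 0.7, 0.6], [0.75, 0.4, 0.3, 0.2])  # 14/16 = 0.875
```

An AUC of 0.974, as reported for DenseNet161, therefore means that in roughly 97 of 100 random positive-negative image pairs the model scores the IAPI image higher.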