PMID- 37178624
OWN - NLM
STAT- MEDLINE
DCOM- 20230529
LR  - 20230529
IS  - 1724-191X (Electronic)
IS  - 1120-1797 (Linking)
VI  - 110
DP  - 2023 Jun
TI  - Robust and efficient abdominal CT segmentation using shape constrained multi-scale attention network.
PG  - 102595
LID - S1120-1797(23)00072-8 [pii]
LID - 10.1016/j.ejmp.2023.102595 [doi]
AB  - PURPOSE: Although many deep learning-based abdominal multi-organ segmentation networks have been proposed, the varied intensity distributions and organ shapes of CT images acquired at multiple centers, in multiple contrast phases, and from patients with various diseases introduce new challenges for robust abdominal CT segmentation. To achieve robust and efficient abdominal multi-organ segmentation, a new two-stage method is presented in this study. METHODS: A binary segmentation network is used for coarse localization, followed by a multi-scale attention network for the fine segmentation of the liver, kidney, spleen, and pancreas. To constrain the organ shapes produced by the fine segmentation network, an additional network is pre-trained to learn the shape features of organs with severe diseases and is then employed to constrain the training of the fine segmentation network. RESULTS: The performance of the presented segmentation method was extensively evaluated on the multi-center dataset from the Fast and Low GPU Memory Abdominal oRgan sEgmentation (FLARE) challenge, held in conjunction with the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) 2021. The Dice Similarity Coefficient (DSC) and Normalized Surface Dice (NSD) were calculated to quantitatively evaluate segmentation accuracy and efficiency. An average DSC of 83.7% and an average NSD of 64.4% were achieved, and our method won second place among more than 90 participating teams.
      CONCLUSIONS: The evaluation results on this public challenge demonstrate that our method achieves promising robustness and efficiency, which may promote the clinical application of automatic abdominal multi-organ segmentation.
CI  - Copyright (c) 2023. Published by Elsevier Ltd.
FAU - Tong, Nuo
AU  - Tong N
AD  - AI-based Big Medical Imaging Data Frontier Research Center, Academy of Advanced Interdisciplinary Research, Xidian University, Xi'an, Shaanxi 710071, China.
FAU - Xu, Yinan
AU  - Xu Y
AD  - Key Lab of Intelligent Perception and Image Understanding of Ministry of Education, Xidian University, Xi'an, Shaanxi 710071, China.
FAU - Zhang, Jinsong
AU  - Zhang J
AD  - Xijing Hospital of Air Force Military Medical University, Xi'an, Shaanxi 710032, China.
FAU - Gou, Shuiping
AU  - Gou S
AD  - AI-based Big Medical Imaging Data Frontier Research Center, Academy of Advanced Interdisciplinary Research, Xidian University, Xi'an, Shaanxi 710071, China; Key Lab of Intelligent Perception and Image Understanding of Ministry of Education, Xidian University, Xi'an, Shaanxi 710071, China. Electronic address: shpgou@mail.xidian.edu.cn.
FAU - Li, Mengbin
AU  - Li M
AD  - Xijing Hospital of Air Force Military Medical University, Xi'an, Shaanxi 710032, China. Electronic address: limbin@fmmu.edu.cn.
LA  - eng
PT  - Journal Article
DEP - 20230511
PL  - Italy
TA  - Phys Med
JT  - Physica medica : PM : an international journal devoted to the applications of physics to medicine and biology : official journal of the Italian Association of Biomedical Physics (AIFB)
JID - 9302888
SB  - IM
MH  - *Neural Networks, Computer
MH  - *Algorithms
MH  - Tomography, X-Ray Computed/methods
MH  - Abdomen/diagnostic imaging
MH  - Spleen/diagnostic imaging
MH  - Image Processing, Computer-Assisted/methods
OTO - NOTNLM
OT  - Abdominal CT
OT  - Channel-wise attention
OT  - Multi-scale feature
OT  - Robust multi-organ segmentation
OT  - Shape constraints
COIS- Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
EDAT- 2023/05/14 01:07
MHDA- 2023/05/29 06:41
CRDT- 2023/05/13 18:07
PHST- 2022/09/22 00:00 [received]
PHST- 2023/03/02 00:00 [revised]
PHST- 2023/04/17 00:00 [accepted]
PHST- 2023/05/29 06:41 [medline]
PHST- 2023/05/14 01:07 [pubmed]
PHST- 2023/05/13 18:07 [entrez]
AID - S1120-1797(23)00072-8 [pii]
AID - 10.1016/j.ejmp.2023.102595 [doi]
PST - ppublish
SO  - Phys Med. 2023 Jun;110:102595. doi: 10.1016/j.ejmp.2023.102595. Epub 2023 May 11.