PMID- 37480679
OWN - NLM
STAT- MEDLINE
DCOM- 20230914
LR  - 20230914
IS  - 1879-0534 (Electronic)
IS  - 0010-4825 (Linking)
VI  - 164
DP  - 2023 Sep
TI  - Improving adversarial robustness of medical imaging systems via adding global attention noise.
PG  - 107251
LID - S0010-4825(23)00716-3 [pii]
LID - 10.1016/j.compbiomed.2023.107251 [doi]
AB  - Recent studies have found that medical images are vulnerable to adversarial attacks. However, it is difficult to protect medical imaging systems from adversarial examples because the lesion features of medical images are complex and high-resolution. A simple and effective method is therefore needed to improve the robustness of medical imaging systems. We find that attackers generate adversarial perturbations tailored to the lesion characteristics of different medical image datasets, which can shift the model's attention elsewhere. In this paper, we propose global attention noise (GATN) injection, comprising global noise at the example layer and attention noise at the feature layers. Global noise enhances the lesion features of medical images, keeping examples away from the sharp regions where the model is vulnerable; attention noise further smooths the model locally against small perturbations. According to the characteristics of the datasets, we introduce global attention lesion-unrelated noise (GATN-UR) for datasets with unclear lesion boundaries and global attention lesion-related noise (GATN-R) for datasets with clear lesion boundaries. Extensive experiments on ChestX-ray, Dermatology, and Fundoscopy datasets show that GATN improves the robustness of medical diagnosis models against a variety of powerful attacks and significantly outperforms existing adversarial defense methods. Specifically, robust accuracy under the PGD attack is 86.66% on ChestX-ray, 72.49% on Dermatology, and 90.17% on Fundoscopy; under the AA attack, it achieves robust accuracy of 87.70% on ChestX-ray, 66.85% on Dermatology, and 87.83% on Fundoscopy.
CI  - Copyright (c) 2023 Elsevier Ltd. All rights reserved.
FAU - Dai, Yinyao
AU  - Dai Y
AD  - Zhejiang University of Science and Technology, Hangzhou 310023, China.
FAU - Qian, Yaguan
AU  - Qian Y
AD  - Zhejiang University of Science and Technology, Hangzhou 310023, China. Electronic address: qianyaguan@zust.edu.cn.
FAU - Lu, Fang
AU  - Lu F
AD  - Zhejiang University of Science and Technology, Hangzhou 310023, China.
FAU - Wang, Bin
AU  - Wang B
AD  - Zhejiang Key Laboratory of Multidimensional Perception Technology, Application, and Cybersecurity, Hangzhou 310052, China. Electronic address: wbin2006@gmail.com.
FAU - Gu, Zhaoquan
AU  - Gu Z
AD  - School of Computer Science and Technology, Harbin Institute of Technology (Shenzhen), Shenzhen 518071, China.
FAU - Wang, Wei
AU  - Wang W
AD  - Beijing Key Laboratory of Security and Privacy in Intelligent Transportation, Beijing Jiaotong University, Beijing 100091, China.
FAU - Wan, Jian
AU  - Wan J
AD  - Zhejiang University of Science and Technology, Hangzhou 310023, China.
FAU - Zhang, Yanchun
AU  - Zhang Y
AD  - Victoria University, Melbourne, VIC 8001, Australia.
LA  - eng
PT  - Journal Article
PT  - Research Support, Non-U.S. Gov't
DEP - 20230711
PL  - United States
TA  - Comput Biol Med
JT  - Computers in biology and medicine
JID - 1250250
SB  - IM
MH  - *Diagnostic Imaging
MH  - *Computer Security
OTO - NOTNLM
OT  - Adversarial attack
OT  - Medical image
OT  - Model robustness
OT  - Noise injection
COIS- Declaration of competing interest: None declared.
EDAT- 2023/07/23 01:11
MHDA- 2023/09/11 06:42
CRDT- 2023/07/22 18:03
PHST- 2023/04/17 00:00 [received]
PHST- 2023/06/14 00:00 [revised]
PHST- 2023/07/07 00:00 [accepted]
PHST- 2023/09/11 06:42 [medline]
PHST- 2023/07/23 01:11 [pubmed]
PHST- 2023/07/22 18:03 [entrez]
AID - S0010-4825(23)00716-3 [pii]
AID - 10.1016/j.compbiomed.2023.107251 [doi]
PST - ppublish
SO  - Comput Biol Med. 2023 Sep;164:107251. doi: 10.1016/j.compbiomed.2023.107251. Epub 2023 Jul 11.