PMID- 32623278
OWN - NLM
STAT- MEDLINE
DCOM- 20210623
LR  - 20210623
IS  - 1361-8423 (Electronic)
IS  - 1361-8415 (Linking)
VI  - 65
DP  - 2020 Oct
TI  - Spatio-temporal visual attention modelling of standard biometry plane-finding navigation.
PG  - 101762
LID - S1361-8415(20)30126-2 [pii]
LID - 10.1016/j.media.2020.101762 [doi]
AB  - We present a novel multi-task neural network called Temporal SonoEyeNet (TSEN) with a primary task to describe the visual navigation process of sonographers by learning to generate visual attention maps of ultrasound images around standard biometry planes of the fetal abdomen, head (trans-ventricular plane) and femur. TSEN has three components: a feature extractor, a temporal attention module (TAM), and an auxiliary video classification module (VCM). A soft dynamic time warping (sDTW) loss function is used to improve visual attention modelling. Variants of the model are trained on a dataset of 280 video clips, each containing one of the three biometry planes and lasting 3-7 seconds, with corresponding real-time recorded gaze-tracking data of an experienced sonographer. We report the performance of the different TSEN variants for visual attention prediction around standard biometry planes. The best performance is achieved using bi-directional convolutional long short-term memory (biCLSTM) in both the TAM and VCM, and this model outperforms a previous spatial model on all static and dynamic saliency metrics. As an auxiliary task to validate the clinical relevance of the visual attention modelling, the predicted visual attention maps were used to guide standard biometry plane detection in consecutive US video frames. All spatio-temporal TSEN models achieve higher scores than a spatial-only baseline; the best-performing TSEN model achieves F1 scores of 83.7%, 89.9% and 81.1% on the abdomen, head and femur planes, respectively.
CI  - Copyright © 2020. Published by Elsevier B.V.
FAU - Cai, Yifan
AU  - Cai Y
AD  - Institute of Biomedical Engineering, Department of Engineering Science, University of Oxford, Oxford, OX3 7DQ, UK. Electronic address: yifan.cai@eng.ox.ac.uk.
FAU - Droste, Richard
AU  - Droste R
AD  - Institute of Biomedical Engineering, Department of Engineering Science, University of Oxford, Oxford, OX3 7DQ, UK.
FAU - Sharma, Harshita
AU  - Sharma H
AD  - Institute of Biomedical Engineering, Department of Engineering Science, University of Oxford, Oxford, OX3 7DQ, UK.
FAU - Chatelain, Pierre
AU  - Chatelain P
AD  - Institute of Biomedical Engineering, Department of Engineering Science, University of Oxford, Oxford, OX3 7DQ, UK.
FAU - Drukker, Lior
AU  - Drukker L
AD  - Nuffield Department of Women's & Reproductive Health, University of Oxford, Oxford, OX3 9DU, UK.
FAU - Papageorghiou, Aris T
AU  - Papageorghiou AT
AD  - Nuffield Department of Women's & Reproductive Health, University of Oxford, Oxford, OX3 9DU, UK.
FAU - Noble, J Alison
AU  - Noble JA
AD  - Institute of Biomedical Engineering, Department of Engineering Science, University of Oxford, Oxford, OX3 7DQ, UK.
LA  - eng
PT  - Journal Article
PT  - Research Support, Non-U.S. Gov't
DEP - 20200620
PL  - Netherlands
TA  - Med Image Anal
JT  - Medical image analysis
JID - 9713490
SB  - IM
MH  - *Biometry
MH  - Head
MH  - Humans
MH  - *Neural Networks, Computer
MH  - Ultrasonography
OTO - NOTNLM
OT  - Fetal ultrasound
OT  - Gaze tracking
OT  - Multi-task learning
OT  - Saliency prediction
OT  - Standard plane detection
COIS- Declaration of Competing Interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
EDAT- 2020/07/06 06:00
MHDA- 2021/06/24 06:00
CRDT- 2020/07/06 06:00
PHST- 2019/10/22 00:00 [received]
PHST- 2020/06/15 00:00 [revised]
PHST- 2020/06/18 00:00 [accepted]
PHST- 2020/07/06 06:00 [pubmed]
PHST- 2021/06/24 06:00 [medline]
PHST- 2020/07/06 06:00 [entrez]
AID - S1361-8415(20)30126-2 [pii]
AID - 10.1016/j.media.2020.101762 [doi]
PST - ppublish
SO  - Med Image Anal. 2020 Oct;65:101762. doi: 10.1016/j.media.2020.101762. Epub 2020 Jun 20.