PMID- 37247563
OWN - NLM
STAT- MEDLINE
DCOM- 20230615
LR  - 20230615
IS  - 1879-2057 (Electronic)
IS  - 0001-4575 (Linking)
VI  - 189
DP  - 2023 Sep
TI  - A railway intrusion detection method based on decomposition and semi-supervised learning for accident protection.
PG  - 107124
LID - S0001-4575(23)00171-9 [pii]
LID - 10.1016/j.aap.2023.107124 [doi]
AB  - In recent years, video surveillance has become increasingly popular in railway intrusion detection. However, it is still quite challenging to detect intruding objects efficiently and accurately because: (a) the backgrounds of video frames generated by fixed cameras are similar and only a few intrusive frames are available, resulting in a lack of diversity among video frames and leading to overfitting of the detection models during training; (b) intrusions by small targets, or by targets far from the camera, are sparse relative to the camera's wide monitoring view, which makes such targets hard to detect against a complex background; (c) the extreme imbalance between non-intrusive and intrusive frames, together with a large number of unlabeled frames, hinders effective training of the detection model and weakens its generalization capacity. To tackle these issues, this article develops an effective intrusion detection method combining low-rank and sparse decomposition (LRSD) with Semi-supervised Support Vector Domain Description (Semi-SVDD). Firstly, LRSD is used to decompose the monitored video into a background and a foreground. Then, based on a semantic segmentation method, we extract the mask of the track region from the decomposed background and use it to mask the foreground. Next, using both the labeled and unlabeled frames of the masked foreground, Semi-SVDD is established for intrusion detection. Numerical results show that removing background interference and combining labeled and unlabeled information improve the performance of the proposed method, making it superior to the benchmark methods.
CI  - Copyright (c) 2023 Elsevier Ltd. All rights reserved.
FAU - Li, Bin
AU  - Li B
AD  - School of Traffic and Transportation, Lanzhou Jiaotong University, Lanzhou 730070, PR China; CHN Energy Technology & Economics Research Institute, Beijing 102211, PR China. Electronic address: 17230022@chnenergy.com.cn.
FAU - Tan, Lei
AU  - Tan L
AD  - State Key Laboratory of Rail Traffic Control and Safety, Beijing Jiaotong University, Beijing 100044, PR China; Beijing Municipal Engineering Research Institute, Beijing 100037, PR China. Electronic address: 17111038@bjtu.edu.cn.
FAU - Wang, Feng
AU  - Wang F
AD  - State Key Laboratory of Rail Traffic Control and Safety, Beijing Jiaotong University, Beijing 100044, PR China; School of Mechanical, Electronic and Control Engineering, Beijing Jiaotong University, Beijing 100044, PR China. Electronic address: feng.wang@bjtu.edu.cn.
FAU - Liu, Linzhong
AU  - Liu L
AD  - School of Traffic and Transportation, Lanzhou Jiaotong University, Lanzhou 730070, PR China. Electronic address: liulinzhong@tsinghua.org.cn.
LA  - eng
PT  - Journal Article
DEP - 20230527
PL  - England
TA  - Accid Anal Prev
JT  - Accident; analysis and prevention
JID - 1254476
SB  - IM
MH  - Humans
MH  - *Algorithms
MH  - *Accidents, Traffic/prevention & control
MH  - Supervised Machine Learning
OTO - NOTNLM
OT  - Decomposition
OT  - Railway intrusion detection
OT  - Semi-supervised learning
OT  - Track regions
COIS- Declaration of Competing Interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
EDAT- 2023/05/30 01:06
MHDA- 2023/06/15 06:42
CRDT- 2023/05/29 18:05
PHST- 2023/03/30 00:00 [received]
PHST- 2023/05/03 00:00 [revised]
PHST- 2023/05/18 00:00 [accepted]
PHST- 2023/06/15 06:42 [medline]
PHST- 2023/05/30 01:06 [pubmed]
PHST- 2023/05/29 18:05 [entrez]
AID - S0001-4575(23)00171-9 [pii]
AID - 10.1016/j.aap.2023.107124 [doi]
PST - ppublish
SO  - Accid Anal Prev. 2023 Sep;189:107124. doi: 10.1016/j.aap.2023.107124. Epub 2023 May 27.
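Note: the LRSD step described in the abstract — splitting a stack of vectorized frames into a low-rank background plus a sparse foreground — is commonly solved via robust PCA. The record does not specify the paper's solver or parameters, so the inexact-ALM scheme, the `lam` and `mu` defaults, and the update schedule below are assumptions, not the authors' implementation; this is a minimal sketch of the general technique only.

```python
import numpy as np

def lrsd(D, lam=None, mu=None, tol=1e-6, max_iter=200):
    """Low-rank and sparse decomposition (robust PCA) via inexact ALM.

    D is a data matrix whose columns are vectorized video frames.
    Returns (L, S) with D ~= L + S, where L is the low-rank background
    and S is the sparse foreground (moving/intruding objects).
    """
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))          # standard RPCA weight
    spec = np.linalg.norm(D, 2)                 # spectral norm of D
    norm_D = np.linalg.norm(D)                  # Frobenius norm of D
    if mu is None:
        mu = 1.25 / spec
    # Dual variable initialization (Lin et al.-style scaling, assumed)
    Y = D / max(spec, np.max(np.abs(D)) / lam)
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    for _ in range(max_iter):
        # Singular-value thresholding -> low-rank background update
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        sig = np.maximum(sig - 1.0 / mu, 0.0)
        L = (U * sig) @ Vt
        # Entrywise soft-thresholding -> sparse foreground update
        T = D - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        # Dual ascent on the residual, then tighten the penalty
        R = D - L - S
        Y = Y + mu * R
        if np.linalg.norm(R) / norm_D < tol:
            break
        mu *= 1.5
    return L, S
```

In a surveillance setting, each column of `D` would be one grayscale frame flattened to a vector; the static scene ends up in `L` (which is nearly rank-one when the camera is fixed) and intruding objects end up as nonzero entries of `S`, which the paper then masks to the track region before classification.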