PMID- 36740003
OWN - NLM
STAT- MEDLINE
DCOM- 20230307
LR  - 20230314
IS  - 1096-0309 (Electronic)
IS  - 0003-2697 (Linking)
VI  - 666
DP  - 2023 Apr 1
TI  - DapNet-HLA: Adaptive dual-attention mechanism network based on deep learning to predict non-classical HLA binding sites.
PG  - 115075
LID - S0003-2697(23)00040-4 [pii]
LID - 10.1016/j.ab.2023.115075 [doi]
AB  - Human leukocyte antigen (HLA) plays a vital role in immunomodulatory function. Studies have shown that immunotherapy based on non-classical HLA has essential applications in cancer, COVID-19, and allergic diseases. However, few deep learning methods exist to predict non-classical HLA alleles. In this work, an adaptive dual-attention network named DapNet-HLA is established based on existing datasets. First, amino acid sequences are transformed into digital vectors via a lookup table. To overcome the feature sparsity caused by one-hot encoding, a fused word embedding method maps each amino acid to a low-dimensional word vector that is optimized jointly with the training of the classifier. The classifier is then constructed from a GCB (group convolution block), SENet attention (squeeze-and-excitation network), a BiLSTM (bidirectional long short-term memory network), and a Bahdanau attention mechanism. SENet assigns higher weights to effective feature maps, so the model can be trained to achieve better results. The Bahdanau attention mechanism, drawn from encoder-decoder models, improves the effectiveness of RNN, LSTM, or GRU (gated recurrent unit) networks. Ablation experiments show that DapNet-HLA has the best adaptability across the five datasets. On the five test datasets, the ACC and MCC of DapNet-HLA are 4.89% and 0.0933 higher, respectively, than those of the comparison method. According to the ROC and PR curves verified by 5-fold cross-validation, the AUC value of each fold fluctuates only slightly, which demonstrates the robustness of DapNet-HLA. The codes and datasets are accessible at https://github.com/JYY625/DapNet-HLA.
CI  - Copyright (c) 2023 Elsevier Inc. All rights reserved.
FAU - Jing, Yuanyuan
AU  - Jing Y
AD  - School of Mathematics and Statistics, Xidian University, Xi'an, 710071, PR China.
FAU - Zhang, Shengli
AU  - Zhang S
AD  - School of Mathematics and Statistics, Xidian University, Xi'an, 710071, PR China. Electronic address: shengli0201@163.com.
FAU - Wang, Houqiang
AU  - Wang H
AD  - School of Mathematics and Statistics, Xidian University, Xi'an, 710071, PR China.
LA  - eng
PT  - Journal Article
PT  - Research Support, Non-U.S. Gov't
DEP - 20230203
PL  - United States
TA  - Anal Biochem
JT  - Analytical biochemistry
JID - 0370535
RN  - 0 (Histocompatibility Antigens Class I)
RN  - 0 (HLA Antigens)
SB  - IM
MH  - Humans
MH  - *Deep Learning
MH  - *COVID-19
MH  - Histocompatibility Antigens Class I/metabolism
MH  - HLA Antigens
MH  - Binding Sites
OTO - NOTNLM
OT  - Bahdanau attention mechanism
OT  - Non-classical HLA binding sites
OT  - SENet attention mechanism
OT  - Word embedding
COIS- Declaration of competing interest: The authors declare that they have no conflict of interest.
EDAT- 2023/02/06 06:00
MHDA- 2023/03/08 06:00
CRDT- 2023/02/05 19:25
PHST- 2022/11/14 00:00 [received]
PHST- 2023/01/30 00:00 [revised]
PHST- 2023/02/02 00:00 [accepted]
PHST- 2023/02/06 06:00 [pubmed]
PHST- 2023/03/08 06:00 [medline]
PHST- 2023/02/05 19:25 [entrez]
AID - S0003-2697(23)00040-4 [pii]
AID - 10.1016/j.ab.2023.115075 [doi]
PST - ppublish
SO  - Anal Biochem. 2023 Apr 1;666:115075. doi: 10.1016/j.ab.2023.115075. Epub 2023 Feb 3.
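The abstract notes that SENet "assigns higher weights to effective feature maps" via squeeze-and-excitation. The following is a minimal NumPy sketch of that channel-gating idea only; the shapes, reduction ratio, and random weights are illustrative assumptions and are not taken from the DapNet-HLA implementation (which is available at the GitHub link above).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def se_block(x, w1, w2):
    """Squeeze-and-excitation gating over channels.

    x  : (C, L) feature maps (C channels, sequence length L)
    w1 : (C//r, C) reduction weights; w2 : (C, C//r) expansion weights
    """
    s = x.mean(axis=1)                          # squeeze: global average pool -> (C,)
    e = sigmoid(w2 @ np.maximum(w1 @ s, 0.0))   # excitation: per-channel gate in (0, 1)
    return x * e[:, None]                       # re-weight each channel's feature map

# Toy example: 4 channels, sequence length 8, reduction ratio r = 2
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w1 = rng.standard_normal((2, 4)) * 0.1
w2 = rng.standard_normal((4, 2)) * 0.1
y = se_block(x, w1, w2)   # same shape as x, channels scaled by learned gates
```

Because each gate lies strictly in (0, 1), the block rescales whole channels rather than individual positions, which is what lets training emphasize informative feature maps.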