PMID- 32707825
OWN - NLM
STAT- PubMed-not-MEDLINE
DCOM- 20200807
LR  - 20240329
IS  - 1424-8220 (Electronic)
IS  - 1424-8220 (Linking)
VI  - 20
IP  - 15
DP  - 2020 Jul 22
TI  - High-Resolution U-Net: Preserving Image Details for Cultivated Land Extraction.
LID - 10.3390/s20154064 [doi]
LID - 4064
AB  - Accurate and efficient extraction of cultivated land data is of great significance for agricultural resource monitoring and national food security. Deep-learning-based classification of remote-sensing images overcomes two difficulties of traditional learning methods (e.g., support vector machine (SVM), K-nearest neighbors (KNN), and random forest (RF)) when extracting cultivated land: (1) limited performance when extracting a single land-cover type with high intra-class spectral variation, such as cultivated land with both vegetation and non-vegetation cover, and (2) limited ability to generalize from a large dataset so that the model can be applied to different locations. However, the "pooling" operation in most deep convolutional networks, which enlarges the receptive field of the kernel through downsampling, leads to significant loss of detail in the output, including edges, gradients, and image texture. To solve this problem, we propose a new end-to-end extraction algorithm, the high-resolution U-Net (HRU-Net), which preserves image details by improving the skip-connection structure and the loss function of the original U-Net. The proposed HRU-Net was tested on cultivated-land extraction from Landsat Thematic Mapper (TM) images in Xinjiang, China. The results showed that the HRU-Net achieved better performance (Acc: 92.81%; kappa: 0.81; F1-score: 0.90) than U-Net++ (Acc: 91.74%; kappa: 0.79; F1-score: 0.89), the original U-Net (Acc: 89.83%; kappa: 0.74; F1-score: 0.86), and the random forest model (Acc: 76.13%; kappa: 0.48; F1-score: 0.69). The robustness of the models to intra-class spectral variation and the accuracy of the extracted edges were also compared; the HRU-Net recovered more accurate edge details and was less affected by intra-class spectral variation. The proposed model can be further applied to other land-cover types with greater spectral diversity that require more detailed extraction.
FAU - Xu, Wenna
AU  - Xu W
AUID- ORCID: 0000-0002-2566-2233
AD  - Center for Geo-Spatial Information, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
AD  - University of Chinese Academy of Sciences, Beijing 101407, China.
FAU - Deng, Xinping
AU  - Deng X
AD  - Center for Geo-Spatial Information, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
AD  - Shenzhen Engineering Laboratory of Ocean Environmental Big Data Analysis and Application, Shenzhen 518055, China.
FAU - Guo, Shanxin
AU  - Guo S
AD  - Center for Geo-Spatial Information, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
AD  - Shenzhen Engineering Laboratory of Ocean Environmental Big Data Analysis and Application, Shenzhen 518055, China.
FAU - Chen, Jinsong
AU  - Chen J
AD  - Center for Geo-Spatial Information, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
AD  - Shenzhen Engineering Laboratory of Ocean Environmental Big Data Analysis and Application, Shenzhen 518055, China.
FAU - Sun, Luyi
AU  - Sun L
AUID- ORCID: 0000-0003-4575-0836
AD  - Center for Geo-Spatial Information, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
AD  - Shenzhen Engineering Laboratory of Ocean Environmental Big Data Analysis and Application, Shenzhen 518055, China.
FAU - Zheng, Xiaorou
AU  - Zheng X
AD  - Center for Geo-Spatial Information, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
AD  - University of Chinese Academy of Sciences, Beijing 101407, China.
FAU - Xiong, Yingfei
AU  - Xiong Y
AD  - Center for Geo-Spatial Information, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
AD  - University of Chinese Academy of Sciences, Beijing 101407, China.
FAU - Shen, Yuan
AU  - Shen Y
AD  - Center for Geo-Spatial Information, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China.
AD  - University of Chinese Academy of Sciences, Beijing 101407, China.
FAU - Wang, Xiaoqin
AU  - Wang X
AD  - Key Laboratory of Spatial Data Mining & Information Sharing of Ministry of Education, National & Local Joint Engineering Research Center of Satellite Geospatial Information Technology, Fuzhou University, Fuzhou 350000, China.
LA  - eng
GR  - 2017YFB0504203/The National Key Research and Development Program of China/
GR  - 41601212, 41801358, 41801360, 41771403/Natural science foundation of China project/
GR  - JCYJ20170818155853672/Fundamental Research Foundation of Shenzhen Technology and Innovation Council/
PT  - Journal Article
DEP - 20200722
PL  - Switzerland
TA  - Sensors (Basel)
JT  - Sensors (Basel, Switzerland)
JID - 101204366
SB  - IM
PMC - PMC7436155
OTO - NOTNLM
OT  - U-Net
OT  - cultivated land extraction
OT  - deep learning
OT  - full convolutional network
OT  - remote sensing
COIS- The authors declare no conflict of interest.
EDAT- 2020/07/28 06:00
MHDA- 2020/07/28 06:01
PMCR- 2020/08/01
CRDT- 2020/07/26 06:00
PHST- 2020/05/29 00:00 [received]
PHST- 2020/07/10 00:00 [revised]
PHST- 2020/07/17 00:00 [accepted]
PHST- 2020/07/26 06:00 [entrez]
PHST- 2020/07/28 06:00 [pubmed]
PHST- 2020/07/28 06:01 [medline]
PHST- 2020/08/01 00:00 [pmc-release]
AID - s20154064 [pii]
AID - sensors-20-04064 [pii]
AID - 10.3390/s20154064 [doi]
PST - epublish
SO  - Sensors (Basel). 2020 Jul 22;20(15):4064. doi: 10.3390/s20154064.