PMID- 18244382
OWN - NLM
STAT- PubMed-not-MEDLINE
DCOM- 20100628
LR  - 20161020
IS  - 1045-9227 (Print)
IS  - 1045-9227 (Linking)
VI  - 12
IP  - 2
DP  - 2001
TI  - Divide-and-conquer learning and modular perceptron networks.
PG  - 250-63
LID - 10.1109/72.914522 [doi]
AB  - A novel modular perceptron network (MPN) and divide-and-conquer learning
      (DCL) schemes for the design of modular neural networks are proposed. When
      the training process of a multilayer perceptron falls into a local minimum
      or stalls in a flat region, the proposed DCL scheme is applied to divide
      the current training data region into two regions that are easier to
      learn. The learning process continues after a self-growing perceptron
      network and its initial weight estimation are constructed for one of the
      newly partitioned regions; the other partitioned region resumes training
      on the original perceptron network. Data region partitioning, weight
      estimation, and learning are repeated iteratively until all the training
      data are completely learned by the MPN. We evaluated and compared the
      proposed MPN with several representative neural networks on the
      two-spirals problem and a real-world dataset. The MPN achieved better
      weight learning performance, requiring far fewer data presentations
      during the network training phase, better generalization performance,
      and less processing time during the retrieval phase.
FAU - Fu, H C
AU  - Fu HC
AD  - Department of Computer Science and Information Engineering, National
      Chiao Tung University, Hsinchu, Taiwan 300, R.O.C.
FAU - Lee, Y P
AU  - Lee YP
FAU - Chiang, C C
AU  - Chiang CC
FAU - Pao, H T
AU  - Pao HT
LA  - eng
PT  - Journal Article
PL  - United States
TA  - IEEE Trans Neural Netw
JT  - IEEE transactions on neural networks
JID - 101211035
EDAT- 2008/02/05 09:00
MHDA- 2008/02/05 09:01
CRDT- 2008/02/05 09:00
PHST- 2008/02/05 09:00 [pubmed]
PHST- 2008/02/05 09:01 [medline]
PHST- 2008/02/05 09:00 [entrez]
AID - 10.1109/72.914522 [doi]
PST - ppublish
SO  - IEEE Trans Neural Netw. 2001;12(2):250-63. doi: 10.1109/72.914522.