Please use this identifier to cite or link to this item:
Title: Reduced pattern training based on task decomposition using pattern distributor
Authors: Guan, SU
Bao, C
Neo, T
Keywords: Cross-talk based combination;Full pattern training;Genetic algorithm based combination;Pattern distributor;Reduced pattern training;Task decomposition
Issue Date: 2007
Publisher: IEEE
Citation: IEEE Transactions on Neural Networks. In press
Abstract: Task Decomposition with Pattern Distributor (PD) is a new task decomposition method for multilayered feedforward neural networks. We propose a pattern distributor network that implements this method, together with a theoretical model for analyzing its performance. We also introduce a method named Reduced Pattern Training, which aims to improve the performance of pattern distribution. Our analysis and experimental results show that reduced pattern training significantly improves the performance of the pattern distributor network, and that the distributor module’s classification accuracy dominates the performance of the whole network. Two combination methods, namely Cross-talk based combination and Genetic Algorithm based combination, are presented to find a suitable grouping for the distributor module. Experimental results show that this new method can reduce training time and improve network generalization accuracy compared with a conventional method such as constructive backpropagation or a task decomposition method such as Output Parallelism.
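The two-stage architecture described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): a distributor module first assigns an input pattern to a class group, and a group-specific module then resolves the final class within that group. The functions `distributor`, `module_a`, and `module_b` below are hypothetical stand-ins for trained subnetworks, used only to make the control flow concrete.

```python
# Hypothetical sketch of a pattern-distributor pipeline. Trained subnetworks
# are replaced with simple rules so the routing logic is runnable.

def distributor(x):
    # Stand-in for the distributor subnetwork: coarse routing to a class group.
    return 0 if x[0] < 0.5 else 1

def module_a(x):
    # Stand-in module for group 0, which covers classes {0, 1}.
    return 0 if x[1] < 0.5 else 1

def module_b(x):
    # Stand-in module for group 1, which covers classes {2, 3}.
    return 2 if x[1] < 0.5 else 3

MODULES = {0: module_a, 1: module_b}

def classify(x):
    # Two-stage decision: pick the group, then the class within that group.
    return MODULES[distributor(x)](x)
```

The sketch also shows why, as the abstract notes, the distributor module's accuracy dominates overall performance: if the distributor routes a pattern to the wrong group, no group module can recover the correct class.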
Appears in Collections: Electronic and Computer Engineering
Dept of Electronic and Computer Engineering Research Papers

Files in This Item:
File: Reduced Pattern Training Based on Task Decomposition using Pattern Distributor.pdf
Size: 221.51 kB
Format: Adobe PDF

Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.