Title: Reduced Pattern Training Based on Task Decomposition Using Pattern Distributor
Keywords: Cross-talk Based Combination, Full Pattern Training, Genetic Algorithm Based Combination, Pattern Distributor, Reduced Pattern Training, Task Decomposition
Citation: IEEE Transactions on Neural Networks, 18(6): 1738-1749, Nov 2007
Abstract: Task decomposition with a Pattern Distributor (PD) is a new task decomposition method for multilayered feedforward neural networks. A pattern distributor network that implements this method is proposed, together with a theoretical model for analyzing its performance. A method named Reduced Pattern Training is also introduced, aiming to improve the performance of the pattern distributor network. Our analysis and experimental results show that reduced pattern training significantly improves the performance of the pattern distributor network, and that the distributor module's classification accuracy dominates the performance of the whole network. Two combination methods, Cross-talk Based Combination and Genetic Algorithm Based Combination, are presented to find a suitable class grouping for the distributor module. Experimental results show that this new method can reduce training time and improve network generalization accuracy when compared with a conventional method such as constructive backpropagation or a task decomposition method such as Output Parallelism.
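The pattern-distributor scheme summarized in the abstract can be illustrated with a toy sketch: a distributor module first assigns an input to a class group, and a non-distributor module trained only on that group's patterns (reduced pattern training) makes the final prediction. This is an illustrative approximation only; the group partition, nearest-centroid "modules", and data below are hypothetical stand-ins for the paper's trained neural-network modules and grouping methods.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: four classes, partitioned into two groups
# handled by the distributor module (grouping is assumed, not the
# paper's Cross-talk or GA based grouping).
groups = {0: [0, 1], 1: [2, 3]}          # group id -> class labels
centers = np.array([[0., 0.], [0., 4.], [8., 0.], [8., 4.]])

# Toy training data: 50 points per class around its centre.
X = np.vstack([c + rng.normal(0, 0.5, (50, 2)) for c in centers])
y = np.repeat(np.arange(4), 50)

def fit_centroids(X, y, labels):
    """Nearest-centroid stand-in for a trained sub-network module."""
    return {c: X[y == c].mean(axis=0) for c in labels}

# Distributor module: trained on all patterns, but only to decide
# the group ({0,1} vs {2,3}), not the final class.
group_of = {c: g for g, cs in groups.items() for c in cs}
group_labels = np.vectorize(group_of.get)(y)
dist_model = fit_centroids(X, group_labels, list(groups))

# Non-distributor modules: each trained only on its own group's
# patterns -- the "reduced pattern training" idea.
modules = {}
for g, cs in groups.items():
    mask = np.isin(y, cs)
    modules[g] = fit_centroids(X[mask], y[mask], cs)

def predict(x):
    # Stage 1: distributor routes the pattern to a group.
    g = min(dist_model, key=lambda k: np.linalg.norm(x - dist_model[k]))
    # Stage 2: that group's module picks the class.
    m = modules[g]
    return min(m, key=lambda k: np.linalg.norm(x - m[k]))

print(predict(np.array([0.1, 3.9])))   # a point near class 1's centre
```

Because each non-distributor module only ever sees its own group's patterns, its training set shrinks, which is the mechanism behind the reduced training time claimed in the abstract; the distributor's routing accuracy caps overall accuracy, matching the observation that the distributor module dominates the whole network's performance.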
Appears in Collections: Electronic and Computer Engineering; Dept of Electronic and Computer Engineering Research Papers
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.