Title: Output Partitioning of Neural Networks
Authors: Guan, SU
Qi, Y
Tan, SK
Li, S
Keywords: Constructive learning algorithm;Neural networks;Output partitioning
Issue Date: 2005
Publisher: Elsevier
Citation: Neurocomputing 68: 38-53, Oct 2005
Abstract: Many constructive learning algorithms have been proposed to automatically find an appropriate network structure for a classification problem. However, constructive learning algorithms have drawbacks, especially when used for complex tasks, and modular approaches have been devised to address them. At the same time, parallel training of neural networks with fixed configurations has been proposed to accelerate the training process. This paper presents output partitioning, a new approach that combines the advantages of constructive learning and parallelism. Classification error guides the proposed incremental-partitioning algorithm, which divides the original dataset into several smaller sub-datasets with distinct classes. Each sub-dataset is then handled in parallel by a smaller, constructively trained sub-network that uses the whole input vector and produces a portion of the final output vector, in which each class is represented by one unit. Three classification datasets are used to test the validity of this method, and the results show that it reduces the classification test error.
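The abstract's core idea — each sub-network sees the whole input but owns a disjoint subset of classes, and the final output vector is the concatenation of the sub-networks' portions — can be sketched as follows. This is a minimal illustration, not the paper's method: the error-guided incremental-partitioning step and the constructive training of each sub-network are not implemented here; a fixed class split and untrained linear scorers (the `SubNetwork` class, `partitions` list, and `predict` function are all hypothetical names) stand in for them.

```python
import numpy as np

rng = np.random.default_rng(0)

n_features = 4
# A fixed split of 6 classes into two sub-datasets with distinct classes;
# in the paper this partition is found incrementally, guided by
# classification error.
partitions = [[0, 1, 2], [3, 4, 5]]

class SubNetwork:
    """Stand-in for a small constructively trained sub-network: a linear
    scorer over the WHOLE input vector, one output unit per owned class."""
    def __init__(self, classes, n_features):
        self.classes = classes
        self.W = rng.standard_normal((n_features, len(classes)))
        self.b = np.zeros(len(classes))

    def forward(self, x):
        # One score per class this sub-network is responsible for.
        return x @ self.W + self.b

subnets = [SubNetwork(p, n_features) for p in partitions]

def predict(x):
    # Each sub-network contributes its portion of the final output vector;
    # concatenating the portions reassembles scores for all classes, and
    # the predicted class is the unit with the largest score overall.
    scores = np.empty(sum(len(s.classes) for s in subnets))
    for s in subnets:
        scores[s.classes] = s.forward(x)
    return int(np.argmax(scores))

x = rng.standard_normal(n_features)
label = predict(x)
```

Because the sub-networks share no parameters and each handles only its own sub-dataset, they can be trained independently in parallel, which is the source of the speed-up the abstract describes.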
ISSN: 0925-2312
Appears in Collections:Electronic and Computer Engineering
Dept of Electronic and Computer Engineering Research Papers

Files in This Item:
File: NEUCOM-D-04-00363 Output Partitioning of Neural Networks final.pdf
Size: 105.04 kB
Format: Adobe PDF

Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.