Optimisation of neural network with simultaneous feature selection and network pruning using evolutionary algorithm
Wong, W. K.¹; Chekima, Ali²; Wong, Kii Ing³; Law, Kah Haw⁴; Lee, Vincent⁵
Most advances in evolutionary-algorithm optimisation of neural networks concern recurrent neural networks using the NEAT method. For feed-forward networks, most optimisation is limited to weight and bias selection, generally known as conventional neuroevolution. In this research work, simultaneous feature reduction, network pruning and weight/bias selection is presented using a fitness function that penalises the selection of large feature sets. The fitness function also accounts for feature reduction and neuron reduction in the hidden layer. The results are demonstrated on two datasets, the cancer dataset and the Thyroid dataset. Backpropagation gradient-descent weight/bias optimisation performed slightly better at classifying the two datasets, with lower misclassification rates and error. However, simultaneous feature/neuron switching using a Genetic Algorithm reduced both the features and the hidden neurons: the number of features fell from 21 to 4 (Thyroid dataset) and from 9 to 3 (cancer dataset), with only one hidden neuron in the processing layer of both resulting network structures. This paper presents the chromosome representation and the fitness function design.
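A minimal sketch, assuming a plausible realisation of the approach the abstract describes (not the authors' implementation): the chromosome concatenates a binary feature mask, a binary hidden-neuron mask, and real-valued weights/biases for a single-hidden-layer feed-forward classifier, and the fitness adds penalty terms proportional to the number of active features and neurons. The dimension constants, penalty weights and names such as `feature_penalty` are illustrative.

```python
import numpy as np

N_FEATURES, N_HIDDEN = 9, 5   # illustrative sizes (e.g. 9 inputs, as in the cancer dataset)

def split_chromosome(chrom):
    """Decode a flat chromosome: [feature bits | neuron bits | weights/biases]."""
    f_mask = chrom[:N_FEATURES] > 0.5                       # feature on/off switches
    h_mask = chrom[N_FEATURES:N_FEATURES + N_HIDDEN] > 0.5  # hidden-neuron switches
    w = chrom[N_FEATURES + N_HIDDEN:]
    W1 = w[:N_FEATURES * N_HIDDEN].reshape(N_HIDDEN, N_FEATURES)
    b1 = w[N_FEATURES * N_HIDDEN:N_FEATURES * N_HIDDEN + N_HIDDEN]
    W2, b2 = w[-(N_HIDDEN + 1):-1], w[-1]
    return f_mask, h_mask, W1, b1, W2, b2

def fitness(chrom, X, y, feature_penalty=0.01, neuron_penalty=0.01):
    """Misclassification rate plus penalties on active features and neurons."""
    f_mask, h_mask, W1, b1, W2, b2 = split_chromosome(chrom)
    if not f_mask.any() or not h_mask.any():
        return np.inf                                       # reject degenerate networks
    h = np.tanh((X * f_mask) @ W1.T + b1) * h_mask          # masked (pruned) hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))                # sigmoid output unit
    error = np.mean((p > 0.5) != y)
    return error + feature_penalty * f_mask.sum() + neuron_penalty * h_mask.sum()

# Toy usage with random data and a random chromosome (a GA would minimise this).
chrom_len = N_FEATURES + N_HIDDEN + N_FEATURES * N_HIDDEN + 2 * N_HIDDEN + 1
rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, N_FEATURES)), rng.integers(0, 2, size=100)
print(fitness(rng.normal(size=chrom_len), X, y))
```

Under this formulation the GA is rewarded for switching feature or neuron bits off whenever classification accuracy does not suffer, which is consistent with the reductions (21 to 4 and 9 to 3 features, one hidden neuron) reported above.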
Affiliations:
1. Curtin University Sarawak, Malaysia
2. Curtin University Sarawak, Malaysia
3. Curtin University Sarawak, Malaysia
4. Curtin University Sarawak, Malaysia
5. Universiti Malaysia Sabah, Malaysia
Indexation:
- MyJurnal (2019): H-Index 0; Immediacy Index 0.000; Rank 0
- Scopus (SCImago Journal Rankings 2016): Impact Factor 0; SJR 0.112; Rank Q4 (Computer Networks and Communications), Q4 (Electrical and Electronic Engineering), Q4 (Hardware and Architecture)