- Title
- Evolutionary Hyperparameter Optimisation for Sentence Classification
- Creator
- Rogers, Brendan; Noman, Nasimul; Chalup, Stephan; Moscato, Pablo
- Relation
- 2021 IEEE Congress on Evolutionary Computation (CEC). Proceedings of the IEEE Congress on Evolutionary Computation (IEEE CEC) (Krakow, Poland 28 June - 01 July, 2021) p. 958-965
- Publisher Link
- http://dx.doi.org/10.1109/CEC45853.2021.9504719
- Publisher
- Institute of Electrical and Electronics Engineers (IEEE)
- Resource Type
- conference paper
- Date
- 2021
- Description
- The performance that Deep Neural Networks can achieve on a specific task is significantly affected by the hyperparameters selected. In order to compare the performance of various Deep Neural Network architectures on sentence classification, an optimised set of hyperparameters needs to be found for each architecture. In this work we use a simple Genetic Algorithm to optimise the hyperparameters of three different architectures and we evaluate their performance on a suite of sentence classification benchmarks. We found that a single Genetic Algorithm is capable of optimising a variety of different architectures and that the evolved configurations can compete with those chosen by experts while using fewer trainable parameters overall. The three architectures tested are a recurrent neural network and two types of convolutional networks that differ considerably in complexity. Of the three architectures optimised for sentence classification, a simple Convolutional Neural Network was the overall best performer, consistently achieving good performance while using very few trainable parameters.
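- The general shape of Genetic Algorithm hyperparameter search described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the search space, the selection scheme (truncation), and the toy fitness function standing in for training a network are all assumptions made for the example.

```python
import random

# Hypothetical search space; the paper's actual hyperparameter ranges are not given here.
SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "filters": [32, 64, 128, 256],
    "dropout": [0.1, 0.3, 0.5],
}

def random_config():
    """Sample one candidate hyperparameter configuration."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(cfg):
    # Stand-in for training a network and measuring validation accuracy;
    # a toy score is used here so the sketch runs without a deep-learning stack.
    return -abs(cfg["learning_rate"] - 1e-3) * 100 + cfg["filters"] / 256 - cfg["dropout"]

def crossover(a, b):
    # Uniform crossover: each hyperparameter is inherited from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(cfg, rate=0.2):
    # With probability `rate`, resample a hyperparameter from its range.
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in cfg.items()}

def evolve(pop_size=10, generations=5, seed=0):
    random.seed(seed)
    pop = [random_config() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection: keep the top half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In practice the fitness evaluation would train each candidate network on the sentence classification task and return its validation score, which makes each generation expensive and motivates small populations.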
- Subject
- deep learning; recurrent neural networks; computer architecture; evolutionary computation; benchmark testing; network architecture
- Identifier
- http://hdl.handle.net/1959.13/1437596
- Identifier
- uon:40403
- Identifier
- ISBN:9781728183930
- Language
- eng
- Reviewed