Efficient neuroevolution through accumulation of experience: Growing networks using function preserving mutations
dc.contributor.author | Löf, Axel | |
dc.contributor.department | Chalmers tekniska högskola / Institutionen för mekanik och maritima vetenskaper | sv |
dc.contributor.department | Chalmers University of Technology / Department of Mechanics and Maritime Sciences | en |
dc.date.accessioned | 2019-07-05T11:53:15Z | |
dc.date.available | 2019-07-05T11:53:15Z | |
dc.date.issued | 2019 | |
dc.description.abstract | In deep supervised learning, the structure of the artificial neural network determines how well and how fast it can be trained. This thesis uses evolutionary algorithms to optimize the structure of artificial neural networks. Specifically, the focus of this thesis is to develop strategies for efficient neuroevolution. The neuroevolutionary method presented in this report builds structures through architectural morphisms that approximately preserve the functionality of the networks. The intended outcome of basing the mutations on the idea of function preservation was that new architectures would start out in a high-performance region of parameter space. By skipping regions of low performance, the training of previous generations can be accumulated. The proposed method was evaluated relative to a version in which the function-preserving property of the mutations was removed. In this ablated version, the parameters associated with each new structural change were randomly initialized. The two versions were benchmarked on five different regression problems. On the three most difficult problems the ablated version demonstrated better performance than the preserving version, while similar performance was observed on the two other problems. The performance difference between the two versions was attributed to the function-preserving version becoming entrapped in stationary regions more frequently than the ablated version. The parameter initializations associated with the ablated version allow backpropagation to escape these stationary regions more easily. The main contribution of this work is the conclusion that, in order to efficiently utilize function-preserving transformations to build structures in neuroevolution, there needs to be some mechanism that allows backpropagation to escape stationary regions. The method is expected to improve by perturbing the parameters of the networks in a way that increases the gradient. | |
dc.identifier.uri | https://hdl.handle.net/20.500.12380/256911 | |
dc.language.iso | eng | |
dc.relation.ispartofseries | Master's thesis - Department of Mechanics and Maritime Sciences : 2019:13 | |
dc.setspec.uppsok | Technology | |
dc.subject | Annan data- och informationsvetenskap | |
dc.subject | Transport | |
dc.subject | Other Computer and Information Science | |
dc.subject | Transport | |
dc.title | Efficient neuroevolution through accumulation of experience: Growing networks using function preserving mutations | |
dc.type.degree | Examensarbete för masterexamen | sv |
dc.type.degree | Master Thesis | en |
dc.type.uppsok | H | |
local.programme | Complex adaptive systems (MPCAS), MSc |
Download
Original bundle
- Name: 256911.pdf
- Size: 1.17 MB
- Format: Adobe Portable Document Format
- Description: Fulltext
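The abstract above describes growing networks through mutations that approximately preserve the network's function, so that a mutated architecture inherits the training of its parent instead of restarting from random parameters. A minimal sketch of one such function-preserving mutation, in the spirit of Net2Net-style layer widening, is shown below; the function name and layer shapes are illustrative assumptions, not taken from the thesis itself.

```python
import numpy as np

def widen_layer(W_in, b, W_out, rng=None):
    """Add one neuron to a hidden layer while preserving the function.

    A randomly chosen neuron is duplicated (same incoming weights and
    bias), and its outgoing weights are split in half between the
    original and the copy, so the summed contribution to the next
    layer, and hence the network output, is unchanged.

    W_in  : (n_hidden, n_inputs)  incoming weight matrix
    b     : (n_hidden,)           bias vector
    W_out : (n_next, n_hidden)    outgoing weight matrix
    """
    rng = rng or np.random.default_rng()
    j = rng.integers(W_in.shape[0])              # neuron to duplicate
    W_in_new = np.vstack([W_in, W_in[j:j + 1]])  # copy incoming weights
    b_new = np.append(b, b[j])                   # copy the bias
    # Halve the duplicated neuron's outgoing weights and give the copy
    # the other half, keeping the next layer's pre-activations equal.
    W_out_new = np.hstack([W_out, W_out[:, j:j + 1] / 2.0])
    W_out_new[:, j] /= 2.0
    return W_in_new, b_new, W_out_new
```

Because the duplicated neuron computes exactly the same activation as the original, the split is function-preserving for any activation function. Note that this exact-copy initialization also illustrates the pitfall the abstract identifies: the two halves receive identical gradients, so some symmetry-breaking perturbation is needed for backpropagation to make use of the added capacity.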