Efficient neuroevolution through accumulation of experience: Growing networks using function preserving mutations

Examensarbete för masterexamen (Master's thesis)

Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.12380/256911
Download file(s): 256911.pdf (Fulltext, 1.2 MB, Adobe PDF)
Type: Examensarbete för masterexamen (Master's thesis)
Title: Efficient neuroevolution through accumulation of experience: Growing networks using function preserving mutations
Authors: Löf, Axel
Abstract: In deep supervised learning, the structure of the artificial neural network determines how well and how fast it can be trained. This thesis uses evolutionary algorithms to optimize the structure of artificial neural networks, with a focus on developing strategies for efficient neuroevolution. The neuroevolutionary method presented in this report builds structures through architectural morphisms that approximately preserve the functionality of the networks. The intended outcome of basing the mutations on the idea of function preservation was that new architectures would start out in a high-performance region of parameter space. By skipping regions of low performance, the training effort of previous generations can be accumulated. The proposed method was evaluated against an ablated version in which the function-preserving property of the mutations was removed: the parameters associated with each new structural change were instead randomly initialized. The two versions were benchmarked on five different regression problems. On the three most difficult problems the ablated version outperformed the function-preserving version, while the two versions performed similarly on the remaining two problems. The performance difference was attributed to the function-preserving version becoming entrapped in stationary regions more frequently than the ablated version; the random parameter initializations of the ablated version allow backpropagation to escape these stationary regions more easily. The main contribution of this work is the conclusion that, in order to efficiently utilize function-preserving transformations to build structures in neuroevolution, there needs to be some mechanism that allows backpropagation to escape stationary regions. The method is expected to improve by perturbing the parameters of the networks in a way that increases the gradient.
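The general idea of a function-preserving structural mutation can be illustrated with a Net2WiderNet-style layer widening: a hidden unit is duplicated and its outgoing weights are halved, so the grown network computes exactly the same function as its parent. This is a minimal sketch of that general technique, not the specific morphisms used in the thesis; the network shape, the `widen_preserving` helper, and all parameter names are illustrative assumptions.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    # One-hidden-layer MLP with ReLU activation.
    h = np.maximum(0.0, x @ W1 + b1)
    return h @ W2 + b2

def widen_preserving(W1, b1, W2, rng):
    # Function-preserving widening: duplicate one hidden unit and
    # halve the outgoing weights of both copies, so their summed
    # contribution equals that of the original unit.
    i = rng.integers(W1.shape[1])
    W1_new = np.hstack([W1, W1[:, [i]]])      # copy incoming weights
    b1_new = np.append(b1, b1[i])             # copy the bias
    W2_new = np.vstack([W2, W2[[i], :]])      # copy outgoing weights
    W2_new[i, :] /= 2.0                       # split the contribution
    W2_new[-1, :] /= 2.0                      # between the two copies
    return W1_new, b1_new, W2_new

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
W1 = rng.normal(size=(3, 5)); b1 = rng.normal(size=5)
W2 = rng.normal(size=(5, 2)); b2 = rng.normal(size=2)

y_before = forward(x, W1, b1, W2, b2)
W1w, b1w, W2w = widen_preserving(W1, b1, W2, rng)
y_after = forward(x, W1w, b1w, W2w, b2)
assert np.allclose(y_before, y_after)  # output unchanged by the mutation
```

The ablated variant described in the abstract would instead initialize the duplicated column of `W1_new` and the new row of `W2_new` randomly, which changes the network's output but, per the thesis's findings, can help backpropagation escape stationary regions.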
Keywords: Annan data- och informationsvetenskap (Other Computer and Information Science); Transport
Issue Date: 2019
Publisher: Chalmers University of Technology / Department of Mechanics and Maritime Sciences (Chalmers tekniska högskola / Institutionen för mekanik och maritima vetenskaper)
Series/Report no.: Master's thesis - Department of Mechanics and Maritime Sciences : 2019:13
URI: https://hdl.handle.net/20.500.12380/256911
Collection: Examensarbeten för masterexamen (Master Theses)


