Supervised Learning with Dynamic Network Architectures

Type
Master's thesis
Published
2019
Authors
Carlström, Herman
Slottner Seholm, Filip
Abstract
Many techniques exist for training neural networks, but few are designed for lifelong learning: most models assume a training phase with a finite amount of data. This thesis investigates and evaluates a new algorithm, Lifelong Learning starting from zero (LL0), which can be used for lifelong learning on a continuous stream of data. The algorithm draws on biology, logical rules, and machine learning, and builds a dynamic artificial neural network architecture over time based on four concepts: extension, generalization, forgetting, and backpropagation. The first three have their origin in biology and can be found in animals and humans in the form of neuroplasticity. The model is evaluated and benchmarked against five other models on different datasets and problems, and the obtained results act as a proof of concept for the algorithm. Lastly, the pros and cons of the model are discussed, followed by a discussion of future work. The proposed model outperforms all other models on the chosen benchmarks, primarily in the area of one-shot learning: on multiple datasets it achieves about 90% accuracy on unseen data after training on only a small portion of the training set. LL0 also shows promising results in the area of lifelong learning.
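The thesis itself defines LL0 precisely; as a rough, hypothetical illustration of the dynamic-architecture idea only, the sketch below shows a toy classifier that grows a new unit when an incoming example is poorly covered (extension), averages the example into a nearby matching unit (generalization), and prunes units that go unused (forgetting). All class and parameter names, thresholds, and update rules here are invented for illustration and are not taken from the thesis; LL0 additionally uses backpropagation, which this sketch omits.

```python
import numpy as np


class DynamicNet:
    """Toy dynamic-architecture classifier over a data stream.

    Hypothetical sketch only: the thresholds and update rules below are
    illustrative assumptions, not the LL0 algorithm from the thesis.
    Each "unit" is a labeled prototype vector with a last-used timestamp.
    """

    def __init__(self, extend_thresh=1.0, forget_after=100):
        self.units = []                  # entries: [vector, label, last_used]
        self.extend_thresh = extend_thresh
        self.forget_after = forget_after
        self.step = 0

    def predict(self, x):
        """Label of the nearest unit, or None if the network is empty."""
        if not self.units:
            return None
        x = np.asarray(x, float)
        dists = [np.linalg.norm(x - p) for p, _, _ in self.units]
        return self.units[int(np.argmin(dists))][1]

    def partial_fit(self, x, y):
        """Consume one (example, label) pair from the stream."""
        self.step += 1
        x = np.asarray(x, float)
        # Find the closest existing unit with the same label.
        best_i, best_d = None, np.inf
        for i, (p, lbl, _) in enumerate(self.units):
            d = np.linalg.norm(x - p)
            if lbl == y and d < best_d:
                best_i, best_d = i, d
        if best_i is None or best_d > self.extend_thresh:
            # Extension: no unit explains the example, so add a new one.
            self.units.append([x, y, self.step])
        else:
            # Generalization: merge the example into the matching unit.
            p, lbl, _ = self.units[best_i]
            self.units[best_i] = [(p + x) / 2.0, lbl, self.step]
        # Forgetting: drop units that have not been used for a long time.
        self.units = [u for u in self.units
                      if self.step - u[2] < self.forget_after]
```

On a stream of examples, `partial_fit` is called once per incoming pair and the architecture changes size as it goes, which is the property that lets this family of models keep learning without a fixed, finite training phase.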
Subject/keywords
Machine Learning, Neural Networks, Dynamic Architectures, Supervised Learning, Lifelong Learning, One-Shot Learning, Transfer Learning, Computer Science