Data Specific Neural Architecture Search with Minimal Training
Abstract
Conventional neural architecture search is computationally expensive, and training-free alternatives are not yet capable of designing architectures on par with hand-crafted ones. This work conducts research in training-free neural architecture search with the aim of closing the performance gap to hand-crafted architectures. The best training-free neural architecture search methods to date compute a proxy score for an architecture's validation accuracy, based on the architecture's local linear operators, that can be evaluated before the architecture is trained. This work proposes to improve over existing methods by basing the score on the mutual dependency of the local linear operators rather than on the local linear operators directly. It further proposes to quantify this mutual dependency in terms of the parameters of the network, and a procedure is implemented to obtain the dependency of the local linear operators from the network parameters. The results of the experiments in this work indicate that the mutual dependency of the local linear operators is conclusive with regard to the network's validation accuracy even before the architecture is trained, and is therefore a promising basis for training-free neural architecture search.
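The abstract does not spell out the thesis's parameter-based dependency measure, so the sketch below instead illustrates the closely related NASWOT-style score named in the keywords (Mellor et al., "Neural Architecture Search Without Training"), in which each input to an untrained ReLU network induces a binary activation pattern identifying a local linear operator, and the score is the log-determinant of the pattern-overlap kernel across a batch. This is a minimal PyTorch sketch under those assumptions; the function name naswot_score and the hooked-ReLU implementation are illustrative choices, not the thesis's own procedure.

```python
import torch
import torch.nn as nn

def naswot_score(model: nn.Module, inputs: torch.Tensor) -> float:
    """Score an untrained ReLU network by how mutually distinct the
    local linear operators induced by a batch of inputs are
    (NASWOT-style log-determinant of the activation-pattern kernel)."""
    codes = []

    def hook(_module, _inp, out):
        # Binary code per input: which ReLU units are active.
        codes.append((out > 0).flatten(1).float())

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    model.eval()
    with torch.no_grad():
        model(inputs)
    for h in handles:
        h.remove()

    c = torch.cat(codes, dim=1)  # (batch, total ReLU units)
    # Kernel entry K[i, j]: number of units on which codes i and j agree,
    # i.e. total units minus the Hamming distance between the codes.
    k = c @ c.t() + (1.0 - c) @ (1.0 - c).t()
    _, logdet = torch.linalg.slogdet(k)
    return logdet.item()

# Illustrative usage: score a small untrained MLP on random inputs.
net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
print(naswot_score(net, torch.randn(16, 32)))
```

A higher score indicates that the batch of inputs lands in more mutually distinct local linear regions of the untrained network, which this line of work uses as a training-free proxy for post-training validation accuracy.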
Examination level
Student essay
Date
2022-12-05
Author
SIEGERT, LEON
Keywords
Neural Architecture Search
Local Linear Operators
Training-Free Neural Architecture Search
Neural Architecture Search Without Training
Language
eng