23rd EANN / EAAAI 2022, 17–20 June 2022, Greece

Evaluating Acceleration Techniques for Genetic Neural Architecture Search

Foteini Dervisi, George Kyriakides, Konstantinos Margaritis

Abstract:

  The increase in available data and computational power has driven the rapid evolution of deep learning over the last few years. However, the success of deep learning methods relies on appropriate neural architecture choices, which are not straightforward to make and usually require a time-consuming trial-and-error procedure. Neural architecture search automates the design of neural network architectures capable of performing well on specific tasks. The field emerged to address the problem of designing efficient neural architectures and is gaining popularity as the rapid progress of deep learning increases the need for discovering high-performing architectures. This paper focuses on evolutionary neural architecture search, an effective but time-consuming and computationally expensive approach, and aims to pave the way for speeding up such algorithms by assessing the effect of acceleration methods both on the overall performance of the search procedure and on the architectures it produces.
