Dissertation defense (June 30, 2023): Claudio Andre da Silva Alves

Student: Claudio Andre da Silva Alves

Title: An Adaptive Hybrid Genetic Algorithm for Hyperparameter Optimization

Advisor: Pedro Henrique González Silva

Date: June 30, 2023

Abstract: The recent surge in the popularity of machine learning (ML) applications has led to an increased demand for efficient ML models. One of the key steps in building such models is selecting a well-suited set of hyperparameters. However, with the increasing complexity of models and training techniques, manually defining these parameters has become a labor-intensive task, requiring a significant amount of time and specific knowledge about the model being tuned. To address this challenge, the AutoML community has focused on ways to automatically find the best set of hyperparameters for ML algorithms through its research area of hyperparameter optimization (HPO). Recently, the Hybrid Biased Random-Key Genetic Algorithm (HBRKGA), a Genetic Algorithm (GA) that uses surrogate optimization functions at the exploitation step, has been used to automatically and efficiently find hyperparameters for different datasets. However, its potential has not been fully explored, since HBRKGA uses only one fixed surrogate function at the exploitation step. This research presents a novel approach for HPO of ML models based on HBRKGA. A method called Adaptive HBRKGA (A-HBRKGA) is devised to improve the probability of finding the best solution. It is based on the principle that different evolutionary steps require different optimization functions, which allows the algorithm to employ multiple surrogate functions chosen according to past evaluations. The approach was tested on several publicly available datasets and shown to produce better results than other methods in the literature.
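To illustrate the adaptive idea described in the abstract, the sketch below shows one possible way to let several surrogate models compete and pick the one with the lowest recent prediction error to guide an exploitation step. This is only an illustrative sketch, not the author's implementation: the surrogate choices, the leave-last-out scoring, and all names (adaptive_exploitation, X_seen, candidates) are assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.neighbors import KNeighborsRegressor

def adaptive_exploitation(X_seen, y_seen, candidates):
    """Pick the surrogate that best fits past evaluations, then use it to
    rank candidate hyperparameter vectors for the next true evaluation."""
    # Hypothetical pool of surrogate functions; the actual choices in A-HBRKGA may differ.
    surrogates = {
        "rf": RandomForestRegressor(n_estimators=50, random_state=0),
        "gp": GaussianProcessRegressor(),
        "knn": KNeighborsRegressor(n_neighbors=min(3, len(X_seen) - 1)),
    }
    # Score each surrogate by its error on the most recent evaluation,
    # having fit it on all earlier evaluations.
    errors = {}
    for name, model in surrogates.items():
        model.fit(X_seen[:-1], y_seen[:-1])
        errors[name] = abs(model.predict(X_seen[-1:])[0] - y_seen[-1])
    best = min(errors, key=errors.get)
    # Refit the chosen surrogate on everything and pick the most promising candidate.
    surrogates[best].fit(X_seen, y_seen)
    preds = surrogates[best].predict(candidates)
    return best, candidates[np.argmin(preds)]  # assuming lower fitness is better

# Illustrative usage with a toy 2-D hyperparameter space.
rng = np.random.default_rng(0)
X_seen = rng.random((10, 2))
y_seen = (X_seen ** 2).sum(axis=1)  # stand-in for the true validation loss
candidates = rng.random((100, 2))
chosen, next_point = adaptive_exploitation(X_seen, y_seen, candidates)
print(chosen, next_point)
```

In a full BRKGA-style loop, a step like this would replace the single fixed surrogate used by HBRKGA, re-selecting the surrogate each generation based on how well each one predicted the latest true evaluations.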


Dissertation