Jul 7, 2019
Introduction

More often than not, the difference between a crappy and a powerful implementation of a Machine Learning (ML) algorithm is the choice of its hyperparameters. Hyperparameter tuning was long considered an artistic skill that ML practitioners acquired with experience. Over the past few years, however, several advancements have made it possible to tune hyperparameters in a more informed manner: techniques such as Bayesian Optimization, Neural Architecture Search, and Probabilistic Matrix Factorization have been applied to the problem with great success.
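To make the idea concrete, here is a minimal, hypothetical sketch of the simplest informed approach, a random search over a hyperparameter space. The objective function, parameter names, and ranges below are all invented for illustration; in practice the score would come from training a real model and evaluating it on held-out data:

```python
import random

def validation_score(learning_rate, num_trees):
    # Toy stand-in for a model's validation score; a real version would
    # train a model with these hyperparameters and score it on a
    # validation set. The optimum here is near lr=0.1, num_trees=200.
    return -(learning_rate - 0.1) ** 2 - ((num_trees - 200) / 1000) ** 2

random.seed(0)
best_score, best_params = float("-inf"), None
for _ in range(100):
    # Sample each hyperparameter from a plausible range; learning rates
    # are conventionally sampled log-uniformly.
    params = {
        "learning_rate": 10 ** random.uniform(-3, 0),
        "num_trees": random.randint(50, 500),
    }
    score = validation_score(**params)
    if score > best_score:
        best_score, best_params = score, params

print(best_params)
```

Methods like Bayesian Optimization improve on this by fitting a probabilistic model to the scores observed so far and using it to pick the next configuration to try, rather than sampling blindly.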