Traditional hyperparameter tuning techniques such as Random Search and Grid Search explore the full space of available hyperparameter values without learning from past results. With a large parameter space, tuning with these techniques becomes time-consuming, because the search space grows multiplicatively with the number of parameters tuned.
Hyperparameter tuning with Bayesian Optimization can reduce the time it takes to find an optimal hyperparameter set, because it uses what it has learned from previously evaluated hyperparameters to choose the next set to try.
Hyperopt is a Python library that lets you tune hyperparameters using Bayesian Optimization.
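As a quick illustration of the API, here is a minimal sketch of Hyperopt's `fmin` finding the minimum of a simple polynomial. The specific function, search range, and evaluation budget are assumptions chosen for the example, not part of the article's later setup.

```python
from hyperopt import fmin, tpe, hp

# Objective: a simple polynomial whose minimum (at x = 3) Hyperopt should find.
def objective(x):
    return (x - 3) ** 2 + 2

# Search space: x drawn uniformly from [-10, 10] (an illustrative range).
space = hp.uniform('x', -10, 10)

# TPE (Tree-structured Parzen Estimator) proposes the next x to try
# based on the results of previous evaluations.
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=100)
print(best)  # e.g. {'x': 2.99...}, close to the true minimum at x = 3
```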
Key Takeaways for the Audience
- Understanding how Bayesian optimization works for hyperparameter tuning
- Applying Bayesian optimization with Hyperopt
- Applying Hyperopt to a basic polynomial function to find its minimum
- Tuning LightGBM hyperparameters using Hyperopt (see the sketch after this list)
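As a preview of the LightGBM tuning covered later, the following is a hedged sketch of how a Hyperopt search space and objective might be wired up around an `LGBMClassifier`. The dataset (scikit-learn's breast-cancer data), the parameter ranges, and the 3-fold ROC-AUC scoring are assumptions chosen for illustration, not the article's final configuration.

```python
import numpy as np
import lightgbm as lgb
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score

# Illustrative dataset; replace with your own features and target.
X, y = load_breast_cancer(return_X_y=True)

def objective(params):
    # hp.quniform samples floats, so cast integer-valued parameters back to int.
    params['num_leaves'] = int(params['num_leaves'])
    params['min_child_samples'] = int(params['min_child_samples'])
    model = lgb.LGBMClassifier(n_estimators=100, **params)
    auc = cross_val_score(model, X, y, cv=3, scoring='roc_auc').mean()
    # fmin minimises the loss, so return the negative AUC.
    return {'loss': -auc, 'status': STATUS_OK}

# Illustrative search space; the ranges are assumptions, not recommendations.
space = {
    'num_leaves': hp.quniform('num_leaves', 20, 150, 1),
    'min_child_samples': hp.quniform('min_child_samples', 5, 100, 1),
    'learning_rate': hp.loguniform('learning_rate', np.log(0.01), np.log(0.3)),
    'subsample': hp.uniform('subsample', 0.5, 1.0),
}

trials = Trials()  # records every evaluated set, so TPE can learn from past trials
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=30, trials=trials)
print(best)  # best hyperparameters found within the 30-evaluation budget
```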