Hyperparameter optimization with Python
Find out what you will learn throughout the course.
What you'll learn
👉 Hyperparameter tuning and why it matters.
👉 Grid and random search for hyperparameters.
👉 Bayesian optimization.
👉 Tree-structured Parzen estimators, population based training, SMAC.
👉 Different cross-validation strategies.
👉 Hyperparameter optimization tools, e.g., Hyperopt, Optuna, Scikit-optimize, Keras Tuner and more.
What you'll get
✔ 10+ hours of video lectures.
✔ Presentations, quizzes and assignments.
✔ Jupyter notebooks with code.
✔ Instructor support through Q&A.
✔ Access on PC and mobile.
✔ Lifetime access to content.
✔ Certificate of completion.
What they say...
☻ More than 4.5k students enrolled.
☻ More than 300 student reviews.
☻ Average course rating: 4.7 out of 5.
Soledad Galli, PhD
Sole is a lead data scientist, instructor, and developer of open-source software. She created and maintains the Python library for feature engineering, Feature-engine, which allows us to impute data, encode categorical variables, and transform, create, and select features. Sole is also the author of the book "Python Feature Engineering Cookbook", published by Packt.
30 days money back guarantee
If you're disappointed for whatever reason, you'll get a full refund.
So you can buy with confidence.
Welcome to Hyperparameter Optimization for Machine Learning, the most comprehensive course on hyperparameter tuning available online. In this course, you will learn multiple techniques to select the best hyperparameters and improve the performance of your machine learning models.
What are hyperparameters?
Hyperparameters are parameters that are not directly learned by the machine learning algorithm. They define and control the machine learning model, for example, how flexible the model is to fit the training data, and they are calibrated to avoid over-fitting and improve generalization.
Some examples of hyperparameters are the regularization constants in linear models and support vector machines (SVMs), the maximum depth in decision trees, the number of estimators in ensembles like random forests, and the number of nodes or the learning rate in deep neural networks.
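In scikit-learn, for instance, hyperparameters like these are set when the model is instantiated, before training. A minimal sketch (the values are illustrative, not recommendations):

```python
# Hyperparameters are passed to the model constructor, not learned from data.
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

log_reg = LogisticRegression(C=0.1)         # C: inverse regularization strength
svm = SVC(C=1.0, kernel="rbf")              # C: regularization constant
tree = DecisionTreeClassifier(max_depth=3)  # max_depth: caps the tree's flexibility

# get_params() exposes each model's hyperparameters and their current values
print(tree.get_params()["max_depth"])  # 3
```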
What is hyperparameter optimization?
Hyperparameter tuning or hyperparameter optimization is the process of finding the best hyperparameter values for a given machine learning algorithm and a given dataset.
There are various hyperparameter optimization methods, including grid search, random search, and sequential model-based methods such as Bayesian optimization.
Hyperparameter optimization algorithms consist of a search space, a search algorithm, a cross-validation scheme to find the optimal values while avoiding over-fitting, and an objective function that combines the classifier or regression model with the metric to optimize.
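These components can be seen side by side in a minimal grid search with scikit-learn's `GridSearchCV`, sketched here on a toy dataset (the parameter grid and metric are illustrative):

```python
# The four components of a hyperparameter search, annotated.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, KFold

X, y = make_classification(n_samples=200, random_state=0)

search = GridSearchCV(
    estimator=LogisticRegression(max_iter=1000),         # the model to tune
    param_grid={"C": [0.01, 0.1, 1, 10]},                # the search space
    scoring="accuracy",                                  # the metric to optimize
    cv=KFold(n_splits=5, shuffle=True, random_state=0),  # the CV scheme
)
search.fit(X, y)  # grid search is the search algorithm here
print(search.best_params_)
```

Every method covered in the course swaps out one or more of these pieces, most often the search algorithm.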
Throughout the tutorials, you will learn each and every aspect of the tuning methods.
What will you learn in this online course?
In this course, you will learn multiple hyperparameter optimization methods to find the best set of hyperparameters for your classifier or regression models.
Specifically, you will learn:
- How to define a hyperparameter space from which to sample different hyperparameters
- Different search algorithms that guide the search for hyperparameter values
- Various cross-validation schemes to assess the model performance at each iteration
- How to define an objective function for a specific machine learning model, metric and training data set
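To give a flavor of the first point, a hyperparameter space can be defined with probability distributions rather than a fixed grid, and the search then samples from it. A sketch with scikit-learn's `RandomizedSearchCV` (the ranges are illustrative, not recommendations):

```python
# Defining a hyperparameter space with distributions and sampling from it.
from scipy.stats import loguniform, randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=200, random_state=0)

space = {
    "n_estimators": randint(10, 100),      # integer-valued hyperparameter
    "max_depth": randint(2, 10),
    "max_features": loguniform(0.1, 1.0),  # continuous, sampled on a log scale
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions=space,
    n_iter=10,  # number of candidates sampled from the space
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```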
You will learn the following search algorithms:
- Grid search
- Random search
- Sequential models including Bayesian optimization with Gaussian processes
- Bayesian optimization with tree-structured Parzen estimators or random forests
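The idea behind the sequential methods can be sketched in a few lines: a surrogate model (here a Gaussian process) is fit to the hyperparameter values evaluated so far, and an acquisition function (here the lower confidence bound) picks the next value to try. This is a toy illustration with a made-up objective standing in for a cross-validated model score, not the course's implementation:

```python
# A minimal sketch of sequential model-based (Bayesian) optimization.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(c):
    # stand-in for "validation error of a model trained with C=c";
    # its minimum sits at c = 10**0.5
    return (np.log10(c) - 0.5) ** 2

candidates = np.logspace(-3, 3, 200).reshape(-1, 1)

# start from a few random evaluations
rng = np.random.default_rng(0)
X = rng.choice(candidates.ravel(), size=3, replace=False).reshape(-1, 1)
y = objective(X).ravel()

for _ in range(10):
    # surrogate model of the objective, fit on the log scale
    gp = GaussianProcessRegressor(normalize_y=True, alpha=1e-6)
    gp.fit(np.log10(X), y)
    mu, sigma = gp.predict(np.log10(candidates), return_std=True)
    lcb = mu - 1.96 * sigma  # acquisition: lower confidence bound
    x_next = candidates[np.argmin(lcb)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

best = X[np.argmin(y), 0]
print(best)
```

Tree-structured Parzen estimators and random forests play the same surrogate role as the Gaussian process here; the course covers each in turn.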
We'll take you step-by-step through engaging video tutorials and teach you everything you need to know to find the best combination of hyperparameters. Throughout this comprehensive course, we cover almost every available approach to optimize hyperparameters, discussing their rationale, their advantages and shortcomings, the considerations to keep in mind when using each technique, and their implementation in Python.
By the end of the course, you will be able to set up machine learning pipelines with the hyperparameter tuning algorithm that best suits your project.
Hyperparameter tuning with Python
Throughout the course, we will use Python as the main language. We will implement the hyperparameter search with the open-source libraries Scikit-learn, Hyperopt, Optuna, Scikit-Optimize and Keras-tuner.
Who is this course for?
If you regularly train machine learning models, as a hobby or for your organization, and want to improve their performance; if you are keen to climb the leaderboard of a data science competition; or if you simply want to learn more about how to tune the hyperparameters of machine learning models, this course will show you how.
To get the most out of this course, you need a basic knowledge of machine learning and familiarity with the most common predictive models, like linear and logistic regression, decision trees, and random forests, and with the metrics used to evaluate model performance. You also need basic knowledge of Python and the open-source libraries NumPy, pandas, and Scikit-learn.
We will also optimize the hyperparameters of neural networks. To get the most out of those sections, you should be familiar with deep learning and the open-source libraries Keras and TensorFlow.
This comprehensive machine learning course includes over 50 lectures spread across approximately 8 hours of video, and ALL topics include hands-on Python code examples that you can use for reference and practice, as well as re-use in your own projects.
- Cross-Validation (9:15)
- Bias vs Variance (Optional)
- Cross-Validation schemes (13:55)
- Estimating the model generalization error with CV - Demo (8:35)
- Cross-Validation for Hyperparameter Tuning - Demo (7:33)
- Special Cross-Validation schemes (7:07)
- Group Cross-Validation - Demo (5:03)
- Nested Cross-Validation (7:19)
- Nested Cross-Validation - Demo (6:43)
- How are we doing?
- Basic Search Algorithms - Introduction (5:10)
- Manual Search (6:35)
- Grid Search (3:21)
- Grid Search - Demo (7:50)
- Grid Search with different hyperparameter spaces (2:18)
- Random Search (7:34)
- Random Search with Scikit-learn (5:37)
- Random Search with Scikit-Optimize (7:30)
- Random Search with Hyperopt (11:06)
- More examples
- How are we doing?
- Sequential Search (5:49)
- Bayesian Optimization (5:10)
- Bayesian Inference - Introduction (7:11)
- Joint and Conditional Probabilities (7:40)
- Bayes Rule (12:02)
- Sequential Model-Based Optimization (15:54)
- Gaussian Distribution (7:28)
- Multivariate Gaussian Distribution (16:22)
- Gaussian Process (14:47)
- Kernels (6:41)
- Acquisition Functions (13:44)
- Additional Reading Resources
- Scikit-Optimize - 1-Dimension (14:11)
- Scikit-Optimize - Manual Search (5:20)
- Scikit-Optimize - Automatic Search (4:03)
- Scikit-Optimize - Alternative Kernel (3:24)
- Scikit-Optimize - Neural Networks (14:17)
- Scikit-Optimize - CNN - Search Analysis (6:00)
- Scikit-Optimize (5:45)
- Section content (2:10)
- Hyperparameter Distributions (4:37)
- Defining the hyperparameter space (2:36)
- Defining the objective function (1:59)
- Random search (5:12)
- Bayesian search with Gaussian processes (5:14)
- Bayesian search with Random Forests (2:53)
- Bayesian search with GBMs (3:03)
- Parallelizing a Bayesian search (2:53)
- Bayesian search with Scikit-learn wrapper (4:03)
- Changing the kernel of a Gaussian Process (3:24)
- Optimizing xgboost
- Optimizing Hyperparameters of a CNN (14:17)
- Analyzing the CNN search (6:00)
- Optuna (4:58)
- Optuna main functions (7:45)
- Section content (1:00)
- Search algorithms (7:38)
- Optimizing multiple ML models simultaneously (7:21)
- Optimizing hyperparameters of a CNN (9:52)
- Optimizing a CNN - extended (4:48)
- Evaluating the search with Optuna's built-in functions (9:41)
- More examples
Frequently Asked Questions
When does the course begin and end?
You can start taking the course from the moment you enroll. The course is self-paced, so you can watch the tutorials and apply what you learn whenever you find it most convenient.
For how long can I access the course?
The course comes with lifetime access. This means that once you enroll, you will have unlimited access to the course for as long as you like.
What if I don't like the course?
There is a 30-day money back guarantee. If you don't find the course useful, contact us within the first 30 days of purchase and you will get a full refund.
Will I get a certificate?
Yes, you'll get a certificate of completion after completing all lectures, quizzes and assignments.