Watch the intro video
Note: if you can't see the video, you might need to allow cookies or disable your ad blocker.
Soledad Galli, PhD
Instructor
Sole is a lead data scientist, instructor, and open-source software developer. She created and maintains Feature-engine, a Python library for feature engineering that lets you impute data, encode categorical variables, and transform, create, and select features. Sole is also the author of the book "Python Feature Engineering Cookbook", published by Packt.
Course description
Welcome to Feature Selection for Machine Learning, the most comprehensive course on feature selection available online.
In this course, you will learn how to select the most relevant variables in your data set and build simpler, faster, more reliable, and more interpretable machine learning models.
Who is this course for?
You’ve taken your first steps into data science: you know the most commonly used machine learning models, and you have probably built a few linear regression or decision tree based models. You are familiar with data pre-processing techniques like removing missing data, transforming variables, and encoding categorical variables. At this stage, you’ve probably realized that many data sets contain an enormous number of features, some of which are identical or very similar, some of which are not predictive at all, and for others it is harder to tell.
You wonder how to find the most predictive features. Which ones are OK to keep, and which ones could you do without? You also wonder how to code the methods in a professional manner. You probably searched online and found that there is not much out there about feature selection. So you start to wonder: how are things really done in tech companies?
This course will help you! This is the most comprehensive online course on variable selection. You will learn a huge variety of feature selection procedures used worldwide in different organizations and in data science competitions to select the most predictive features.
What will you learn?
I have put together a fantastic collection of feature selection techniques, based on scientific articles, data science competitions and of course my own experience as a data scientist.
Specifically, you will learn:
- How to remove features with low variance
- How to identify redundant features
- How to select features based on statistical tests
- How to select features based on changes in model performance
- How to find predictive features based on importance attributed by models
- How to code procedures elegantly and in a professional manner
- How to leverage the power of existing Python libraries for feature selection
Throughout the course, you are going to learn multiple techniques for each of the mentioned tasks, and you will learn to implement these techniques in an elegant, efficient, and professional manner, using Python, Scikit-learn, pandas and MLXtend.
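As a small taste of the style of the hands-on demos, here is a minimal sketch combining two of the techniques listed above, removing low-variance features and selecting features with a statistical test, using scikit-learn. The dataset, threshold, and number of features kept are illustrative choices, not values prescribed by the course:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import VarianceThreshold, SelectKBest, f_classif

# Example dataset; stands in for your own data.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# 1) Drop quasi-constant features: variance below an (illustrative) threshold.
vt = VarianceThreshold(threshold=0.01)
X_vt = vt.fit_transform(X)

# 2) Keep the 10 features most associated with the target (ANOVA F-test).
skb = SelectKBest(score_func=f_classif, k=10)
X_best = skb.fit_transform(X_vt, y)

print(X.shape, X_vt.shape, X_best.shape)
```

Chaining selectors like this, or wrapping them in a scikit-learn `Pipeline`, keeps the whole selection procedure reproducible and easy to apply to new data.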
By the end of the course, you will have a variety of tools to select and compare different feature subsets and identify the ones that return the simplest, yet most predictive, machine learning model. This will allow you to minimize the time needed to put your predictive models into production.
This comprehensive feature selection course includes about 70 lectures spanning ~8 hours of video, and ALL topics include hands-on Python code examples which you can use for reference and for practice, and re-use in your own projects.
Course Curriculum
- Correlation - Intro (2:41)
- Correlation Feature Selection (5:32)
- Correlation procedures to select features (3:37)
- Correlation | Notebook demo (11:49)
- Basic methods plus Correlation pipeline
- Correlation with Feature-engine (8:01)
- Feature Selection Pipeline with Feature-engine (2:19)
- Additional reading resources
- Filter Methods with other metrics (3:04)
- Univariate model performance metrics (5:52)
- Univariate model performance metrics | Demo (4:23)
- KDD 2009: Select features by target mean encoding (6:39)
- KDD 2009: Select features by mean encoding | Demo (6:59)
- Univariate model performance with Feature-engine (4:54)
- Target Mean Encoding Selection with Feature-engine (5:20)
- Wrapper methods – Intro (6:39)
- MLXtend
- Step forward feature selection (3:14)
- SFS - MLXtend vs Sklearn (4:06)
- Step forward feature selection | MLXtend (6:00)
- Step forward feature selection | sklearn
- Step backward feature selection (3:13)
- Step backward feature selection | MLXtend (5:50)
- Step backward feature selection | Sklearn
- Exhaustive search (2:45)
- Exhaustive search | Demo (3:37)
- Introduction to hybrid methods (1:50)
- Feature Shuffling - Intro (2:41)
- Shuffling features | Demo (8:41)
- Recursive feature elimination - Intro (2:21)
- Recursive feature elimination | Demo (5:42)
- Recursive feature addition - Intro (2:06)
- Recursive feature addition | Demo (2:55)
- Feature Shuffling with Feature-engine (5:39)
- Recursive feature elimination with Feature-engine (4:53)
- Recursive feature addition with Feature-engine (3:22)
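To give a flavour of the recursive feature elimination lectures above, here is a minimal sketch using scikit-learn's RFE, which repeatedly fits a model and drops the least important feature until the desired number remains. The estimator and the number of features to keep are illustrative assumptions, not the course's settings:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Recursively eliminate the weakest feature until 5 remain.
rfe = RFE(
    estimator=LogisticRegression(max_iter=5000),
    n_features_to_select=5,
)
rfe.fit(X, y)

# Boolean mask over the original columns marks the retained features.
selected = X.columns[rfe.support_]
print(list(selected))
```

Feature-engine offers analogous transformers (e.g. `RecursiveFeatureElimination` in `feature_engine.selection`) that operate directly on pandas DataFrames, which the Feature-engine lectures cover.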