Difference between supervised and unsupervised learning models
In my introductory post about machine learning (ML), I listed a bunch of ML models by their output (regression, classification and clustering). These models can be classified differently as either supervised or unsupervised learning models.
Supervised learning models
In a supervised learning model, your data consists of independent variable(s) and dependent variable(s). You build your model by feeding it this data. The goal is to have a model which can take values of your independent variable and accurately predict corresponding values of the dependent variable.
These types of models are called supervised learning models because your training dataset is labeled with the right ‘answers’ for the model to learn from. You can say that you are supervising the model during its learning process.
Some popular supervised learning models are:
- Linear/Multiple/Polynomial regression
- Decision Trees/Random Forest
- Support vector machines
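As a minimal sketch of the supervised workflow described above, here is a linear regression fit with scikit-learn. The data (hours studied vs. exam score) is made up purely for illustration; the point is that the training data includes the dependent variable, i.e. the ‘answers’:

```python
# Supervised learning sketch: the training data pairs each value of the
# independent variable (X) with the known dependent variable (y).
from sklearn.linear_model import LinearRegression

X = [[1], [2], [3], [4], [5]]   # independent variable: hours studied (made up)
y = [52, 55, 61, 64, 70]        # dependent variable: exam score (made up)

model = LinearRegression()
model.fit(X, y)                 # 'supervised': the labels y guide the learning

# Predict the dependent variable for an unseen value of the independent one
prediction = model.predict([[6]])
print(prediction)
```

The same fit/predict pattern applies to the other supervised models listed above; only the estimator class changes.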
Unsupervised learning models
Not all datasets have a clearly defined dependent variable, which means we cannot supervise our ML model. In such cases, we present our ML model with some variables and it figures out what to do with them. We do not present the model with correct results (i.e. a dependent variable) during the training phase. Unsupervised learning models are used for clustering and for detecting associations. For example, data from IMDB can be used to figure out which movies a user might like based on other movies they have liked in the past. Another example of unsupervised learning is grouping (clustering) data based on common features.
Some popular unsupervised learning models are:
- k-means clustering
- Apriori for association
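To make the contrast concrete, here is a minimal k-means sketch with scikit-learn. Note there is no `y` anywhere: the model receives only the points (made up for illustration) and discovers the two groups on its own:

```python
# Unsupervised learning sketch: k-means groups points into clusters
# without ever seeing labels or 'correct answers'.
from sklearn.cluster import KMeans

# Just observations, no dependent variable (values made up for illustration)
points = [[1, 1], [1.5, 2], [1, 0], [8, 8], [9, 9], [8, 9]]

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(points)

# The three points near (1, 1) land in one cluster and the three near
# (8, 8) in the other; which cluster gets number 0 or 1 is arbitrary.
print(labels)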
In practice, I have seen supervised learning models used more often than unsupervised ones. Still, you can clearly see the power of unsupervised learning models: they provide answers (groupings/associations) that you wouldn’t have thought of yourself.