Curse Of Dimensionality explained in simple words

The curse of dimensionality is an important concept in machine learning.

I'm writing this article as part of my learning.

The basic idea of the curse of dimensionality is this: as we add features, a machine learning model's accuracy tends to improve, but only up to a point. Beyond the optimal number of features, adding more gives little or no gain in accuracy; in fact, accuracy can start to degrade, while training and prediction slow down.
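As a small sketch of this idea (the dataset, class separation, and noise dimensions below are made-up toy choices, not from the article), we can compare a simple 1-nearest-neighbour classifier on data where only the first feature matters: with no extra features it does well, and after padding the data with many irrelevant "noise" features, the distances that 1-NN relies on get swamped and accuracy tends to fall toward chance.

```python
import random
import math

random.seed(0)

def make_data(n_noise_dims, n_points=40):
    """Two classes separated only along the first feature;
    every extra dimension is pure noise."""
    data = []
    for i in range(n_points):
        label = i % 2
        point = [random.gauss(3.0 * label, 1.0)]  # informative feature
        point += [random.gauss(0.0, 1.0) for _ in range(n_noise_dims)]  # noise
        data.append((point, label))
    return data

def loo_1nn_accuracy(data):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier."""
    correct = 0
    for i, (x, y) in enumerate(data):
        nearest = min(
            (d for j, d in enumerate(data) if j != i),
            key=lambda d: math.dist(x, d[0]),
        )
        correct += nearest[1] == y
    return correct / len(data)

acc_clean = loo_1nn_accuracy(make_data(n_noise_dims=0))
acc_noisy = loo_1nn_accuracy(make_data(n_noise_dims=200))
print(acc_clean, acc_noisy)
```

On a typical run, the clean version scores well above chance while the 200-noise-dimension version drifts toward 50%, even though the informative feature is still there.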

For example, suppose your wallet is lost. I'll give you three situations; ask yourself in which of them the wallet is easiest to find.

1) You know your wallet was lost on the road from your office building to the parking stand

2) Your wallet was lost somewhere on an open ground

3) Your wallet was lost somewhere in a building

Comparatively, it is easiest to search for and find your wallet on the road.

Now you can visualize it in terms of dimensions:

Road --> 1-D, Ground --> 2-D, Building --> 3-D
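To make the analogy concrete, here is a tiny bit of arithmetic (the 100-metre span and 1-metre search cells are made-up illustrative numbers, not from the article): the number of places you'd have to check grows exponentially with the number of dimensions.

```python
# Hypothetical sizes: each dimension spans 100 metres, searched
# in 1-metre cells (illustrative numbers only).
cells_per_dim = 100

road = cells_per_dim ** 1      # 1-D road: 100 cells to check
ground = cells_per_dim ** 2    # 2-D ground: 10,000 cells
building = cells_per_dim ** 3  # 3-D building: 1,000,000 cells

print(road, ground, building)
```

The same effort that exhaustively covers the road covers only a sliver of the building, which is exactly why adding dimensions makes a search (or a learning problem) harder.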


So as we increase the number of dimensions, the data becomes sparse (the sparsity increases), making the problem more complex. Hence we need to counter the curse of dimensionality, because it has 2 major drawbacks:

1) Decrease in the model's performance

2) Increase in computation
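The sparsity claim above can be checked numerically. In this sketch (sample sizes and dimensions are my own toy choices), we draw points uniformly from the unit hypercube and measure the average distance from each point to its nearest neighbour; as the dimension grows, even "nearest" neighbours drift far apart.

```python
import random
import math

random.seed(1)

def mean_nn_distance(dim, n_points=200):
    """Average distance from each point to its nearest neighbour,
    for points drawn uniformly from the unit hypercube."""
    pts = [[random.random() for _ in range(dim)] for _ in range(n_points)]
    total = 0.0
    for i, p in enumerate(pts):
        total += min(math.dist(p, q) for j, q in enumerate(pts) if j != i)
    return total / n_points

# Same number of points, increasing dimension: neighbours get farther away.
dists = [mean_nn_distance(d) for d in (1, 2, 10, 100)]
print(dists)
```

With the point count held fixed, the mean nearest-neighbour distance increases steadily with dimension, which is the "data becomes sparse" effect in action.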


To handle this we use:

1) Feature Selection

2) Feature Extraction


Feature selection methods include:

  • Forward Selection
  • Backward Elimination
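Forward selection can be sketched in a few lines. In this minimal version (the `score` function below is a made-up stand-in for a real model-evaluation step such as cross-validated accuracy), we greedily add whichever feature improves the score most, and stop when nothing helps.

```python
# A minimal sketch of greedy forward selection. `score` stands in for
# a real model-evaluation step (e.g. cross-validated accuracy).
def forward_selection(features, score, max_features):
    selected = []
    while len(selected) < max_features:
        best_gain, best_f = 0.0, None
        for f in features:
            if f in selected:
                continue
            gain = score(selected + [f]) - score(selected)
            if gain > best_gain:
                best_gain, best_f = gain, f
        if best_f is None:  # no remaining feature improves the score: stop
            break
        selected.append(best_f)
    return selected

# Toy score: features "a" and "c" each add value, everything else adds none.
useful = {"a": 0.4, "c": 0.3}
score = lambda subset: sum(useful.get(f, 0.0) for f in subset)

picked = forward_selection(["a", "b", "c", "d"], score, max_features=3)
print(picked)  # picks "a" then "c", then stops early
```

Backward elimination is the mirror image: start with all features and repeatedly drop the one whose removal hurts the score least.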

Feature extraction methods include:

  • PCA - Principal Component Analysis
  • LDA - Linear Discriminant Analysis
  • t-SNE - t-distributed Stochastic Neighbor Embedding
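Of these, PCA is the easiest to sketch from scratch. In this toy example (the 100x5 dataset with most variance in two hidden directions is my own made-up setup), we centre the data and use the SVD to project it onto its top principal components, keeping most of the variance while dropping dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 samples in 5-D whose variance mostly lies in two
# hidden directions, plus a little isotropic noise.
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))
X += 0.05 * rng.normal(size=(100, 5))

def pca(X, n_components):
    """Project X onto its top principal components via SVD."""
    Xc = X - X.mean(axis=0)                    # centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:n_components].T               # low-dimensional projection
    variances = S**2 / (len(X) - 1)            # variance per component
    return Z, variances

Z, var = pca(X, n_components=2)
print(Z.shape, var[:2].sum() / var.sum())      # 2-D output, variance retained
```

Two components capture almost all of the variance here, which is the point of feature extraction: fewer dimensions, little information lost.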


Thank you for reading!!
