Normalizing the Data
Normalization gives each variable equal weight, ensuring that no single variable biases model performance simply because its values are on a larger scale. Clustering algorithms, for example, use distance measures to decide whether an observation belongs to a specific cluster, so a feature with a larger numeric range would otherwise dominate the distance calculation.
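A minimal sketch of this effect, using hypothetical customer data (income in dollars, age in years) and plain Python: without scaling, the income feature dominates the Euclidean distance, and min-max normalization restores balance.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two customers: (income in dollars, age in years) -- made-up values.
p = (50_000, 25)
q = (52_000, 60)

# Income (larger scale) dominates: the 35-year age gap barely registers.
raw = euclidean(p, q)

def min_max(value, lo, hi):
    """Scale value into [0, 1] given the feature's observed min and max."""
    return (value - lo) / (hi - lo)

# Normalize each feature to [0, 1] using assumed per-feature min/max.
p_n = (min_max(p[0], 20_000, 100_000), min_max(p[1], 18, 80))
q_n = (min_max(q[0], 20_000, 100_000), min_max(q[1], 18, 80))

# After scaling, both features contribute comparably to the distance.
scaled = euclidean(p_n, q_n)
```

Here `raw` is roughly 2000, driven almost entirely by income, while `scaled` is well under 1 and reflects both the income and age differences.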

In database design, normalization is likewise required: it ensures that each table contains only data directly related to its primary key, that each data field holds a single data element, and that redundant (duplicated and superfluous) data is removed.

Better performance follows from the previous point: as databases shrink in size, data processing becomes faster and more localized, improving response time and speed.

Linear (min-max) normalization is a common default technique: it is simple, adaptable, and intuitive, rescaling each variable to a fixed range such as [0, 1].
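The min-max formula is x' = (x - min) / (max - min), applied per column. A small sketch (hypothetical table values, plain Python) that normalizes every column of a list-of-rows table:

```python
def normalize_columns(rows):
    """Min-max normalize each column of a table given as equal-length rows."""
    cols = list(zip(*rows))
    mins = [min(c) for c in cols]
    maxs = [max(c) for c in cols]
    return [
        # Guard against a constant column, where max == min.
        [(v - lo) / (hi - lo) if hi != lo else 0.0
         for v, lo, hi in zip(row, mins, maxs)]
        for row in rows
    ]

# Made-up (income, age) rows; after normalization each column spans [0, 1].
data = [[50_000, 25], [80_000, 60], [20_000, 40]]
normalized = normalize_columns(data)
```

The guard for `hi == lo` is a design choice: a constant column carries no information for distance-based methods, so mapping it to 0.0 avoids a division by zero.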
