
Edward Rolf Tufte, in his book “Beautiful Evidence,” presents six principles for presenting data in an informative way.

The six principles are:

  1. Showing comparisons
  2. Showing causality, mechanism, explanation, and systematic structure
  3. Performing multivariate analysis
  4. Integrating evidence
  5. Describing the evidence (documentation)
  6. Content is king!

In his book, Tufte uses Charles Joseph Minard’s map of the French invasion of Russia, created in 1869, to beautifully illustrate all of these principles. Interested readers can refer to the book for details.

Preprocessing the data is one of the crucial steps of data analysis, and feature scaling is one of its preliminary steps. Programmers new to data science often neglect or bypass this step and go directly to analysing the data; this introduces bias and, in turn, hurts prediction accuracy.


What is Data Normalization?

Data normalization is a data preprocessing step in which we adjust the scales of the features so that they share a standard scale of measure. In machine learning, it is also known as feature scaling.
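As a minimal sketch of what this adjustment looks like, the snippet below applies min-max normalization (one common form of feature scaling) to a made-up feature; the values and variable names are purely illustrative.

```python
import numpy as np

# Made-up feature values on their original scale (heights in cm).
heights_cm = np.array([150.0, 160.0, 170.0, 180.0, 190.0])

# Min-max normalization rescales each value to [0, 1]:
#   x_norm = (x - min) / (max - min)
normalized = (heights_cm - heights_cm.min()) / (heights_cm.max() - heights_cm.min())

print(normalized)  # [0.   0.25 0.5  0.75 1.  ]
```

After this step, the feature’s smallest value maps to 0 and its largest to 1, so features measured in very different units end up on the same standard scale.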

Why do we need Data Normalization?

Machine learning algorithms such as distance-based algorithms and gradient-descent-based algorithms expect the features to…
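To illustrate why distance-based algorithms in particular are sensitive to feature scale, here is a small sketch with made-up income and age values: without scaling, the feature with the larger numeric range dominates the Euclidean distance.

```python
import numpy as np

# Hypothetical samples with two features on very different scales:
# [annual income in dollars, age in years].
a = np.array([50_000.0, 25.0])
b = np.array([52_000.0, 60.0])

# Without scaling, the $2,000 income gap swamps the 35-year age gap.
raw_distance = np.linalg.norm(a - b)

# After dividing each feature by an assumed feature range (illustrative
# values), both features contribute meaningfully to the distance.
ranges = np.array([100_000.0, 100.0])
scaled_distance = np.linalg.norm(a / ranges - b / ranges)

print(raw_distance)     # ~2000.3 — almost entirely the income difference
print(scaled_distance)  # ~0.35  — now dominated by the age difference
```

This is why algorithms like k-nearest neighbours or k-means, which rely on such distances, benefit from normalized features.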

Nikhita Singh Shiv Kalpana

Master’s in Analytics student at RMIT, Australia
