What is data normalization and why do we need it?
Answer / Lalit Prasad Maurya
Data normalization is a preprocessing step in machine learning where feature values are rescaled to a common range, typically [0, 1] (min-max scaling). A closely related technique, standardization, transforms each feature to have a mean of zero and a standard deviation of one. These steps matter because many algorithms, particularly gradient-based and distance-based ones, perform better when input features are on comparable scales; otherwise features with large numeric ranges can dominate the learning process.
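A minimal sketch of both transforms using NumPy (the toy feature matrix below is made up for illustration; scikit-learn's MinMaxScaler and StandardScaler implement the same transforms):

```python
import numpy as np

# Toy feature matrix: rows are samples, columns are features on very
# different scales (e.g. age in years vs. income in dollars).
X = np.array([[25.0,  40_000.0],
              [32.0,  55_000.0],
              [47.0, 120_000.0],
              [51.0,  98_000.0]])

# Min-max normalization: rescale each feature to the [0, 1] range.
X_norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Z-score standardization: zero mean, unit standard deviation per feature.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_norm)  # each column now lies in [0, 1]
print(X_std)   # each column now has mean ~0 and std ~1
```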
What is deep learning?
What do you mean by deep learning?
What is the cost function?
Is the RTX 2060 good for deep learning?
Differentiate supervised and unsupervised deep learning procedures.
What is the use of deep learning in today's age, and how is it aiding data scientists?
What do you understand by perceptron?
How much GPU memory do I need?
Why are GPUs good for deep learning?
What do you mean by dropout?
What do you understand by a convolutional neural network?
What do you understand by a Boltzmann machine?