What is data normalization?
What is Gradient Descent?
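As a study aid for the gradient descent question above, here is a minimal sketch of the idea on a one-dimensional toy loss. The function, learning rate, and step count are illustrative choices, not from the source.

```python
# Gradient descent sketch on the toy loss f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3); the minimum sits at x = 3.
def gradient_descent(lr=0.1, steps=100):
    x = 0.0  # arbitrary starting point
    for _ in range(steps):
        grad = 2 * (x - 3)   # analytic gradient of the loss at x
        x -= lr * grad       # move a small step against the gradient
    return x
```

With these settings the iterate converges close to the minimizer at 3, which illustrates the core loop: compute the gradient, step in the opposite direction, repeat.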
What do you understand by a perceptron? Also, explain its types.
What do you mean by dropout?
What is the use of the Leaky ReLU function?
What exactly is deep learning?
What is the ReLU function?
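For the two activation questions above, a minimal sketch of both functions; the default slope value is an illustrative assumption, not mandated by the source.

```python
def relu(x):
    # ReLU: max(0, x) -- passes positive inputs through, zeroes out negatives
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha for negative inputs,
    # which avoids units that output zero gradient for all negative x
    return x if x > 0 else alpha * x
```

The small negative-side slope is the whole difference: it lets gradient flow through units that ReLU would shut off entirely.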
What are Vanishing and Exploding Gradients?
What are the prerequisites for starting in deep learning?
What is model capacity?
What is the softmax function?
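For the softmax question, a self-contained sketch using only the standard library; subtracting the maximum logit is the usual numerical-stability trick and is an implementation choice, not something the source specifies.

```python
import math

def softmax(logits):
    # Subtract the max logit before exponentiating to avoid overflow;
    # this shift does not change the resulting probabilities.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]
```

The outputs are non-negative and sum to 1, so softmax turns a vector of raw scores into a probability distribution over classes.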
What is the difference between an Epoch, a Batch, and an Iteration in deep learning?
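The epoch/batch/iteration question comes down to simple arithmetic, sketched below with hypothetical numbers (the dataset and batch sizes are made up for illustration).

```python
# One epoch = one full pass over the dataset.
# One iteration = one weight update on a single batch.
dataset_size = 1000   # hypothetical number of training samples
batch_size = 100      # hypothetical samples per batch

# Iterations needed to complete one epoch:
iterations_per_epoch = dataset_size // batch_size
```

With these numbers, one epoch consists of 10 iterations, each processing one batch of 100 samples.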
What are the main benefits of mini-batch gradient descent?
Is 16 GB of RAM enough for deep learning work?