Answer Posted / Vartika Gupta
Softmax function: The softmax function is used in the output layer of a neural network for multi-class classification problems. It converts a vector of arbitrary real values into a probability distribution: each output lies between 0 and 1, and all outputs sum to 1. ReLU (Rectified Linear Unit) function: The ReLU function, f(x) = max(0, x), is an activation function commonly used in hidden layers of deep neural networks. Unlike the sigmoid or tanh functions, it does not saturate for positive inputs; its output is always zero or positive, since negative inputs are mapped to zero.
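A minimal sketch of both functions in NumPy (the function names and the example logits are illustrative, not from the original answer):

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability;
    # this does not change the result mathematically.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def relu(z):
    # Element-wise max(0, x): negative inputs become 0, positives pass through.
    return np.maximum(0, z)

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)       # all entries in (0, 1), summing to 1
hidden = relu(np.array([-1.5, 0.0, 3.2]))  # -> array([0. , 0. , 3.2])
```

Note that the ReLU output contains no negative values, which is exactly why the answer's saturation comparison only applies on the positive side.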