Answer Posted / Prateek Sharma
No, the ReLU (Rectified Linear Unit) activation function is generally not used in the output layer of a neural network. ReLU outputs zero for every negative input and is unbounded above, so its outputs cannot be interpreted as probabilities and it cannot represent negative target values. For classification, the output layer typically uses a sigmoid (binary) or softmax (multi-class) activation instead; for regression, a linear output is usually preferred. ReLU is best kept in the hidden layers.
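As a rough illustration, here is a minimal PyTorch sketch (the layer sizes and the 3-class setup are invented for this example, not taken from the question) showing ReLU in the hidden layer while the output layer stays linear and softmax turns the logits into probabilities:

import torch
import torch.nn as nn

# Toy 3-class classifier: ReLU only in the hidden layer,
# the output layer is left linear (raw logits).
model = nn.Sequential(
    nn.Linear(4, 16),   # hidden layer
    nn.ReLU(),          # ReLU is appropriate here
    nn.Linear(16, 3),   # output layer: no ReLU
)

x = torch.randn(8, 4)                  # batch of 8 fake inputs
logits = model(x)                      # unbounded real values
probs = torch.softmax(logits, dim=1)   # probabilities, each row sums to 1

# If ReLU were applied to the output instead, every negative logit would be
# clipped to 0 and the rows would not sum to 1, so the outputs could not be
# read as class probabilities.
print(probs.sum(dim=1))

In practice the raw logits are usually passed straight to a loss such as nn.CrossEntropyLoss, which applies the softmax internally; the point is simply that the final layer is not passed through ReLU.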