Answer Posted / Mr. Prateek Sachdeva
Several activation functions are commonly used in deep learning, including Sigmoid, Tanh (Hyperbolic Tangent), ReLU (Rectified Linear Unit), Leaky ReLU, Softmax, and Swish. Each serves a distinct purpose: all introduce the non-linearity that lets a network model non-linear relationships, while individual choices affect training behavior. For example, Sigmoid and Softmax map outputs to probabilities, ReLU is cheap to compute and mitigates vanishing gradients, Leaky ReLU avoids "dead" units by allowing a small gradient for negative inputs, and Swish often improves convergence in deeper networks.
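The functions listed above can be sketched in a few lines of NumPy. This is a minimal illustration of the standard mathematical definitions, not taken from any particular framework:

```python
import numpy as np

def sigmoid(x):
    # Squashes input into (0, 1); common for binary-classification outputs
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered variant of sigmoid, range (-1, 1)
    return np.tanh(x)

def relu(x):
    # Zeroes out negatives; cheap and helps with vanishing gradients
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small slope alpha for negative inputs so units don't "die"
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Turns a vector of logits into probabilities that sum to 1
    # (subtracting the max is a standard numerical-stability trick)
    e = np.exp(x - np.max(x))
    return e / e.sum()

def swish(x):
    # Swish: x * sigmoid(x); smooth and non-monotonic
    return x * sigmoid(x)
```

For instance, `sigmoid(0.0)` returns 0.5, `relu(-2.0)` returns 0.0, and `softmax(np.array([1.0, 2.0, 3.0]))` sums to 1.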