How many types of activation functions are available?
Answer / Mr. Prateek Sachdeva
There are several activation functions commonly used in deep learning, including: Sigmoid, Tanh (Hyperbolic Tangent), ReLU (Rectified Linear Unit), Leaky ReLU, Softmax, and Swish. All of them introduce non-linearity into the model; they differ in output range, gradient behavior, and typical placement (for example, ReLU in hidden layers and Softmax in multi-class output layers), which affects convergence speed and training stability.
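The functions named above can be sketched with their standard textbook definitions. This is a minimal, illustrative sketch using only the Python standard library; the function names and the `alpha` default are my own choices, not any framework's API.

```python
import math

def sigmoid(x):
    # Squashes input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes input into (-1, 1); a zero-centred relative of sigmoid.
    return math.tanh(x)

def relu(x):
    # Passes positive values through, zeroes out negatives.
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope for negative inputs
    # so units do not "die" (alpha=0.01 is a common choice).
    return x if x > 0 else alpha * x

def swish(x):
    # x * sigmoid(x): a smooth, non-monotonic alternative to ReLU.
    return x * sigmoid(x)

def softmax(xs):
    # Converts a vector of scores into probabilities that sum to 1;
    # subtracting the max is the usual numerical-stability trick.
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(relu(-2.0))                       # 0.0
print(sigmoid(0.0))                     # 0.5
print([round(p, 2) for p in softmax([1.0, 2.0, 3.0])])
```

Note that `softmax` takes a whole vector rather than a single value, which is why it belongs in the output layer of a multi-class classifier rather than in hidden layers.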