What is back propagation?
a) It is another name given to the curvy function in the perceptron
b) It is the transmission of error back through the network to adjust the inputs
c) It is the transmission of error back through the network to allow weights to be adjusted so that the network can learn.
d) None of the mentioned
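A minimal sketch of the idea in option (c), assuming a single hidden layer of tanh units trained on a toy sine-fitting task (the sizes, learning rate, and data here are purely illustrative): the output error is propagated backwards through the network and used to adjust the weights, not the inputs.

```python
import numpy as np

# Sketch of option (c): the output error travels backwards through the
# network and is used to adjust the weights, not the inputs.
rng = np.random.default_rng(0)
X = np.linspace(-2, 2, 40).reshape(-1, 1)   # toy task: learn y = sin(x)
y = np.sin(X)

W1 = rng.normal(scale=0.5, size=(1, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)   # hidden -> output
lr = 0.1                                                    # learning rate

for _ in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    # backward pass: error terms flow from the output layer to the hidden layer
    err_out = (out - y) / len(X)
    err_hid = (err_out @ W2.T) * (1 - h ** 2)
    # weight adjustments (gradient descent)
    W2 -= lr * h.T @ err_out
    b2 -= lr * err_out.sum(axis=0)
    W1 -= lr * X.T @ err_hid
    b1 -= lr * err_hid.sum(axis=0)

pred = np.tanh(X @ W1 + b1) @ W2 + b2
print("final mean squared error:", float(np.mean((pred - y) ** 2)))
```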
What are conjugate gradients, Levenberg-Marquardt, etc.?
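As a rough illustration, both families of methods can be tried through SciPy's optimizers on a small curve-fitting problem; the toy model y = a * exp(b * x), the noise level, and the starting point are assumptions made just for this sketch.

```python
import numpy as np
from scipy.optimize import minimize, least_squares

# Toy curve-fitting problem: fit y = a * exp(b * x) to noisy data.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 50)
y = 2.0 * np.exp(1.5 * x) + 0.05 * rng.normal(size=x.size)

def residuals(p):
    a, b = p
    return a * np.exp(b * x) - y

def sse(p):
    r = residuals(p)
    return 0.5 * np.dot(r, r)

# Conjugate gradient: a general gradient-based method that avoids the
# zig-zagging of plain steepest descent.
cg = minimize(sse, x0=[1.0, 1.0], method="CG")

# Levenberg-Marquardt: tailored to sum-of-squares objectives, blending
# Gauss-Newton steps with gradient-descent steps.
lm = least_squares(residuals, x0=[1.0, 1.0], method="lm")

print("CG solution:", cg.x)
print("LM solution:", lm.x)
```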
How does an LSTM network work?
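A bare-bones sketch of one LSTM time step, assuming the standard four-gate formulation (forget, input, candidate, output); the weight shapes and the random input sequence are purely illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step: gates decide what to forget, what to write,
    and what to expose from the cell state."""
    z = W @ x + U @ h_prev + b            # all four gate pre-activations stacked
    n = h_prev.shape[0]
    f = sigmoid(z[0*n:1*n])               # forget gate
    i = sigmoid(z[1*n:2*n])               # input gate
    g = np.tanh(z[2*n:3*n])               # candidate cell update
    o = sigmoid(z[3*n:4*n])               # output gate
    c = f * c_prev + i * g                # new cell state (long-term memory)
    h = o * np.tanh(c)                    # new hidden state (short-term output)
    return h, c

# Illustrative shapes: 4 inputs, 3 hidden units.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for t in range(5):                        # unroll over a short input sequence
    h, c = lstm_step(rng.normal(size=n_in), h, c, W, U, b)
```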
What are cases and variables?
Why are linearly separable problems of interest to neural network researchers?
a) Because they are the only class of problem that a network can solve successfully
b) Because they are the only class of problem that a perceptron can solve successfully
c) Because they are the only mathematical functions that are continuous
d) Because they are the only mathematical functions you can draw
A 3-input neuron is trained to output a zero when the input is 110 and a one when the input is 111. After generalization, the output will be zero when and only when the input is:
a) 000 or 110 or 011 or 101
b) 010 or 100 or 110 or 101
c) 000 or 010 or 110 or 100
d) 100 or 111 or 101 or 001
How does ill-conditioning affect neural network training?
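One way to see the effect, assuming a simple linear least-squares setting chosen for illustration: the Hessian is X^T X, and badly scaled inputs inflate its condition number, which is what slows gradient-based training down.

```python
import numpy as np

# Sketch: for linear least squares the Hessian is X^T X, and its condition
# number governs how hard gradient descent has to work.
rng = np.random.default_rng(0)
raw = np.column_stack([rng.normal(size=200),          # feature ~ N(0, 1)
                       1000 * rng.normal(size=200)])  # feature ~ N(0, 1000^2)
scaled = (raw - raw.mean(axis=0)) / raw.std(axis=0)

print("condition number, raw inputs:   ", np.linalg.cond(raw.T @ raw))
print("condition number, scaled inputs:", np.linalg.cond(scaled.T @ scaled))
```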
What learning rate should be used for backprop?
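There is no single right answer; a toy sketch on a one-dimensional quadratic (all values chosen for illustration) shows the usual trade-off: too small is slow, too large diverges.

```python
import numpy as np

# Sketch: gradient descent on f(w) = w^2. For this curvature (f'' = 2),
# the steps converge only when the learning rate is below 2 / curvature = 1.0.
def descend(lr, steps=20, w=1.0):
    for _ in range(steps):
        w -= lr * 2 * w        # gradient of w^2 is 2w
    return w

for lr in (0.01, 0.1, 0.9, 1.1):
    print(f"lr={lr:<4}  final w = {descend(lr):.4g}")
```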
What are batch, incremental, on-line, off-line, deterministic, stochastic, adaptive, instantaneous, pattern, constructive, and sequential learning?
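A sketch of two of these terms, assuming a plain linear model with squared error (data and learning rate are illustrative): batch (off-line) learning makes one update per pass over the whole training set, while incremental / on-line (stochastic) learning updates after every pattern, visiting the patterns in random order.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
lr = 0.1

# Batch (off-line) learning: one weight update per pass over the whole set.
w_batch = np.zeros(3)
for _ in range(200):
    grad = X.T @ (X @ w_batch - y) / len(y)
    w_batch -= lr * grad

# Incremental / on-line (stochastic) learning: update after every pattern,
# visiting the patterns in random order on each pass.
w_online = np.zeros(3)
for _ in range(200):
    for i in rng.permutation(len(y)):
        grad_i = (X[i] @ w_online - y[i]) * X[i]
        w_online -= lr * grad_i

print("batch:  ", w_batch)
print("on-line:", w_online)
```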
Having multiple perceptrons can actually solve the XOR problem satisfactorily: this is because each perceptron can partition off a linear part of the space itself, and they can then combine their results.
a) True – this always works, and these multiple perceptrons learn to classify even complex problems.
b) False – perceptrons are mathematically incapable of solving linearly inseparable functions, no matter what you do
c) True – perceptrons can do this but are unable to learn to do it – they have to be explicitly hand-coded
d) False – just having a single perceptron is enough
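A sketch supporting option (c), with hand-chosen weights and thresholds rather than learned ones: two threshold units computing OR and AND, combined by a third, reproduce XOR.

```python
import numpy as np

def step(z):
    return (z > 0).astype(int)          # threshold (perceptron) activation

def xor_net(x1, x2):
    """Hand-coded two-layer network of threshold units computing XOR:
    hidden unit 1 fires for OR, hidden unit 2 fires for AND, and the
    output fires for OR-and-not-AND."""
    h1 = step(x1 + x2 - 0.5)            # x1 OR x2
    h2 = step(x1 + x2 - 1.5)            # x1 AND x2
    return step(h1 - h2 - 0.5)          # h1 AND NOT h2

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(np.array(a), np.array(b)))
```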
Which of the following is true?
(i) On average, neural networks have higher computational rates than conventional computers.
(ii) Neural networks learn by example.
(iii) Neural networks mimic the way the human brain works.
a) All of the mentioned are true
b) (ii) and (iii) are true
c) (i), (ii) and (iii) are true
d) None of the mentioned
How do artificial neurons learn?
What is the role of activation functions in a Neural Network?
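A small sketch of a few common activation functions (sigmoid, tanh, ReLU); the sample points are arbitrary. The key point is that without them, stacked linear layers collapse to a single linear map.

```python
import numpy as np

# Common activation functions: they inject the non-linearity that lets
# stacked layers represent more than a single linear transformation.
def sigmoid(z): return 1.0 / (1.0 + np.exp(-z))
def tanh(z):    return np.tanh(z)
def relu(z):    return np.maximum(0.0, z)

z = np.linspace(-3, 3, 7)
print("sigmoid:", np.round(sigmoid(z), 3))
print("tanh:   ", np.round(tanh(z), 3))
print("relu:   ", np.round(relu(z), 3))

# Without an activation, two linear layers W2 @ (W1 @ x) are just one
# linear layer (W2 @ W1) @ x.
```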