Having multiple perceptrons can actually solve the XOR problem satisfactorily: this is because each perceptron can partition off a linear part of the space itself, and they can then combine their results.
a) True – this always works, and multiple perceptrons learn to classify even complex problems.
b) False – perceptrons are mathematically incapable of solving linearly inseparable functions, no matter what you do.
c) True – perceptrons can do this, but are unable to learn to do it – they have to be explicitly hand-coded.
d) False – a single perceptron is enough on its own.




Answer / clara

C
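For illustration, here is a minimal Python sketch of option (c): two hand-coded threshold perceptrons feed a third, and together they compute XOR. The weights and thresholds are chosen by hand for this example (they are not from the original question); the single-layer perceptron learning rule cannot find them on its own.

```python
# Minimal sketch: hand-coded weights and thresholds (chosen for illustration only),
# showing that a two-layer arrangement of threshold perceptrons computes XOR.

def perceptron(inputs, weights, threshold):
    """Output 1 if the weighted sum of the inputs exceeds the threshold, else 0."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0

def xor(x1, x2):
    h_or = perceptron((x1, x2), (1.0, 1.0), 0.5)        # fires when x1 OR x2
    h_and = perceptron((x1, x2), (1.0, 1.0), 1.5)       # fires when x1 AND x2
    return perceptron((h_or, h_and), (1.0, -1.0), 0.5)  # OR but not AND, i.e. XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```

Each hidden unit draws one linear decision boundary, and the output unit combines the two half-spaces, which is exactly the partition-and-combine idea described in the question.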

More AI Neural Networks Interview Questions

How many kinds of Kohonen networks exist?

What is pooling in a CNN, and how does it work?

How to avoid overflow in the logistic function?

What learning rate should be used for backprop?

List some commercial, practical applications of artificial neural networks.

Are neural networks helpful in medicine?

What can you do with a neural network, and what can you not do?

A perceptron adds up all the weighted inputs it receives, and if the sum exceeds a certain value it outputs a 1; otherwise it outputs a 0. a) True b) False c) Sometimes – it can also output intermediate values d) Can't say (a short worked example follows this list)

What are artificial intelligence neural networks?

What is a Neural Network?

What is a neural network, and what are some advantages and disadvantages of such a network?

What is a simple artificial neuron?
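On the perceptron threshold question above, here is a small worked instance of the sum-and-threshold rule it describes; the weights, inputs, and threshold below are arbitrary values made up for illustration.

```python
# Worked instance of the sum-and-threshold rule from the perceptron question.
# All numbers here are arbitrary illustration values, not from the original post.
weights = [0.4, 0.7]
inputs = [1, 1]
threshold = 0.9

weighted_sum = sum(w * x for w, x in zip(weights, inputs))  # 0.4*1 + 0.7*1 = 1.1
output = 1 if weighted_sum > threshold else 0               # 1.1 > 0.9, so output 1
print(output)  # 1
```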

