clara


{ City } Hyderabad
< Country > India
* Profession * Student
User No # 61953
Total Questions Posted # 169
Total Answers Posted # 38

Total Answers Posted for My Questions # 81
Total Views for My Questions # 611439

Users Marked my Answers as Correct # 243
Users Marked my Answers as Wrong # 27
Answers / { clara }

Question { 7975 }

Why are linearly separable problems of interest to neural network researchers?
a) Because they are the only class of problems that a network can solve successfully
b) Because they are the only class of problems that a Perceptron can solve successfully
c) Because they are the only mathematical functions that are continuous
d) Because they are the only mathematical functions you can draw


Answer

b

Is This Answer Correct ?    9 Yes 1 No
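A hedged sketch of why answer (b) holds: the perceptron learning rule is guaranteed to converge only on linearly separable problems such as AND. The function name, learning rate, and epoch count below are illustrative choices, not part of the original question.

```python
# Illustrative sketch: the perceptron learning rule converging on AND,
# a linearly separable function. Hyperparameters are arbitrary choices.
def train_perceptron(samples, epochs=20, lr=0.1):
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - out          # perceptron update: w += lr * err * x
            w0 += lr * err * x0
            w1 += lr * err * x1
            b += lr * err
    return w0, w1, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w0, w1, b = train_perceptron(AND)
for (x0, x1), target in AND:
    assert (1 if w0 * x0 + w1 * x1 + b > 0 else 0) == target
```

Running the same loop on XOR would never converge, since no single line separates its classes.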

Question { 7419 }

Which of the following is not a promise of artificial neural networks?
a) It can explain its results
b) It can survive the failure of some nodes
c) It has inherent parallelism
d) It can handle noise


Answer

a

Is This Answer Correct ?    8 Yes 0 No


Question { 12198 }

Neural Networks are complex ______________ with many parameters.
a) Linear Functions
b) Nonlinear Functions
c) Discrete Functions
d) Exponential Functions


Answer

a

Is This Answer Correct ?    16 Yes 4 No

Question { 6459 }

A perceptron adds up all the weighted inputs it receives, and if it exceeds a certain value, it outputs a 1, otherwise it just outputs a 0.
a) True
b) False
c) Sometimes – it can also output intermediate values
d) Can’t say


Answer

a

Is This Answer Correct ?    6 Yes 0 No
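The behaviour described in the question (weighted sum, then a hard threshold) can be sketched in a few lines. The weights and threshold below are made-up illustrative values.

```python
# Minimal sketch of a perceptron unit: weighted sum of inputs, then a
# hard threshold (the Heaviside step). Weights/threshold are illustrative.
def perceptron_output(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0   # outputs only 0 or 1, never in between

print(perceptron_output([1, 0, 1], [0.4, 0.9, 0.3], 0.5))  # → 1 (0.7 > 0.5)
print(perceptron_output([1, 0, 0], [0.4, 0.9, 0.3], 0.5))  # → 0 (0.4 ≤ 0.5)
```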

Question { 4116 }

The name for the function in the previous question is
a) Step function
b) Heaviside function
c) Logistic function
d) Perceptron function


Answer

b

Is This Answer Correct ?    1 Yes 0 No

Question { 3838 }

Having multiple perceptrons can actually solve the XOR problem satisfactorily: this is because each perceptron can partition off a linear part of the space itself, and they can then combine their results.
a) True – this always works, and these multiple perceptrons learn to classify even complex problems.
b) False – perceptrons are mathematically incapable of solving linearly inseparable functions, no matter what you do
c) True – perceptrons can do this but are unable to learn to do it – they have to be explicitly hand-coded
d) False – just having a single perceptron is enough


Answer

c

Is This Answer Correct ?    2 Yes 0 No
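Answer (c) can be illustrated by hand-coding (not learning) a combination of threshold units that computes XOR: each hidden unit draws one linear boundary, and a third unit combines them. All weights below are chosen by hand for illustration.

```python
# Hand-coded combination of threshold units computing XOR, illustrating
# answer (c): the units must be explicitly wired, not trained.
def step(x):
    return 1 if x > 0 else 0

def xor(x0, x1):
    h_or  = step(x0 + x1 - 0.5)        # fires when x0 OR x1
    h_and = step(x0 + x1 - 1.5)        # fires only when x0 AND x1
    return step(h_or - h_and - 0.5)    # OR but not AND == XOR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))         # prints the XOR truth table
```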

Question { 11638 }

The network that involves backward links from the output to the input and hidden layers is called ____.
a) Self organizing maps
b) Perceptrons
c) Recurrent neural network
d) Multi-layered perceptron


Answer

c

Is This Answer Correct ?    14 Yes 0 No

Question { 6933 }

Which of the following is an application of NN (Neural Network)?
a) Sales forecasting
b) Data validation
c) Risk management
d) All of the mentioned


Answer

d

Is This Answer Correct ?    15 Yes 1 No
