Sample exam questions for CSC785 - Introduction to Neural Networks
The exam will include:
An essay-type question (e.g. question 4 below). This question will require
a detailed theoretical answer. Please treat the subject as carefully as
possible, in a comprehensive manner (15 min).
Three problems. The problems will be very similar to the problems assigned
as homework (15 min each).
A set of brief questions, such as multiple-choice questions (10 min
for the whole set).
The problems below might be similar to the problems that will be given
in the exam. Please use them together with the exercises in the book to
prepare for the exam.
Describe 4 learning rules of your choice. For each learning rule, address:
the type of learning (e.g. supervised, unsupervised, etc.)
the weight-changing rule
what the rule achieves
advantages and disadvantages
Discuss the main categories of neural network applications. Describe each
type and discuss various issues as appropriate. Discuss advantages and
disadvantages of each type of application.
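As a concrete illustration of a weight-changing rule from the learning-rules question above, here is a minimal sketch of the delta (Widrow-Hoff) rule for a single linear unit. The function name, learning rate, and toy data are illustrative choices, not part of the course material.

```python
import numpy as np

def delta_rule_step(w, x, target, lr=0.1):
    """One supervised delta-rule (Widrow-Hoff) update on a linear unit.

    w: weight vector, x: input pattern, target: desired scalar output.
    The weights move along the input direction, proportionally to the
    output error: Delta w = lr * (target - y) * x.
    """
    y = np.dot(w, x)           # linear unit output
    error = target - y         # output error for this pattern
    return w + lr * error * x  # weight-changing rule

# Toy usage: repeatedly apply the rule so that w . x approaches the target.
w = np.zeros(2)
x = np.array([1.0, 2.0])
for _ in range(50):
    w = delta_rule_step(w, x, target=1.0)
```

After repeated updates the output error shrinks geometrically (here by a factor of 1 - lr * ||x||^2 per step), which is the sense in which the rule "achieves" error minimization for a linear unit.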
Describe the X neural network paradigm. Address the following:
type of learning
training and generalization issues
advantages and disadvantages
Discuss training and generalization issues in neural network learning.
Address issues like the number of weights, the number of patterns, etc.,
and relate them to the probability of implementing a function that offers
a good generalization.
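As an example of one paradigm that X might stand for, below is a minimal sketch of perceptron learning (supervised, error-driven: weights change only on misclassified patterns). The toy dataset and parameters are illustrative assumptions, not taken from the course notes.

```python
import numpy as np

def train_perceptron(X, t, lr=1.0, epochs=20):
    """Classic perceptron learning rule.

    X: input patterns (one per row, first column acts as a bias input),
    t: targets in {-1, +1}. On a misclassified pattern the update is
    w <- w + lr * t_i * x_i; correctly classified patterns leave w alone.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, t_i in zip(X, t):
            if t_i * np.dot(w, x_i) <= 0:  # misclassified (or on boundary)
                w = w + lr * t_i * x_i
    return w

# Linearly separable toy problem (AND function, with a constant bias input).
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
t = np.array([-1, -1, -1, 1], dtype=float)
w = train_perceptron(X, t)
```

For linearly separable data the perceptron convergence theorem guarantees the loop above stops changing the weights after finitely many updates; for non-separable data it never settles, which is one of the paradigm's main limitations.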
Compare X and Y neural network paradigms.
Compare A and B learning rules.
What are the main advantages and disadvantages of neural networks with
respect to other machine learning techniques (e.g. expert systems, formal
logic deduction, automated theorem-proving systems, etc.)?
What are the main disadvantages and potential problems of standard
backpropagation?
Describe enhancement W of standard backpropagation.
Briefly describe the main enhancements of standard backpropagation.
Compare enhancements W and Z of standard backpropagation.
Describe how constraint-based decomposition (CBD) will solve the problem
above. Write the expression of the solution. Draw several pictures
explaining how the training proceeds.
X and Y can be (without limitation): standard backpropagation,
Widrow-Hoff, perceptron, supervised Hebbian, constraint-based
decomposition, etc.
A and B can be (without limitation): Hebbian learning, the delta rule,
the Widrow-Hoff rule, etc.
W and Z can be (without limitation): momentum, Newton's method, conjugate
gradient, variable learning rate, simulated annealing, etc.