Introduction to Neural Networks
U. of Minnesota, Final Study Guide
Psy 5038
Spring, 1998
The final will cover material in the Anderson book, chapters 9, 10,
12, 13 (very little), 14, 15.
There may be some material from the first half of the quarter, but the final
exam will be heavily weighted toward the second half.
Sample short answer questions
Define and describe the relation of the following key words or phrases
to neural networks.
(8 items drawn from the set below; 3 points each).
"Energy" | attractor | pseudoinverse | BSB
autoassociator | topographic representation | grandmother cell | asynchronous update
content-addressable memory | Oja's rule | principal components analysis | sparse distributed representation
constraint satisfaction | nearest-neighbor classifier | diameter-limited | correspondence problem
gradient descent | Lyapunov function | encoder network | topology-preserving map (Kohonen)
simulated annealing | cortical maps | generalized delta rule | Widrow-Hoff error correction
XOR | Hopfield's continuous response model | Gibbs G measure | anti-Hebbian
perceptron learning rule | random dot stereogram | prototype/exemplar | local minimum
Sample essay questions
(choice of 2 essays drawn from a subset of those listed below; 12 points each).
Give an account of Hopfield's 1984 graded response neural network model.
How does it relate to the discrete stochastic model of 1982?
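As a study aid, the discrete 1982 model can be sketched in a few lines. This is a minimal illustration, not from the course notes: +/-1 units, Hebbian outer-product weights, asynchronous updates, and the "energy" (Lyapunov) function that never increases along the way.

```python
import numpy as np

def train_hopfield(patterns):
    n = patterns.shape[1]
    W = patterns.T @ patterns / n         # Hebbian outer products
    np.fill_diagonal(W, 0.0)              # no self-connections
    return W

def energy(W, s):
    return -0.5 * s @ W @ s               # Lyapunov function of the dynamics

def recall(W, s, sweeps=5):
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):           # asynchronous: one unit at a time
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Store one pattern, then recall it from a corrupted cue.
p = np.array([1., -1., 1., -1., 1., -1.])
W = train_hopfield(p[None, :])
cue = p.copy(); cue[0] = -1.0             # corrupt one bit
out = recall(W, cue)
print(out)                                # settles back into the stored attractor
```

The energy of the recalled state is lower than that of the corrupted cue, which is the content-addressable-memory behavior the essay question asks about.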
Describe Marr and Poggio's 1976 stereo algorithm, and compare it to Hopfield's 1982 neural network.
How does the error back-propagation model work (you don't need to derive the learning rule)? What are the pros and cons of this learning algorithm?
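A compact sketch of the generalized delta rule on XOR (the problem a single-layer perceptron cannot solve) may help in preparing this question. The architecture, learning rate, and iteration count here are illustrative choices, not from the course notes.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])    # XOR targets

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # hidden -> output
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward(X):
    H = sig(X @ W1 + b1)                  # hidden activations
    return H, sig(H @ W2 + b2)            # output activations

lr = 0.5
_, Y = forward(X)
loss0 = np.mean((Y - T) ** 2)             # error before training
for _ in range(5000):
    H, Y = forward(X)
    dY = (Y - T) * Y * (1 - Y)            # delta at the output layer
    dH = (dY @ W2.T) * H * (1 - H)        # delta propagated back to hidden
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)
_, Y = forward(X)
print("MSE before:", loss0, "after:", np.mean((Y - T) ** 2))
```

Note how the hidden-layer delta is just the output delta passed back through the weights and scaled by the sigmoid derivative; that chain-rule step is the "back-propagation" of the error.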
Describe the Boltzmann machine learning algorithm (you need not derive the learning rule). What are the pros and cons of this learning algorithm?
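The stochastic unit at the core of the Boltzmann machine can be sketched as below; this is illustrative only and does not show the full learning rule with its clamped and free phases. A unit turns on with Gibbs probability 1 / (1 + exp(-dE/T)), and simulated annealing lowers the temperature T so the network settles near low-energy states.

```python
import numpy as np

def gibbs_sweep(W, s, T, rng):
    for i in range(len(s)):
        dE = W[i] @ s                     # energy gap for turning unit i on
        p_on = 1.0 / (1.0 + np.exp(-dE / T))
        s[i] = 1.0 if rng.random() < p_on else 0.0
    return s

rng = np.random.default_rng(0)
W = np.array([[0., 2.], [2., 0.]])        # two mutually excitatory units
s = np.array([1., 0.])
for T in [4.0, 2.0, 1.0, 0.5, 0.25]:      # annealing schedule: cool gradually
    for _ in range(20):
        s = gibbs_sweep(W, s, T, rng)
print(s)                                  # at low T, the both-on state dominates
```

At high T the updates are nearly random (which lets the network escape local minima); at low T they approach the deterministic Hopfield rule.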
Give an account of just one of the following approaches to self-organization: Kohonen, 1982 (Chapter 14); or principal components sub-spaces.
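For the principal-components route, Oja's rule is the standard illustration: Hebbian growth plus a normalizing decay term, so the weight vector converges to the first principal component of the input distribution. The data and learning rate below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data whose leading eigenvector lies along [1, 1]/sqrt(2).
C = np.array([[2.0, 1.5], [1.5, 2.0]])
X = rng.multivariate_normal([0, 0], C, size=2000)

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x                             # unit's linear output
    w += eta * y * (x - y * w)            # Oja: Hebb term minus y^2 decay

# The learned weight vector should align with the first principal component.
print(w / np.linalg.norm(w))
```

The decay term -y**2 * w is what distinguishes Oja's rule from plain Hebbian learning: it keeps the weight norm bounded near 1 without explicit renormalization.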
Describe Anderson's Brain-state-in-a-box model using one application.
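The BSB dynamics can be sketched in a few lines (an illustrative toy, not Anderson's published simulations): positive feedback through a Hebbian weight matrix, with activities clipped to the [-1, 1] "box" so the state is driven into a corner, which acts as an attractor.

```python
import numpy as np

def bsb(W, x, alpha=0.2, steps=50):
    for _ in range(steps):
        # Feedback amplifies the state; clipping keeps it inside the box.
        x = np.clip(x + alpha * (W @ x), -1.0, 1.0)
    return x

# Store one prototype with a Hebbian outer product, start from a weak cue.
p = np.array([1., 1., -1., -1.])
W = np.outer(p, p) / len(p)
x0 = np.array([0.3, 0.1, -0.2, -0.4])     # weak, noisy version of p
print(bsb(W, x0))                         # settles into the corner [1, 1, -1, -1]
```

The run shows the categorization behavior a BSB essay should emphasize: a graded, ambiguous input is amplified along the stored direction until it saturates at a discrete corner of the hypercube.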