Introduction to Neural Networks
U. of Minnesota
Mid-term Study Guide, Spring 1998

The mid-term will cover material in the Anderson book, chapters 1-9.

Sample short answer questions
Define and describe the relation of the following key words or phrases to neural networks.
(8 items drawn from the set below; 3 points each).

eigenvector, linear associator, autoassociator, synaptic modification,
Hebbian, heteroassociation, coarse coding, spontaneous activity,
leaky integrator, perceptron, projective field, BSB,
dendrite, classical conditioning, receptive field, lateral inhibition,
spike, linear independence, grandmother cell, XOR,
McCulloch-Pitts, distinctive features, cross-correlation, supervised learning,
recurrent inhibition, pseudoinverse, gradient descent, symmetric matrix,
WTA, least-squares, linear system, orthogonality,
outer product learning, Widrow-Hoff error correction, topographic representation, generic neural network neuron

Sample essay questions
(choice of 2 essays drawn from a subset of about 6 of those listed below; 12 points each).

Describe the anatomy of the generic neuron and the mechanisms involved in the generation of an action potential.

Discuss Anderson's model of either auto- or hetero-associative learning. Give one example of its application.
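
(For review, here is a minimal NumPy sketch of outer-product, i.e. Hebbian, learning for a linear heteroassociator; the patterns and names f1, g1, W are made up for illustration and are not taken from the book.)

    import numpy as np

    # Store pattern pairs (f, g) by summing outer products: W = sum_k g_k f_k^T.
    # Recall an output with W f; for an autoassociator, use g_k = f_k.
    f1 = np.array([1.0, -1.0,  1.0, -1.0]) / 2.0    # orthonormal input patterns
    f2 = np.array([1.0,  1.0, -1.0, -1.0]) / 2.0
    g1 = np.array([1.0, 0.0, 1.0])                  # associated outputs
    g2 = np.array([0.0, 1.0, 0.0])

    W = np.outer(g1, f1) + np.outer(g2, f2)         # Hebbian outer-product weights

    print(W @ f1)   # recalls g1 exactly because f1 and f2 are orthonormal
    print(W @ f2)   # recalls g2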

What is the Perceptron model? Discuss its successes, its failures, and its impact on the field of neural network research.
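
(As a refresher, the sketch below implements the classic perceptron learning rule on made-up Boolean data; it converges for linearly separable targets such as AND but never for XOR. The function and variable names are illustrative, not from the text.)

    import numpy as np

    def train_perceptron(X, targets, epochs=25, eta=1.0):
        """Perceptron rule: w <- w + eta * (target - output) * x."""
        X = np.hstack([X, np.ones((len(X), 1))])    # append a bias input
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for x, t in zip(X, targets):
                y = 1.0 if w @ x > 0 else 0.0       # hard threshold unit
                w += eta * (t - y) * x              # update only on errors
        return w

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    print(train_perceptron(X, np.array([0.0, 0.0, 0.0, 1.0])))  # AND: separable
    print(train_perceptron(X, np.array([0.0, 1.0, 1.0, 0.0])))  # XOR: not separable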

Describe a neural network model for the lateral eye of Limulus. Relate a dynamical model to possible retrieval mechanisms for autoassociation.
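
(A toy sketch of recurrent lateral inhibition in the spirit of the Hartline-Ratliff Limulus equations, with thresholds dropped and a made-up coupling constant; relaxing to the fixed point shows edge enhancement, i.e. Mach bands, across a step in light intensity.)

    import numpy as np

    # Each unit's response is its light input minus inhibition from neighbors:
    #     r_i = e_i - k * (r_{i-1} + r_{i+1}),
    # iterated until the responses settle to a fixed point.
    e = np.array([1.0] * 5 + [3.0] * 5)    # step in light intensity across the eye
    k = 0.2                                # illustrative inhibitory coupling
    r = e.copy()
    for _ in range(100):                   # relax toward the steady state
        inhibition = np.zeros_like(r)
        inhibition[1:]  += k * r[:-1]      # inhibition from the left neighbor
        inhibition[:-1] += k * r[1:]       # inhibition from the right neighbor
        r = e - inhibition
    print(np.round(r, 2))                  # dip and peak at the edge (Mach bands)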

Discuss the pros and cons of distributed vs. localized representations with examples from theoretical considerations and neurophysiology.

Contrast "connectionist" computational schemes with traditional serial computing.

Problem
(One problem, 3 to 6 points)

You should be able to compute inner and outer products, multiply matrices and vectors, calculate the transpose, know how to find eigenvectors and eigenvalues, measure the "similarity" between vectors, and find the inverse of small (e.g. 2x2) matrices.
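
As a self-check, the short NumPy sketch below runs through each of these operations on small made-up vectors and matrices; the numbers are illustrative only.

    import numpy as np

    a = np.array([1.0, 2.0])
    b = np.array([3.0, -1.0])
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    print(a @ b)                # inner product: 1*3 + 2*(-1) = 1
    print(np.outer(a, b))       # outer product: [[3, -1], [6, -2]]
    print(A @ a)                # matrix times vector: [4, 5]
    print(A.T)                  # transpose (equal to A here, since A is symmetric)

    vals, vecs = np.linalg.eig(A)   # eigenvalues 3 and 1, eigenvectors along
    print(vals, vecs)               # [1, 1] and [1, -1]

    # "Similarity" as the cosine of the angle between a and b.
    print(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Inverse of a 2x2 matrix: swap the diagonal, negate the off-diagonal,
    # divide by the determinant (det A = 3 here).
    print(np.linalg.inv(A))     # [[2, -1], [-1, 2]] / 3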