Introduction to Neural Networks
U. of Minnesota
Mid-term Study Guide, Fall 2009

The mid-term will cover material in Lectures 1-13.
You are encouraged to review the lecture notes and supplementary material,
which may provide insight into the concepts and questions below.

Sample short answer questions
Define and describe the relation of the following key words or phrases to neural networks. Provide examples where appropriate.
(8 items drawn from the set below; 3 points each).

eigenvector, linear associator, autoassociator, action potential, synaptic modification,
Hebbian rule, heteroassociation, superposition, cable equation,
leaky (or forgetful) integrate-and-fire model, EPSP/IPSP, Hodgkin-Huxley model, Mach bands, summed vector memory,
dendrite, classical conditioning, operant conditioning, reinforcement learning, lateral inhibition,
spike, linear independence, grandmother cell, generic neural network neuron, delta-rule,
McCulloch-Pitts TLU, cross-correlation, perceptron learning rule, supervised learning,
recurrent inhibition, gradient descent, compartmental model, topic vs. stress positions, symmetric matrix,
winner-take-all, least sum of squares, linear system, outer product learning, orthogonality,
Widrow-Hoff error correction, diameter-limited, linear separability, momentum, generative model

Sample essay questions
(choice of 2 essays drawn from a subset of those listed below; 12 points each).

Describe the slow-potential model and discuss it in the context of current views of the computational power of single neurons (see articles by Koch & Segev, 2000; Poirazi, Brannon & Mel, 2003).

Discuss a linear model of either auto- or hetero-associative learning. Give one example of its application.
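
As a concrete anchor for this question, here is a minimal sketch (not part of the guide itself) of outer-product heteroassociation in plain Python. The stimulus/response pairs and dimensions are made up for illustration; the point is only that a weight matrix built as a sum of outer products W = sum_k g_k f_k^T recalls each g_j exactly when the f_k are orthonormal.

```python
# Sketch of a linear heteroassociator (hypothetical example vectors).

def outer(g, f):
    """Outer product g f^T, as a list-of-rows matrix."""
    return [[gi * fj for fj in f] for gi in g]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_vec(W, x):
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]

# Two orthonormal input patterns and their associated output patterns
f1, g1 = [1.0, 0.0], [2.0, 3.0]
f2, g2 = [0.0, 1.0], [4.0, 5.0]

# Hebbian (outer-product) learning: superpose both associations in one W
W = mat_add(outer(g1, f1), outer(g2, f2))

# Recall: presenting f1 retrieves g1 even though both memories share W
print(mat_vec(W, f1))  # [2.0, 3.0]
print(mat_vec(W, f2))  # [4.0, 5.0]
```

If the inputs were merely linearly independent rather than orthonormal, recall would be contaminated by cross-talk between the stored patterns, which is one useful direction for the essay.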

Describe a neural network model for the lateral eye of Limulus. Discuss the relationship between the performance of feedforward and feedback models of lateral inhibition. (See chapter by Hartline & Ratliff, 1972.)
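
A quick numerical sketch (hypothetical parameters, not from the guide) of the feedforward case: each unit in a 1-D receptor array responds with its own excitation minus a fraction k of its two neighbors' excitation. Presented with a luminance step, this produces the Mach-band-like undershoot and overshoot at the edge that the essay should explain.

```python
# Feedforward lateral inhibition in a 1-D receptor array
# (illustrative coupling constant k and step stimulus).

def feedforward_inhibition(e, k=0.25):
    """Response r_i = e_i - k*(e_{i-1} + e_{i+1}), interior units only."""
    return [e[i] - k * (e[i - 1] + e[i + 1]) for i in range(1, len(e) - 1)]

# A step edge: dim field on the left, bright field on the right
edge = [1.0] * 4 + [5.0] * 4
r = feedforward_inhibition(edge)

# Note the dip just before the edge and the peak just after it,
# relative to the flat responses on either side (Mach bands).
print(r)  # [0.5, 0.5, -0.5, 3.5, 2.5, 2.5]
```

The feedback version, where units inhibit each other's responses rather than each other's inputs, must instead be solved as a fixed point (or iterated to steady state), which is the performance contrast the question asks about.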

Explain why XOR cannot be computed by a single TLU. Explain how it can be solved by using an augmented input to a TLU. Describe how an error back-propagation network can be configured to learn to solve the problem.
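
One way to check the linear-separability argument is by brute force. The sketch below (not part of the guide; the weight/threshold grid is an arbitrary choice) searches a grid of TLU weights and thresholds: no single TLU on the two raw inputs matches XOR, while augmenting the input with the product x1*x2 makes it separable.

```python
# Brute-force check: XOR vs. a single TLU, with and without an
# augmented input (illustrative search grid).
from itertools import product

def tlu(weights, theta, x):
    """Threshold logic unit: output 1 iff the weighted sum reaches theta."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) >= theta else 0

def solvable(patterns):
    """Search a grid of weights/thresholds for a TLU matching patterns."""
    grid = [g / 2 for g in range(-8, 9)]   # -4.0 .. 4.0 in steps of 0.5
    n = len(next(iter(patterns)))
    return any(all(tlu(ws, theta, x) == y for x, y in patterns.items())
               for ws in product(grid, repeat=n) for theta in grid)

xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
# Augment each input with the product x1*x2 (an extra nonlinear feature)
xor_aug = {x + (x[0] * x[1],): y for x, y in xor.items()}

print(solvable(xor))      # False: XOR is not linearly separable
print(solvable(xor_aug))  # True: e.g. w = (1, 1, -2), theta = 0.5
```

A back-propagation network solves the same problem by learning its own augmented representation in a hidden layer instead of having the feature hand-wired.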

Problem
(One problem, 3 to 6 points)

You should be able to compute inner and outer products, multiply matrices and vectors, take the transpose, find eigenvectors and eigenvalues, measure the "similarity" between vectors, and invert small (e.g. 2x2) matrices.
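
For self-checking hand computations, here is a plain-Python sketch (not part of the guide) of those operations for 2x2 matrices and length-2 vectors. The eigenvalue routine solves the characteristic polynomial lam^2 - tr(A)*lam + det(A) = 0 directly, assuming real eigenvalues (always true for symmetric A); "similarity" is taken here to mean the cosine of the angle between vectors.

```python
# Worked 2x2 linear algebra for checking hand calculations.
import math

def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def outer(u, v):
    return [[a * b for b in v] for a in u]

def mat_vec(A, x):
    return [inner(row, x) for row in A]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

def cosine_similarity(u, v):
    """Cosine of the angle between u and v: one measure of "similarity"."""
    return inner(u, v) / (math.sqrt(inner(u, u)) * math.sqrt(inner(v, v)))

def inverse_2x2(A):
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[d / det, -b / det], [-c / det, a / det]]

def eigen_2x2(A):
    """Eigenvalues/eigenvectors of a 2x2 matrix via the characteristic
    polynomial; assumes real eigenvalues (true for symmetric A)."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    lams = [(tr + disc) / 2, (tr - disc) / 2]
    vecs = []
    for lam in lams:
        if b != 0:
            vecs.append([b, lam - a])
        elif c != 0:
            vecs.append([lam - d, c])
        else:                         # diagonal matrix: standard basis
            vecs.append([1.0, 0.0] if lam == a else [0.0, 1.0])
    return lams, vecs

A = [[2.0, 1.0], [1.0, 2.0]]          # symmetric; eigenvalues 3 and 1
lams, vecs = eigen_2x2(A)
print(lams)                            # [3.0, 1.0]
print(vecs)                            # [[1.0, 1.0], [1.0, -1.0]]
print(inverse_2x2([[1.0, 2.0], [3.0, 4.0]]))
print(cosine_similarity([1.0, 0.0], [1.0, 1.0]))  # ~ 0.707
```

Each returned eigenvector can be verified by checking that A v equals lam * v, which is exactly the check expected on paper.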