Introduction to Neural Networks
U. of Minnesota, Final Study Guide
Psy 5038

Fall, 2009

The final exam will primarily cover material from the second half of the semester.

Sample short answer questions
Define and describe the relation of the following key words or phrases to neural networks. Provide examples where appropriate. (Answer 8 out of 12 items drawn from the set below; 3 points each.)

"Energy" attractor EM bias/variance dilemma
autoassociator topographic representation grandmother cell asynchronous update
Content addressable memory Oja's rule principal components analysis sparse distributed representation
constraint satisfaction nearest-neighbor classifier "explaining away" correspondence problem
gradient descent Lyapanov function encoder network topology-preserving map (Kohonen)
simulated annealing cortical maps loss and risk functions Bayes net & probability factorization
MAP estimation Hopfield's continuous response model

Gibbs G measure

(Kullback-Leibler distance)

anti-Hebbian
spontaneous activity projective field receptive field coarse coding
marginalization & conditioning radial basis function prototype/exemplar  local minimum


Sample essay questions
(choice of 2 essays drawn from a subset of those listed below; 12 points each).

Discuss the pros and cons of distributed vs. localized representations with examples from theoretical considerations and neurophysiology (e.g. Sanger, 2003; Quiroga et al., 2005).

Describe Hopfield's 1984 graded response neural network model. How can it be used for memory? How does it relate to the discrete stochastic model of 1982?
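For study purposes, here is a minimal numpy sketch of the discrete 1982-style model that the question asks you to relate to the graded response version: Hebbian outer-product storage plus asynchronous threshold updates of bipolar (+1/-1) units. The pattern sizes and noise level are illustrative; the 1984 model would replace the hard threshold with a smooth sigmoid and continuous-time dynamics.

    import numpy as np

    def store(patterns):
        # Hebbian (outer-product) storage of bipolar (+1/-1) patterns; zero diagonal.
        n = patterns.shape[1]
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, probe, n_sweeps=20, seed=0):
        # Asynchronous updates: each sweep visits the units in a random order.
        rng = np.random.default_rng(seed)
        s = probe.copy()
        for _ in range(n_sweeps):
            for i in rng.permutation(len(s)):
                s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    # Toy usage: store two random patterns, then recall the first from a noisy probe.
    rng = np.random.default_rng(1)
    patterns = rng.choice([-1, 1], size=(2, 64))
    W = store(patterns)
    noisy = patterns[0] * rng.choice([1, 1, 1, -1], size=64)  # flips roughly 25% of the bits
    print(np.mean(recall(W, noisy) == patterns[0]))           # fraction of bits recovered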

Describe the Boltzmann machine algorithm, both for recall using simulated annealing and for learning (you need not derive the learning rule). What are the pros and cons of this learning algorithm?
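As a study aid, a minimal sketch of the recall half only: stochastic updates of bipolar units under a cooling temperature schedule. The +1/-1 unit convention, the update probability 1/(1 + exp(-2·net/T)), and the particular schedule are standard but arbitrary choices here, not specifics from the course.

    import numpy as np

    def anneal_recall(W, s0, temps=(4.0, 2.0, 1.0, 0.5, 0.1), sweeps_per_temp=10, seed=0):
        # Boltzmann (stochastic) updates of bipolar units with a cooling schedule:
        # unit i turns on (+1) with probability 1 / (1 + exp(-2 * net_i / T)).
        rng = np.random.default_rng(seed)
        s = s0.astype(float).copy()
        for T in temps:
            for _ in range(sweeps_per_temp):
                for i in rng.permutation(len(s)):
                    net = W[i] @ s
                    p_on = 1.0 / (1.0 + np.exp(-2.0 * net / T))
                    s[i] = 1.0 if rng.random() < p_on else -1.0
        return s

    # Toy usage: anneal from a random state with a small random symmetric weight matrix.
    rng = np.random.default_rng(1)
    A = rng.normal(size=(16, 16))
    W = (A + A.T) / 2.0
    np.fill_diagonal(W, 0.0)
    print(anneal_recall(W, rng.choice([-1.0, 1.0], size=16)))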

Give an account of just one of the following approaches to self-organization: (a) the Kohonen self-organizing map (Kohonen, 1982); or (b) principal components analysis for subspace extraction.
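A minimal sketch of option (b), using Oja's single-unit rule as the principal-component learner (a Kohonen map would instead move a winning unit and its neighbors toward each input). The data, learning rate, and epoch count are illustrative.

    import numpy as np

    def oja_first_pc(X, lr=0.01, n_epochs=50, seed=0):
        # Oja's rule: w += lr * y * (x - y * w), with y = w . x.
        # For zero-mean data, w converges toward the leading eigenvector of the
        # covariance matrix, with approximately unit norm.
        rng = np.random.default_rng(seed)
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        for _ in range(n_epochs):
            for x in X[rng.permutation(len(X))]:
                y = w @ x
                w += lr * y * (x - y * w)
        return w

    # Toy usage: correlated 2-D data; compare with the top eigenvector from numpy.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 2)) @ np.array([[2.0, 0.0], [1.0, 0.5]])
    X -= X.mean(axis=0)
    w = oja_first_pc(X)
    evals, evecs = np.linalg.eigh(np.cov(X.T))
    print(abs(w @ evecs[:, -1]))  # close to 1 when w is aligned with the top eigenvector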

Explain how the Kalman filter can be used to model neural processing or behavior (Rao & Ballard 1999 or Wolpert et al. 1995).
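For review, a minimal scalar Kalman filter: predict, then blend the prediction and the new observation by the Kalman gain. The random-walk state model and the noise variances below are illustrative assumptions, not taken from either paper.

    import numpy as np

    def kalman_1d(zs, a=1.0, q=0.01, r=0.25, x0=0.0, p0=1.0):
        # State model: x_t = a * x_{t-1} + process noise (variance q).
        # Observation model: z_t = x_t + measurement noise (variance r).
        # Returns the sequence of posterior means of the state.
        x, p = x0, p0
        means = []
        for z in zs:
            # Predict
            x_pred = a * x
            p_pred = a * a * p + q
            # Update: the Kalman gain weights the observation against the prediction
            k = p_pred / (p_pred + r)
            x = x_pred + k * (z - x_pred)
            p = (1.0 - k) * p_pred
            means.append(x)
        return np.array(means)

    # Toy usage: noisy observations of a slowly drifting state.
    rng = np.random.default_rng(0)
    true_x = np.cumsum(rng.normal(scale=0.1, size=100))
    zs = true_x + rng.normal(scale=0.5, size=100)
    est = kalman_1d(zs)
    print(np.mean((est - true_x) ** 2), np.mean((zs - true_x) ** 2))  # filter beats raw observations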

Discuss how probabilistic information might be represented in a population of neurons, and how that information could be used for optimal inference (Pouget et al., 2006; Knill & Pouget, 2004).
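One common formalization in this literature treats spike counts as Poisson with rates given by tuning curves; as a study aid, here is a minimal sketch of reading out a posterior over the stimulus under that assumption and a flat prior. The Gaussian tuning curves, gain, and stimulus grid are all illustrative.

    import numpy as np

    # Gaussian tuning curves for a bank of neurons with evenly spaced preferred stimuli
    prefs = np.linspace(-10, 10, 21)

    def rates(s, gain=20.0, width=2.0):
        return gain * np.exp(-0.5 * ((s - prefs) / width) ** 2)

    # Simulate one trial of Poisson spike counts for a true stimulus
    rng = np.random.default_rng(0)
    s_true = 1.5
    counts = rng.poisson(rates(s_true))

    # Log posterior on a stimulus grid (flat prior), dropping terms constant in s:
    # log p(s | counts) = sum_i [ counts_i * log f_i(s) - f_i(s) ] + const
    s_grid = np.linspace(-10, 10, 401)
    log_post = np.array([np.sum(counts * np.log(rates(s)) - rates(s)) for s in s_grid])
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    print(s_grid[np.argmax(post)])  # MAP estimate, close to s_true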

What is a mixture model? How can EM be used to estimate the parameters of the model?
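As a study aid, a minimal sketch of EM for a two-component 1-D Gaussian mixture: the E-step computes responsibilities (posterior component probabilities for each point), and the M-step re-estimates mixing weights, means, and variances from them. The initialization and toy data are illustrative choices.

    import numpy as np

    def em_gmm_1d(x, n_iter=100):
        # Crude initialization (an arbitrary choice; any reasonable start works).
        pi = np.array([0.5, 0.5])
        mu = np.array([x.min(), x.max()])
        var = np.array([x.var(), x.var()])
        for _ in range(n_iter):
            # E-step: responsibility of each component for each data point.
            lik = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
            resp = lik / lik.sum(axis=1, keepdims=True)
            # M-step: responsibility-weighted re-estimates of the parameters.
            nk = resp.sum(axis=0)
            pi = nk / len(x)
            mu = (resp * x[:, None]).sum(axis=0) / nk
            var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        return pi, mu, var

    # Toy usage: a sample drawn from two Gaussians.
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
    print(em_gmm_1d(x))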