Sample short answer questions
Define the following key words or phrases and describe their relation to
neural networks. Provide examples where appropriate; a few illustrative
code sketches follow the list.
(8 items drawn from the set below; 3 points each).
eigenvector | linear associator | autoassociator | synaptic modification |
Hebbian rule | heteroassociation | superposition |
leaky (or forgetful) integrate and fire model | EPSP/IPSP | Hodgkin-Huxley model | summed vector memory |
dendrite | classical conditioning | operant conditioning | lateral inhibition |
spike | linear independence | grandmother cell | perceptron |
McCulloch-Pitts | distinctive features | cross-correlation | supervised learning |
recurrent inhibition | pseudoinverse | compartmental model | symmetric matrix |
WTA | least mean squares | linear system | orthogonality |
Widrow-Hoff error correction | diameter-limited | linear discriminant | generative model |
outer product learning | perceptron learning rule | topic vs. stress positions | generic neural network neuron |
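Several of the terms above (generic neural network neuron, McCulloch-Pitts, perceptron) share one underlying computation: a weighted sum of inputs passed through a threshold. A minimal sketch in Python/NumPy; the weights and threshold are made-up values chosen here so the unit computes logical AND:

    import numpy as np

    def tlu(x, w, theta):
        """Generic neuron (McCulloch-Pitts style): fire (output 1)
        iff the weighted sum of inputs reaches the threshold theta."""
        return int(np.dot(w, x) >= theta)

    # Weights and threshold hand-picked so the unit computes AND.
    w, theta = np.array([1.0, 1.0]), 1.5
    for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(x, "->", tlu(np.array(x), w, theta))   # 0, 0, 0, 1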
Sample essay questions
(choice of 2 essays drawn from a subset of those listed below; 12 points
each).
Describe the slow-potential model and discuss it in the context of current
views of the computational power of single neurons (see articles by Koch &
Segev, 2000; Poirazi, Brannon & Mel, 2003).
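(The slow-potential model itself is covered in the readings; as a related numerical illustration, the leaky (forgetful) integrate-and-fire model from the term list can be sketched in a few lines. All parameter values below are arbitrary choices for illustration, not from the articles.)

    import numpy as np

    def lif(I, dt=1e-4, tau=0.02, R=1.0, v_thresh=1.0, v_reset=0.0):
        """Leaky integrate-and-fire: tau dV/dt = -V + R*I,
        with a spike and reset whenever V crosses threshold."""
        v, spike_times = 0.0, []
        for t, i_t in enumerate(I):
            v += (dt / tau) * (-v + R * i_t)   # Euler step of the leaky integrator
            if v >= v_thresh:                  # threshold crossing -> spike
                spike_times.append(t * dt)
                v = v_reset                    # "forgetful" reset
        return spike_times

    # A constant suprathreshold current yields a regular spike train.
    I = np.full(5000, 1.5)                     # 0.5 s of input at dt = 0.1 ms
    print(len(lif(I)), "spikes in 0.5 s")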
Discuss a linear model of either auto- or hetero-associative learning. Give one example of its application.
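One possible worked sketch of the heteroassociative case: store pairs (f, g) as a sum of outer products W = g1 f1^T + g2 f2^T; if the f vectors are orthonormal, multiplying W by f_i recalls g_i exactly. The patterns below are invented for illustration:

    import numpy as np

    # Orthonormal input patterns f and associated outputs g (made up).
    f1 = np.array([1.0, 1.0, 1.0, 1.0]) / 2.0
    f2 = np.array([1.0, -1.0, 1.0, -1.0]) / 2.0
    g1 = np.array([1.0, -1.0, 1.0])
    g2 = np.array([-1.0, 1.0, 1.0])

    # Outer-product (Hebbian) learning rule.
    W = np.outer(g1, f1) + np.outer(g2, f2)

    # Recall: since f1 . f2 = 0 and |f1| = |f2| = 1, W @ f1 gives back g1.
    print(W @ f1)   # [ 1. -1.  1.]
    print(W @ f2)   # [-1.  1.  1.]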
Describe a neural network model for the lateral eye of Limulus. Discuss the relationship between the performance of feedforward and feedback models of lateral inhibition. (See chapter by Hartline & Ratliff, 1972.)
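A possible numerical sketch of the two schemes (the inhibitory kernel and input pattern below are invented): in the feedback form the steady state satisfies r = e - K r, i.e. r = (I + K)^{-1} e, while the feedforward form subtracts inhibition computed from the raw inputs directly.

    import numpy as np

    n = 10
    e = np.ones(n)
    e[n // 2:] = 2.0                           # a step edge in the excitation

    # Each unit inhibits its nearest neighbors with strength k (made-up kernel).
    k = 0.3
    K = k * (np.eye(n, k=1) + np.eye(n, k=-1))

    r_ff = e - K @ e                           # feedforward: inhibition from inputs
    r_fb = np.linalg.solve(np.eye(n) + K, e)   # feedback: solve r = e - K r

    print(np.round(r_ff, 2))   # both exaggerate contrast (Mach bands) at the edge
    print(np.round(r_fb, 2))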
Explain why XOR cannot be computed by a simple threshold logic unit (TLU). Explain how it can be solved by using an augmented input to a TLU. Describe how an error back-propagation network can be configured to learn to solve the problem.
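A quick check of the augmented-input solution: XOR's four points are not linearly separable in the plane, but adding the product x1*x2 as a third input makes a single TLU suffice. The weights below are one hand-picked solution, not the only one:

    import numpy as np

    def tlu(x, w, theta):
        return int(np.dot(w, x) >= theta)

    # Augment each input pair with the product term x1*x2.
    w, theta = np.array([1.0, 1.0, -2.0]), 0.5
    for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        x = np.array([x1, x2, x1 * x2])
        print((x1, x2), "->", tlu(x, w, theta))   # 0, 1, 1, 0 = XOR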
Problem
(One problem, 3 to 6 points)
You should be able to compute inner and outer products, multiply matrices and vectors, calculate the transpose, find eigenvectors and eigenvalues, measure the "similarity" between vectors (e.g., the cosine of the angle between them), and find the inverse of small (e.g., 2x2) matrices.
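Each of these operations can be checked against NumPy; a small worked example with made-up numbers:

    import numpy as np

    u = np.array([1.0, 2.0])
    v = np.array([3.0, 4.0])

    print(np.dot(u, v))      # inner product: 1*3 + 2*4 = 11
    print(np.outer(u, v))    # outer product: [[3, 4], [6, 8]]

    A = np.array([[2.0, 0.0],
                  [1.0, 3.0]])
    print(A @ v)             # matrix-vector product: [6, 15]
    print(A.T)               # transpose

    # "Similarity": cosine of the angle between u and v.
    print(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    vals, vecs = np.linalg.eig(A)   # eigenvalues 2 and 3; eigenvectors in columns
    print(vals)

    # 2x2 inverse: swap the diagonal, negate the off-diagonal, divide by det.
    print(np.linalg.inv(A))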