Course home pages: courses.kersten.org
Instructor: Daniel Kersten, kersten@umn.edu
Office: S212 Elliott Hall, Phone: 625-2589
Office hours: Mondays 11:00 to 12:00 and by appointment.
Teaching assistant: Cheng Qiu, qiuxx077@umn.edu
Office hours: Tuesdays 11:00 to 12:00
Course description. Introduction to large-scale parallel distributed processing models in neural and cognitive science. Topics include: linear models, statistical pattern theory, Hebbian rules, self-organization, non-linear models, information optimization, and representation of neural information. Applications to sensory processing, perception, learning, and memory.
Prerequisites: Linear algebra, multivariate calculus.
Lecture notes (see below)
Mathematica is the primary programming environment for this course. If you wish to purchase Mathematica for Students, see http://www.wolfram.com/products/student/mathforstudents/index.html.
Accessing Mathematica on the CLA servers:
For help using Mathematica, see: http://mathematica.stackexchange.com
Learning center: http://www.wolfram.com/learningcenter/
Gopen, G. D., & Swan, J. A. (1990). The Science of Scientific Writing. American Scientist, 78, 550-558. (pdf)
THE CENTER FOR WRITING offers free one-to-one writing assistance to undergraduate and graduate students, with appointments up to 45 minutes. Nonnative speaker specialists are available. For more information, see http://writing.umn.edu
*Anderson, J. A. (1995). An Introduction to Neural Networks. Cambridge, MA: MIT Press.
***Bishop, C. M. (2006). Pattern Recognition and Machine Learning. New York: Springer.
*Dayan, P., & Abbott, L. F. (2001). Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Cambridge, MA: MIT Press.
Freeman, J. A. (1994). Simulating Neural Networks with Mathematica. Reading, MA: Addison-Wesley Publishing Company. http://library.wolfram.com/infocenter/Books/3485/
**Gershenfeld, N. A. (1999). The Nature of Mathematical Modeling. Cambridge; New York: Cambridge University Press.
**Hertz, J., Krogh, A., & Palmer, R. G. (1991). Introduction to the Theory of Neural Computation (Santa Fe Institute Studies in the Sciences of Complexity). Reading, MA: Addison-Wesley Publishing Company.
Koch, C., & Segev, I. (Eds.). (1998). Methods in Neuronal Modeling: From Ions to Networks (2nd ed.). Cambridge, MA: MIT Press.
***MacKay, D. J. C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge, UK; New York: Cambridge University Press. http://www.inference.phy.cam.ac.uk/mackay/itila/book.html
***Murphy, K. P. (2012). Machine Learning: A Probabilistic Perspective. Cambridge, MA: MIT Press.
*Neural/Cognitive Science
**Physics/Applied Math
***Statistical/machine learning
There will be a mid-term examination, a final examination, programming assignments, and a final project. The grade weights are listed in the schedule below.
Fall 2014 academic calendar: http://onestop.umn.edu/calendars/#fall2014
(NOTE: Links to revised lecture material below will be posted on the day of the lecture. Links to the pdfs for additional readings may require a password. If you want to preview similar material, check out the lectures from 2009.)
Lecture notes are in Mathematica Notebook and pdf format. You can download the Mathematica notebook files below to view with Mathematica or Wolfram CDF Player (which is free).
# | Date | Lecture | Additional Readings & supplementary material | Assignments
1 | Sep 3 | Introduction | Mathematica screencast |
2 | Sep 8 | The neuron (pdf, Mathematica notebook) | Hodgkin-Huxley.nb |
3 | Sep 10 | Neural models, McCulloch-Pitts (pdf, Mathematica notebook) | Koch, C., & Segev, I. (Eds.). (1998) (pdf) |
4 | Sep 15 | Generic neuron model (pdf, Mathematica notebook) | |
5 | Sep 17 | Lateral inhibition (pdf, Mathematica notebook) | Hartline (1972) (pdf) |
6 | Sep 22 | Matrices (pdf, Mathematica notebook) | | Homework 1 (Mathematica notebook)
7 | Sep 24 | Linear systems, learning & memory (pdf) | |
8 | Sep 29 | Linear association and memory simulations (pdf, Mathematica notebook) | |
9 | Oct 1 | Non-linear networks, discriminative models, Perceptron, SVMs (pdf, Mathematica notebook) | |
10 | Oct 6 | Supervised learning as regression, Widrow-Hoff, backprop (pdf, Mathematica notebook) | XOR backpropagation demo (Mathematica notebook); Poirazi, Brannon & Mel (2003) (pdf); Williams (1992) (pdf) |
11 | Oct 8 | Hopfield networks (pdf, Mathematica notebook) | Hopfield (1982) (pdf); IPython demo of Hopfield; stereo correspondence Mathematica demo |
12 | Oct 13 | Boltzmann machine (pdf, Mathematica notebook) | Sculpting the energy function, interpolation (Mathematica notebook); Berkes, P., Orban, G., Lengyel, M., & Fiser, J. (2011). Spontaneous cortical activity reveals hallmarks of an optimal internal model of the environment. Science, 331(6013), 83 (pdf) | Homework 2: lateral inhibition, Widrow-Hoff, backprop
13 | Oct 15 | Probability and neural networks | ProbabilityOverviewNN.nb; Kersten, D., & Yuille, A. L. (2013). Vision: Bayesian Inference and Beyond. In J. S. Werner & L. M. Chalupa (Eds.), The New Visual Neurosciences. Cambridge, MA: MIT Press. (draft pdf) |
14 | Oct 20 | Probability continued | Bishop, C. M., Pattern Recognition and Machine Learning, Chapter 8: Graphical Models (pdf) |
15 | Oct 22 | Regression, interpolation, perceptual completion, bias/variance | Weiss, Y. (pdf) | PROJECT IDEAS (pdf); for demonstration-style projects, see the Wolfram Demonstrations site and a specific example
 | Oct 27 | MID-TERM TEST | | MID-TERM (16%)
16 | Oct 29 | Belief propagation: regression and interpolation revisited (pdf) | Geisler, W. S., & Kersten, D. (2002). Illusions, perception and Bayes. Nat Neurosci, 5(6), 508-510 (pdf) |
17 | Nov 3 | Utility & probability: Bayes decision theory (pdf); neural networks in the context of machine learning | How To Do Research, William T. Freeman (2013) (link) |
18 | Nov 5 | Supervised learning: neural networks in the context of machine learning, cont'd | Anaconda Python installation recommended. We will use IPython, a browser-based notebook interface for Python. See here for illustrations of IPython cell types, and here for a collection of sample notebooks. Look here for some good tips on installation, as well as the parent directory for excellent IPython-based course material on scientific computing using Monte Carlo methods. For a quick start to scientific programming, see http://nbviewer.ipython.org/gist/rpmuller/5920182; for comprehensive coverage of scientific Python, see https://scipy-lectures.github.io; for a ground-up set of Python tutorials, see http://learnpythonthehardway.org/book/. Switching from MATLAB to Python? http://wiki.scipy.org/NumPy_for_Matlab_Users |
19 | Nov 10 | Overview of Python/IPython for scientific computation/neural networks, and Bayesian computations (IPython notebook) | Practice: 1) TwoNeuroHopfield.py (a minimal illustrative sketch appears after the schedule); 2) convolution, which needs Zebra_running_Ngorongoro.jpg |
20 | Nov 12 | More sampling | http://pymc-devs.github.io/pymc/; recommended PyMC tutorials: http://camdavidsonpilon.github.io/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/ | Homework 3
21 | Nov 17 | PyMC (IPython notebook) | Carandini, Heeger, & Movshon (1996) (pdf); supplement (pdf, Mathematica notebook); Kersten, D., & Yuille, A. L. (2014). Inferential Models of the Visual Cortical Hierarchy. The New Cognitive Neurosciences, 5th ed. (draft pdf) | Final project title & paragraph outline due (2%)
22 | Nov 19 | Architectures: overview of visual hierarchy: Lect_22a_VisualArchitecture (Keynote pdf); Lect_22b_AdaptMaps (pdf, Mathematica notebook) | Simoncelli, E. P., & Olshausen, B. A. (2001). Natural image statistics and neural representation. Annu Rev Neurosci, 24, 1193-1216 (pdf); Oja's rule and PCA: Sanger (2003) (pdf) |
23 | Nov 24 | Neural networks for self-organization, efficient coding, principal components (keynote presentation pdf) | Knill & Pouget (2004) (pdf); Quiroga, R. Q., Reddy, L., Kreiman, G., Koch, C., & Fried, I. (2005) (pdf) |
24 | Nov 26 | Probabilistic neural representations and the neural integration of information (keynote presentation pdf) | Ma, W. J. (2012). Organizing probabilistic models of perception. Trends in Cognitive Sciences, 16(10), 511-518 (pdf) |
25 | Dec 1 | Scientific writing and presentations (pdf) | Gopen & Swan (1990) (pdf) | (no Homework 4)
26 | Dec 3 | Clustering, EM, segmentation | Expectation Maximization: Weiss, Y. (pdf); Kirchner, H., & Thorpe, S. J. (2006). Ultra-rapid object detection with saccadic eye movements: visual processing speed revisited. Vision Research, 46(11), 1762-1776 (pdf); Ullman, S., Vidal-Naquet, M., & Sali, E. (2002). Visual features of intermediate complexity and their use in classification. Nat Neurosci, 5(7), 682-687 (pdf); Serre, T., Oliva, A., & Poggio, T. (2007). A feedforward architecture accounts for rapid categorization. Proc Natl Acad Sci U S A, 104(15), 6424-6429 |
27 | Dec 8 | Bidirectional hierarchical models (keynote presentation pdf) | Bullier, J. (2001). Integrated model of visual processing. Brain Res Brain Res Rev, 36(2-3), 96-107 (pdf); Epshtein, B., Lifshitz, I., & Ullman, S. (2008). Image interpretation by a single bottom-up top-down cycle. Proc Natl Acad Sci U S A, 105(38), 14298 (pdf); Fang, F., Boyaci, H., Kersten, D., & Murray, S. O. (2008). Attention-dependent representation of a size illusion in human V1. Curr Biol, 18(21), 1707-1712 (pdf); Yuille, A., & Kersten, D. (2006). Vision as Bayesian inference: analysis by synthesis? Trends Cogn Sci, 10(7), 301-308 (pdf); Kalman notes (pdf) | Complete draft of final project (5%) due December 8
 | Dec 10 (last day) | In-class Final Test | FINAL STUDY GUIDE | Peer comments on final project (5%) due Friday, December 12; FINAL TEST (16%)
 | Dec 15 | Drafts returned with instructor comments | |
 | Dec 17 | | |
 | Dec 18 | End of semester | | Submit final revised draft of project (28%)
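For orientation on the Nov 10 practice item, here is a minimal sketch of a two-neuron Hopfield network in Python/numpy. This is an illustrative stand-in written for this page, not the course's actual TwoNeuroHopfield.py; the stored pattern, the sequential update schedule, and the step count are all assumptions made for the example.

```python
# Minimal two-neuron Hopfield sketch (illustrative only; not the course's
# TwoNeuroHopfield.py). States are +/-1; the weight matrix stores one
# pattern via the outer-product (Hebbian) rule with the diagonal zeroed.
import numpy as np

pattern = np.array([1, -1])                # pattern to store (an assumption)
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)                   # no self-connections

def energy(state):
    """Hopfield energy E = -1/2 s^T W s."""
    return -0.5 * state @ W @ state

def update(state, n_steps=10):
    """Asynchronous updates: threshold one unit at a time."""
    state = state.copy()
    for step in range(n_steps):
        i = step % len(state)              # sweep the units in order
        state[i] = 1 if W[i] @ state >= 0 else -1
    return state

start = np.array([-1, -1])                 # corrupted input
print("start:", start, "energy:", energy(start))
final = update(start)
print("final:", final, "energy:", energy(final))
```

Because each asynchronous update can only lower (or preserve) the energy, the state settles into the stored pattern [1, -1]; its mirror [-1, 1] is the network's other fixed point.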
This course teaches you how to understand cognitive and perceptual aspects of brain processing in terms of computation. Writing a computer program encourages you to think clearly about the assumptions underlying a given theory. Getting a program to work, however, tests just one level of clear thinking. By writing about your work, you will learn to think through the broader implications of your final project, and to effectively communicate the rationale and results in words.
Your final project will involve: 1) a computer simulation, and 2) a 2000-3000 word final paper describing your simulation. For your computer project, you will do one of the following: 1) devise a novel application for a neural network model studied in the course; 2) write a program to simulate a model from the neural network literature; or 3) design and program a method for solving some problem in perception, cognition, or motor control. The results of your final project should be written up in the form of a short scientific paper describing the motivation, methods, results, and interpretation. Your paper will be critiqued and returned for you to revise and resubmit in final form. You should write for an audience consisting of your class peers. You may elect to have your final paper published in the course's web-based electronic journal.
Completing the final paper involves three steps: submitting a complete draft, receiving peer and instructor comments, and submitting a final revised draft (see the schedule above for dates and grade weights).
If you choose to write your program in Mathematica, your paper and program can be combined and formatted as a single Mathematica notebook. See: Books and Tutorials on Notebooks.
Some Resources:
Student Writing Support: Center for Writing, 306b Lind Hall and satellite locations (612.625.1893), http://writing.umn.edu.
Online Writing Center: http://writing.umn.edu/sws/visit/online/index.html
NOTE: Plagiarism, a form of scholastic dishonesty and a disciplinary offense, is described by the Regents as follows: Scholastic dishonesty means plagiarizing; cheating on assignments or examinations; engaging in unauthorized collaboration on academic work; taking, acquiring, or using test materials without faculty permission; submitting false or incomplete records of academic achievement; acting alone or in cooperation with another to falsify records or to obtain dishonestly grades, honors, awards, or professional endorsement; or altering, forging, or misusing a University academic record; or fabricating or falsifying data, research procedures, or data analysis. http://regents.umn.edu/sites/regents.umn.edu/files/policies/Student_Conduct_Code.pdf
© 2014 Daniel Kersten, University of Minnesota