Course home pages: courses.kersten.org
Instructor:
Daniel Kersten, kersten@umn.edu, Office: S212 Elliott Hall, Phone: 625-2589
Office hours: Mondays 11:00 to 12:00 and by appointment.
Teaching assistant:
Yijun Ge, gexxx119@umn.edu, N10 Elliott Hall
Office hours: Wednesdays 11:00 to 12:00 and by appointment.
(Note: N10 is in a secured area. Please send email to TA to open door.)
Course description: Introduction to large-scale parallel distributed processing models in neural and cognitive science. Topics include: linear models, statistical pattern theory, Hebbian rules, self-organization, non-linear models, information optimization, and representation of neural information. Applications to sensory processing, perception, learning, and memory.
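For a concrete flavor of the simplest of these models, here is a minimal sketch of a Hebbian outer-product rule for a linear associator, written in Python/NumPy (one of the environments used later in the course). It is an illustration only; the pattern sizes and variable names are made up for this example and are not taken from the lecture notes.

```python
import numpy as np

# Minimal sketch (illustrative only): Hebbian outer-product learning for a
# linear associator. The weight matrix W accumulates correlations between
# input patterns f and their associated output patterns g.
rng = np.random.default_rng(0)

n_in, n_out, n_pairs = 8, 4, 3              # sizes chosen arbitrarily
F = rng.standard_normal((n_pairs, n_in))    # input patterns (one per row)
G = rng.standard_normal((n_pairs, n_out))   # associated output patterns

W = np.zeros((n_out, n_in))
for f, g in zip(F, G):
    W += np.outer(g, f)                     # Hebbian increment: dW = g f^T

# Recall: if the stored inputs are (nearly) orthogonal, W @ f recovers the
# associated output up to a scale factor, plus cross-talk from other pairs.
print(W @ F[0])
print(G[0] * (F[0] @ F[0]))
```

With mutually orthogonal inputs the recall is exact up to a scale factor; with correlated inputs, cross-talk between the stored pairs appears.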
Prerequisites: Linear algebra, multivariate calculus.
Lecture notes (see below)
Alternatives: Mathematica is available in several labs on campus; go to http://www.oit.umn.edu/computer-labs/software/index.htm
You may wish to purchase Mathematica for Students; see http://www.wolfram.com/products/student/mathforstudents/index.html
You can also access Mathematica on the CLA servers.
If you have never programmed before, go here. If you have programming experience, go here.
For user help on using Mathematica, see: http://mathematica.stackexchange.com
Learning center: http://www.wolfram.com/learningcenter/
Python/IPython resources:
http://ipython.org
http://jupyter-notebook-beginner-guide.readthedocs.org/en/latest/index.html
http://www.scipy.org
For an online course in using Python and PsychoPy for research in human vision, see: http://nbviewer.ipython.org/github/gestaltrevision/python_for_visres/blob/master/index.ipynb
Gopen, G. D., & Swan, J. A. (1990). The Science of Scientific Writing. American Scientist, 78, 550-558. (pdf)
*Anderson, J. (1995). Introduction to Neural Networks. MIT Press.
***Bishop, C. M. (2006). Pattern Recognition and Machine Learning. New York: Springer.
*Dayan, P., & Abbott, L. F. (2001). Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Cambridge, MA: MIT Press.
Freeman, J. A. (1994). Simulating Neural Networks with Mathematica. Reading, MA: Addison-Wesley Publishing Company. http://library.wolfram.com/infocenter/Books/3485/
**Gershenfeld, N. A. (1999). The Nature of Mathematical Modeling. Cambridge; New York: Cambridge University Press.
***Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press. (online)
**Hertz, J., Krogh, A., & Palmer, R. G. (1991). Introduction to the Theory of Neural Computation (Santa Fe Institute Studies in the Sciences of Complexity). Reading, MA: Addison-Wesley Publishing Company.
There will be programming assignments, as well as a final project. The grade weights are:
http://onestop.umn.edu/calendars/#fall2016
(NOTE: Links to revised lecture material below will be posted on the day of the lecture. Links to the pdfs for additional readings may require a password. If you want to preview similar material, check out the lectures from 2014.)
Lecture notes are in Mathematica Notebook and pdf format. You can download the Mathematica notebook files below to view with Mathematica or the Wolfram CDF Player (which is free).
# | Date | Lecture | Additional Readings & supplementary material | Assignments
1 | Sep 5 | Introduction | Mathematica screencast |
2 | Sep 10 | The neuron (pdf file), Mathematica notebook | Hodgkin-Huxley.nb |
3 | Sep 12 | Neural models, McCulloch-Pitts (pdf file), Mathematica notebook | Koch, C., & Segev, I. (Eds.). (1998) (pdf) |
4 | Sep 17 | Generic neuron model (pdf file), Mathematica notebook | | Problem Set 1
5 | Sep 19 | Lateral inhibition (pdf file), Mathematica notebook | Hartline (1972) (pdf) |
6 | Sep 24 | Matrices (pdf file), Mathematica notebook | |
7 | Sep 26 | Linear systems, learning & memory (pdf file) | |
8 | Oct 1 | Linear recall, association and memory simulations (pdf file), Mathematica notebook | | Problem Set 2
9 | Oct 3 | Overview of non-linear networks, discriminative models, Perceptron, SVMs (pdf file), Mathematica notebook | |
10 | Oct 8 | Supervised learning as regression, Widrow-Hoff, backprop (pdf file), Mathematica notebook | XOR backpropagation demo (Mathematica notebook); Poirazi, Brannon & Mel (2003) (pdf); Williams (1992) (pdf) |
11 | Oct 10 | Hopfield networks (pdf file), Mathematica notebook | Hopfield (1982) (pdf); IPython demo of Hopfield; stereo correspondence: Mathematica demo, IPython demo |
12 | Oct 15 | Boltzmann machine (pdf file), Mathematica notebook | Sculpting the energy function, interpolation (Mathematica notebook); Berkes, P., Orban, G., Lengyel, M., & Fiser, J. (2011). Spontaneous cortical activity reveals hallmarks of an optimal internal model of the environment. Science, 331(6013), 83. (pdf) | Problem Set 3
13 | Oct 17 | Probability and neural networks | Griffiths and Yuille (2006) (pdf); Kersten, D., & Yuille, A. L. (2013). Vision: Bayesian Inference and Beyond. In J. S. Werner & L. M. Chalupa (Eds.), The New Visual Neurosciences. Cambridge, MA: MIT Press. (draft pdf) |
14 | Oct 22 | Multivariate distributions, regression, interpolation, perceptual completion | Bias/variance notes |
15 | Oct 24 | Graphical models | Bishop, C. M. Pattern Recognition and Machine Learning, Chapter 8: Graphical Models (pdf); Weiss, Y. (pdf) |
16 | Oct 29 | Belief Propagation: regression and interpolation revisited (pdf), Mathematica notebook (revised 10/31) | PROJECT GUIDELINES (pdf); SAMPLE PROJECT IDEAS from previous years (pdf); for demonstration-style projects, see the Wolfram Demonstrations site; How To Do Research, William T. Freeman (2013) (link) | Problem Set 4
17 | Oct 31 | Utility & probability: Bayes decision theory (pdf) | Geisler, W. S., & Kersten, D. (2002). Illusions, perception and Bayes. Nat Neurosci, 5(6), 508-510. (pdf) |
18 | Nov 5 | Supervised learning: neural networks in the context of machine learning (pdf) | |
19 | Nov 7 | More sampling: Mathematica notebook (pdf) | http://pymc-devs.github.io/pymc/; recommended pymc tutorials: http://camdavidsonpilon.github.io/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/; Lake, B. M., Salakhutdinov, R., & Tenenbaum, J. B. (2015). Human-level concept learning through probabilistic program induction. Science, 350(6266), 1332–1338. http://doi.org/10.1126/science.aab3050 |
20 | Nov 12 | Overview of python/ipython for scientific computing/neural networks, and Bayesian computations: IPython notebook (using nbviewer); raw notebook (needs a viewer or jupyter to open); an older version at wakari.io | Anaconda python installation recommended. We will use IPython, a browser-based notebook interface for python. See here for illustrations of IPython cell types, and here for a collection of sample notebooks. Look here for some good tips on installation, as well as the parent directory for excellent ipython-based course material on scientific computing using Monte Carlo methods. For a quick start to scientific programming, see: http://nbviewer.ipython.org/gist/rpmuller/5920182 For a list of some notebooks in psychology, neuroscience, and machine learning, see: For comprehensive coverage of scientific python, see: https://scipy-lectures.github.io And for a ground-up set of tutorials on python, see: http://learnpythonthehardway.org/book/ Switching from matlab to python? http://wiki.scipy.org/NumPy_for_Matlab_Users Practice: 1) TwoNeuroHopfield.py, 2) ConvolutionDemoFor5038.ipynb, which needs Zebra_running_Ngorongoro.jpg |
21 | Nov 14 | Lect_21_PyMC (raw IPython notebook) or Jupyter nbviewer; PyMC2 sprinkler (raw); PyMC3 Gaussian mixtures (raw); PyMC3 spike rate transitions (raw); Scikit-learn Gaussian mixtures (raw) | Carandini, Heeger, Movshon (1996) (pdf); Supplement (pdf file), Mathematica notebook; Kersten, D., & Yuille, A. L. (2014). Inferential Models of the Visual Cortical Hierarchy. In The New Cognitive Neurosciences, 5th Edition. (draft pdf) | Problem Set 5
22 | Nov 19 | Architectures: overview of the visual hierarchy: Lect_22a_VisualArchitecture (Keynote pdf) | | Final project title & paragraph outline due (2%)
23 | Nov 21 | Neural networks for self-organization: AdaptMaps (pdf), Mathematica notebook | Oja's rule and PCA: Sanger (2003) (pdf); Simoncelli, E. P., & Olshausen, B. A. (2001). Natural image statistics and neural representation. Annu Rev Neurosci, 24, 1193-1216. (pdf) |
24 | Nov 26 | Efficient coding: Mathematica notebook (pdf); Neural population codes (Mathematica notebook) (pdf); Probabilistic neural representations, Poisson-like codes, and the neural integration of information: Keynote presentation | Knill & Pouget (2004) (pdf); Ma, W. J. (2012). Organizing probabilistic models of perception. Trends in Cognitive Sciences, 16(10), 511–518. (pdf); Quiroga, R. Q., Reddy, L., Kreiman, G., Koch, C., & Fried, I. (2005). (pdf) |
25 | Nov 28 | Scientific writing and presentations (pdf) | Gopen & Swan (1990) (pdf) |
26 | Dec 3 | Clustering, EM, segmentation | Expectation Maximization: Weiss, Y. (pdf); Kirchner, H., & Thorpe, S. J. (2006). Ultra-rapid object detection with saccadic eye movements: Visual processing speed revisited. Vision Research, 46(11), 1762–1776. doi:10.1016/j.visres.2005.10.002 (pdf); Ullman, S., Vidal-Naquet, M., & Sali, E. (2002). Visual features of intermediate complexity and their use in classification. Nat Neurosci, 5(7), 682-687. (pdf); Serre, T., Oliva, A., & Poggio, T. (2007). A feedforward architecture accounts for rapid categorization. Proc Natl Acad Sci U S A, 104(15), 6424-6429. | Problem Set 6
27 | Dec 5 | Bidirectional hierarchical models: Keynote presentation (pdf) | Hong, H., Yamins, D. L. K., Majaj, N. J., & DiCarlo, J. J. (2016). Explicit information for category-orthogonal object properties increases along the ventral stream. Nature Neuroscience, 1–18. http://doi.org/10.1038/nn.4247; Bullier, J. (2001). Integrated model of visual processing. Brain Res Brain Res Rev, 36(2-3), 96-107. (pdf); Epshtein, B., Lifshitz, I., & Ullman, S. (2008). Image interpretation by a single bottom-up top-down cycle. Proceedings of the National Academy of Sciences of the United States of America, 105(38), 14298. (pdf); Fang, F., Boyaci, H., Kersten, D., & Murray, S. O. (2008). Attention-dependent representation of a size illusion in human V1. Curr Biol, 18(21), 1707-1712. (pdf); Yuille, A., & Kersten, D. (2006). Vision as Bayesian inference: analysis by synthesis? Trends Cogn Sci, 10(7), 301-308. (pdf) | Complete Draft of Final Project (5%)
 | Dec 10 | Class presentations | | Peer comments on Final Project (5%)
 | Dec 12 (last day) | Class presentations | | Drafts returned with instructor and peer comments, December 12
 | Dec 20 | End of semester | | Submit Final Revised Draft of Project (28%)
This course teaches you how to understand cognitive and perceptual aspects of brain processing in terms of computation. Writing a computer program encourages you to think clearly about the assumptions underlying a given theory. Getting a program to work, however, tests just one level of clear thinking. By writing about your work, you will learn to think through the broader implications of your final project, and to effectively communicate the rationale and results in words.
Your final project will involve: 1) a computer simulation; and 2) a 2000-3000 word final paper describing your simulation. For your computer project, you will do one of the following: 1) devise a novel application for a neural network model studied in the course; 2) write a program to simulate a model from the neural network literature; or 3) design and program a method for solving some problem in perception, cognition, or motor control. The results of your final project should be written up in the form of a short scientific paper describing the motivation, methods, results, and interpretation. Your paper will be critiqued and returned for you to revise and resubmit in final form. You should write for an audience consisting of your class peers. You may elect to have your final paper published in the course's web-based electronic journal.
Completing the final paper involves 3 steps:
If you choose to write your program in Mathematica, your paper and program can be combined and formatted as a Mathematica notebook. See: Books and Tutorials on Notebooks.
Some Resources:
Student Writing Support: Center for Writing, 306b Lind Hall and satellite
locations (612.625.1893) http://writing.umn.edu.
Online Writing Center: http://writing.umn.edu/sws/visit/online/index.html
NOTE: Plagiarism, a form of scholastic dishonesty and a disciplinary offense, is described by the Regents as follows: Scholastic dishonesty means plagiarizing; cheating on assignments or examinations; engaging in unauthorized collaboration on academic work; taking, acquiring, or using test materials without faculty permission; submitting false or incomplete records of academic achievement; acting alone or in cooperation with another to falsify records or to obtain dishonestly grades, honors, awards, or professional endorsement; altering, forging, or misusing a University academic record; or fabricating or falsifying data, research procedures, or data analysis. http://regents.umn.edu/sites/regents.umn.edu/files/policies/Student_Conduct_Code.pdf
NOTE: Sexual Assault and higher education: Training modules and information. The Department of Psychology supports the efforts of the University of Minnesota towards prevention of sexual assault. We encourage all students to participate in the free online training that has been established for undergraduate students and graduate students. The training highlights pertinent issues regarding sexual assault, including, but not limited to: defining healthy relationships, consent, bystander intervention, and gender roles. Haven (for undergraduate students under the age of 25) and Haven Plus (for undergraduates over 25, graduate students, and professional students) are the trainings available at no cost to University of Minnesota students. Additionally, to learn more about how you can help reduce sexual assault at the University of Minnesota, please visit the Aurora Center.
© 2016 Daniel Kersten, University of Minnesota