
This is a Mixed Mode Class. **** note: this syllabus represents a tentative schedule of course topics, assignments, and due dates. All scheduling is subject to change.****

contact information
instructor: PK Douglas, PhD
email: [email protected]
office: Partnership 2, Room 322
Email is the most reliable way to contact me.

objectives
• Introduce neural computation from the synaptic to the systems level
• Review methods for analyzing neuroimaging (e.g., EEG, fMRI) data
• Evaluate recent literature at the intersection of neuroscience and machine learning
• Examine parallels between artificial and biological computation

outcomes
• Appreciate how neurons process stimuli and communicate within neural ensembles
• Understand methods for imaging the brain at the systems level (e.g., EEG, fMRI)
• Learn methods for analyzing and modeling neuroimaging data, including machine learning methods
• Appreciate the parallels between artificial and neural computing

course description
This course provides an overview of key concepts in neuroscience, taking a bottom-up approach that starts at the microscopic level and moves through mesoscopic and systems-level computing in the human brain. We will cover key methods used to measure functional activity in the human brain, and methods used to analyze these data, including basic and more state-of-the-art machine learning techniques. Finally, we will examine parallels between artificial neural networks and biological neural networks.

required texts & software
• There is no official required text. I will post readings online.
• Matlab (or similar) software may be useful for the final project.

course number: IDS 6938
schedule: Tue 3:00 – 5:50
brief description: A study of various methods for modeling neuronal data from the synapse to the systems level, and parallels with machine learning methods.


The calendar is color coded according to the following scheme:
Unit 1: Neural Computing at the Microscopic Level
Unit 2: Systems Level Neuroscience
Unit 3: Machine Learning
Unit 4: Parallels between in vivo and in silico computing

Date | Topics | Assignments | Mode (*subject to change)
1/9/18 | Course overview; Introduction to neuroscience & data modeling | Syllabus Quiz; Post an introduction of yourself | In person
1/16/18 | Introduction to Ion Channels; Hodgkin-Huxley Gating | Read Hodgkin-Huxley paper | In person
1/23/18 | The Neuron as an electric circuit; Memory & plasticity at the synapse; Cable Theory | Post selection of neuroscience paper to review for midterm | In person
1/30/18 | Origins of the EEG Signal; Time-Frequency Decompositions | Read Buzsáki or Valdés-Sosa paper & post in online discussion | TBD
2/6/18 | Intro to Systems Level Neuroscience; The Human Visual System | Quiz on Systems Neuroscience | TBD
2/13/18 | Memory & Spatial Navigation in the human hippocampus; The Importance of Dreams in Memory Consolidation | Read one of the selected memory papers (e.g., Kumaran) and post in discussion forum | TBD
2/20/18 | Introduction to MRI & functional MRI | Quiz on MRI data | TBD
2/27/18 | Midterm presentation | Oral/written presentation reviewing a recent paper in neuroscience and/or artificial intelligence | In person
3/6/18 | Introduction to Machine Learning; Linear Discriminants & Support Vector Machines | Linear Discriminant Quiz | TBD
3/13/18 | SPRING BREAK | NO CLASS |
3/20/18 | Interpreting Machine Learning Output; Perceptrons and Neural Networks | Weka Assignment | TBD
4/3/18 | Deep Neural Network Architectures; The Importance of Noise in Machine Learning | Read 'Dropout' paper or Kriegeskorte review & post in forum | TBD
4/10/18 | Reinforcement Learning; Generative Adversarial Networks; Parallels between Artificial & Neural computing | Read Hassabis paper & post in forum | TBD
4/17/18 | Final Project Presentations | Oral Presentation of Final Project | In person
4/24/18 | STUDY DAY | NO CLASS |
5/1/18 | Final Individual Projects / Papers Due | Summer Break!

grading
All discussion forums and quizzes are worth 5 points. The midterm presentation and write-up are worth 15 points each. The final presentation and paper are worth 25 points each. Letter grading follows the traditional scale (e.g., an A is 90-100). I will periodically provide opportunities for bonus points, so check webcourses for such opportunities. (An illustrative point tally appears after the course policies below.)

changes & announcements
Adjustments to the schedule may be made during the course. In the event that anything in this syllabus changes (e.g., classroom moves, changes in due dates, contact information), I will use a broad announcement so that all students are informed immediately. It is critically important that you set your webcourses announcement settings so that you receive all notifications, and be sure to check the class website regularly.

academic integrity
Students are encouraged to discuss problems with colleagues, but the final assignment handed in should be the student's own work. Plagiarism and cheating of any kind on an examination, quiz, or assignment will result in at least an "F" for that assignment (and may, depending on the severity of the case, lead to an "F" for the entire course) and may be subject to appropriate referral to the Office of Student Conduct for further action. See the UCF Golden Rule for further information. I will also adhere to the highest standards of academic integrity, so please do not ask me to change your grade.

accessibility
The University is committed to providing reasonable accommodations for all persons with disabilities. This syllabus is available in alternate formats upon request. Students with disabilities who need accommodations in this course must contact the professor at the beginning of the semester to discuss needed accommodations. No accommodations will be provided until the student has met with the professor to request accommodations. Students who need accommodations must be registered with Student Accessibility Services, Ferrell Commons, 7F, Room 185, phone (407) 823-2371, TTY/TDD only phone (407) 823-2116, before requesting accommodations from the professor.

copyright
This course may contain copyright-protected materials such as audio or video clips, images, and text materials. These items are being used under the Fair Use doctrine in order to enhance the learning environment. Please do not copy, duplicate, download, or distribute these items. The use of these materials is strictly reserved for this online classroom environment and your use only. All copyrighted materials are credited to the copyright holder.
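To make the point arithmetic above concrete, here is a small illustrative Python tally. The per-item point values and the A = 90-100 cutoff come from the grading paragraph; the number of quizzes/forums, the remaining letter cutoffs, and the example scores are assumptions for illustration only, since the syllabus does not fix them.

```python
# Illustrative grade tally (assumptions marked below; not an official calculator).
POINTS = {
    "quiz_or_forum": 5,          # each quiz/forum, per the syllabus
    "midterm_presentation": 15,
    "midterm_writeup": 15,
    "final_presentation": 25,
    "final_paper": 25,
}
N_QUIZZES_AND_FORUMS = 8          # hypothetical count; not stated in the syllabus

def course_percentage(earned: dict) -> float:
    """Percentage of total possible points (bonus points ignored here)."""
    possible = N_QUIZZES_AND_FORUMS * POINTS["quiz_or_forum"] + sum(
        v for k, v in POINTS.items() if k != "quiz_or_forum"
    )
    return 100.0 * sum(earned.values()) / possible

def letter(pct: float) -> str:
    # A = 90-100 per the syllabus; lower cutoffs assume the usual traditional scale.
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if pct >= cutoff:
            return grade
    return "F"

example = {"quizzes_and_forums": 8 * 4.5, "midterm": 27, "final": 46}  # made-up scores
pct = course_percentage(example)
print(f"{pct:.1f}% -> {letter(pct)}")   # 90.8% -> A
```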


Instructions: Below are some selected papers that you may choose to present. The goal is to read these papers in detail and develop a critical understanding of the research presented. You should develop a presentation that includes a background and introduction and covers each of the figures and key conceptual arguments in the paper. I have tried to categorize the papers as primarily Neuroscience (N), Computer Science & Deep Learning (CS), or papers that cross both domains, Neuroscience & Deep Learning (N+CS). You are free to work with a partner.

Neuroscience (N) Papers

Classic Cognitive Science Papers

Newell (1973) "You can't play 20 questions.."
Brief Description: After many years of task-based fMRI studies, this classic cognitive science paper has become popular once again.
https://pdfs.semanticscholar.org/85a0/96908670cd83cacfdede9e11f2df2dc41c9b.pdf

Poldrack (2006) "Can cognitive processes be inferred from neuroimaging data?"
Brief Description: Reviews issues related to what inferences we can and cannot make using the classic experimental designs typically used in fMRI studies.
https://www.ncbi.nlm.nih.gov/pubmed/16406760
*** Essentially, any paper by Russ Poldrack's group is an acceptable choice.

Computational Neuroscience

Auksztulewicz & Friston (2016) "Repetition suppression and its contextual determinants in predictive coding"
Brief Description: Predictive coding is a theory which suggests that the brain is a predictive organ that infers the probable causes of its incoming sensory information. This paper considers the effect of repetition suppression, or habituation to a stimulus, within this context.
http://dx.doi.org/10.1016/j.cortex.2015.11.024

McDonnell & Ward (2011) "The benefits of noise in neural systems: bridging theory and experiment" (N)
Brief Description: This paper discusses stochastic facilitation, or the unexpected advantages of noise in boosting weak signals. Although this article focuses on communication between neurons, the theory applies to many fields, where it is often referred to as stochastic resonance. (A brief numerical sketch of this effect appears at the end of the Neuroscience papers below.)
http://www.nature.com/articles/nrn3061


Friston & Buzsáki (2016) "The Functional Anatomy of Time: What and When in the Brain"
http://dx.doi.org/10.1016/j.tics.2016.05.001
** Essentially any paper by Friston's group will be acceptable.

Dayan (1994) "The Helmholtz Machine"
Brief Description: This is actually a CS paper, but it has direct parallels with the predictive coding theory of the brain, so I have grouped it here. It describes a method for machine learning in the absence of a teaching signal (so an unsupervised method).

EEG Papers

Buzsáki (2012) "The Origin of extracellular fields and currents – EEG, ECoG, LFP, and spikes"
https://www.ncbi.nlm.nih.gov/pubmed/?term=Buzsáki+G%5BAuthor%5D++The+Origin+of+extracellular+fields+and+currents

Watson, Ding & Buzsáki, "Temporal coupling of field potentials and action potentials in the neocortex."
https://www.ncbi.nlm.nih.gov/pubmed/29250852
** Essentially any paper by Buzsáki's group will be acceptable. There are also some interesting papers on hippocampal spatial navigation.

How do Neurons encode representations?

Bays (2015) "Spikes not slots: noise in neural populations limits working memory"
Brief Description: What limits the number of representations we can store in working memory?
http://dx.doi.org/10.1016/j.tics.2015.06.004

Deneve & Machens (2016) "Efficient codes and balanced networks"
Brief Description: How are excitation and inhibition balanced in the brain?
https://www.nature.com/articles/nn.4243

Mehta (2007) "Cortico-hippocampal interaction during up-down states and memory consolidation"
http://www.nature.com/articles/nn0107-13


Tye, K.M. & Deisseroth, K. "Optogenetic investigation of neural circuits underlying brain disease in animal models."
https://www.ncbi.nlm.nih.gov/pubmed/22430017
** Essentially any paper by Kriegeskorte's group (see work on Representational Similarity Analysis) or Karl Deisseroth's group will be acceptable in this category as well.

fMRI

Tyszka et al. (2011) "Intact Bilateral Resting-State Networks in the Absence of the Corpus Callosum"
Brief Description: A fun article!! Still very curious to me!
http://www.jneurosci.org/content/31/42/15154

Cukur et al. (2013) "Functional Subdomains within Human FFA"
Brief Description: A fresh take on facial recognition in the brain.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3797380/

Huth et al. (2016) "Decoding the Semantic Content of Natural Movies from Human Brain Activity"
Brief Description: A decoding article from Jack Gallant's group. Decoding movies from visual cortex… a fun and interesting paper.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5057448/

** Below are a few papers on fMRI methodology. Task-based studies (including encoding and decoding models) that you find are also acceptable.

Wu et al. (2013) "A blind deconvolution approach to recover effective connectivity brain networks from resting state fMRI data."
Brief Description: Questions about the HRF as a valid proxy for fMRI-neural coupling are raised.
https://www.ncbi.nlm.nih.gov/pubmed/23422254

Webb et al. (2013) "BOLD Granger Causality Reflects Vascular Anatomy"
Brief Description: Questions about the HRF as a valid proxy for fMRI-neural coupling are raised.
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0084279


Haufe, S. et al. (2014) "On the interpretation of weight vectors of linear models in multivariate neuroimaging."
Brief Description: Issues with interpreting "decoding" or machine learning studies in fMRI are brought up – and "solved" in some cases.
https://www.sciencedirect.com/science/article/pii/S1053811913010914
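The McDonnell & Ward entry above describes stochastic facilitation (stochastic resonance): an intermediate amount of noise can help a weak, subthreshold signal cross a detection threshold. The short Python sketch below is a generic, textbook-style demonstration and is not code from the paper; the signal amplitude, threshold, and noise levels are arbitrary illustrative choices.

```python
# Toy stochastic-resonance demo: a subthreshold sine wave is detected best at
# an intermediate noise level, not at zero noise.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 5000)
signal = 0.8 * np.sin(2 * np.pi * 1.0 * t)   # peak 0.8, below the threshold of 1.0
threshold = 1.0

def detection_score(noise_sd: float) -> float:
    """Mean correlation between the true signal and threshold crossings over trials."""
    scores = []
    for _ in range(20):
        noisy = signal + rng.normal(0.0, noise_sd, size=t.size)
        spikes = (noisy > threshold).astype(float)   # crude threshold detector
        if spikes.std() == 0:                        # no crossings -> no information
            scores.append(0.0)
        else:
            scores.append(np.corrcoef(signal, spikes)[0, 1])
    return float(np.mean(scores))

for sd in [0.01, 0.1, 0.3, 1.0, 3.0]:
    print(f"noise sd={sd:>4}: detection score={detection_score(sd):.3f}")
# The score is near 0 for tiny noise (the signal never crosses threshold),
# peaks at moderate noise, and drops again once noise swamps the signal.
```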

Computer Science & Deep Learning (CS)

Convolutional Neural Networks

Krizhevsky et al. (2012) "ImageNet Classification with Deep Convolutional Neural Networks"
Brief Description: This is the "AlexNet" paper (CS) using convolutional neural networks. (A minimal convolutional-network sketch appears after the CNN entries below.)
https://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf

Szegedy et al. (2015) "Going Deeper with Convolutions"
Brief Description: The GoogLeNet paper classifying images (CS).
https://www.cv-foundation.org/openaccess/content_cvpr_2015/papers/Szegedy_Going_Deeper_With_2015_CVPR_paper.pdf

He et al. (2015) "Deep Residual Learning for Image Recognition"
Brief Description: This is the Microsoft "ResNet" paper, deep learning with many layers.
https://arxiv.org/pdf/1512.03385v1.pdf

Jaderberg et al. (2015) "Spatial Transformer Networks"
Brief Description: Improving image classification using affine transformations.
https://arxiv.org/pdf/1506.02025.pdf
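The CNN papers above all elaborate on the same basic pattern: stacked convolution, nonlinearity, and pooling layers feeding a fully connected classifier. The PyTorch sketch below illustrates only that pattern (assuming PyTorch is installed); the layer sizes are arbitrary and do not reproduce AlexNet, GoogLeNet, ResNet, or spatial transformers.

```python
# Minimal convolutional-network sketch (illustrative sizes, not a published architecture).
import torch
from torch import nn

class TinyConvNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Convolution + ReLU + pooling blocks extract increasingly abstract features.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        # Adaptive pooling fixes the spatial size so the classifier works for any input size.
        self.pool = nn.AdaptiveAvgPool2d((4, 4))
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p=0.5),                    # the 'Dropout' idea from the 4/3 reading
            nn.Linear(64 * 4 * 4, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.pool(self.features(x)))

if __name__ == "__main__":
    model = TinyConvNet()
    logits = model(torch.randn(1, 3, 224, 224))   # one fake RGB image
    print(logits.shape)                            # torch.Size([1, 10])
```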

Reinforcement Learning

Silver et al. (2016) "Mastering the game of Go with deep neural networks and tree search"
Brief Description: Title says it all.
https://www.nature.com/articles/nature16961


Silver et al. (2017) "Mastering the game of Go without human knowledge"
Brief Description: Learning the game of Go tabula rasa. Uses only reinforcement learning without the supervised learning step, and outperforms the original AlphaGo algorithm.
https://www.nature.com/articles/nature24270

Generative Adversarial Networks

Goodfellow et al. (2014) "Generative Adversarial Networks"
Brief Description: The original paper.
https://arxiv.org/abs/1406.2661

Reed et al. (2015) "Generative Adversarial Text to Image Synthesis"
Brief Description: Automatic synthesis from text. Impressive results.
http://proceedings.mlr.press/v48/reed16.pdf

Isola et al. (2016, revised 2017) "Image-to-Image Translation with Conditional Adversarial Networks"
Brief Description: Networks not only learn the mapping from input image to output image, but also learn a loss function to train this mapping.
https://arxiv.org/abs/1611.07004

Zhu et al. (2017) "Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks"
Brief Description: We present an approach for learning to translate an image from a source domain X to a target domain Y in the absence of paired examples.
https://arxiv.org/abs/1703.10593

Natural Language Processing

Murdoch et al. (2018) "Beyond Word Importance: Contextual Decomposition to Extract Interactions from LSTMs"
Brief Description: Long Short-Term Memory (LSTM) networks are recurrent neural networks capable of learning long-term dependencies. Here, they are applied to extract meaningful information from language context.
https://arxiv.org/abs/1801.05453

Chen et al. (2017) "Reading Wikipedia to Answer Open-Domain Questions"
Brief Description: Addresses challenges of document retrieval and finding answers to questions using Wikipedia articles.
https://arxiv.org/abs/1704.00051


Li et al. (2017) "Adversarial Learning for Neural Dialogue Generation"
Brief Description: We propose using adversarial training for open-domain dialogue generation: the system is trained to produce sequences that are indistinguishable from human-generated dialogue utterances. We cast the task as a reinforcement learning (RL) problem where we jointly train two systems, a generative model to produce response sequences and a discriminator, analogous to the human evaluator in the Turing test.
https://arxiv.org/abs/1701.06547

Collobert et al. (2011) "Natural Language Processing (Almost) from Scratch"
Brief Description: An older paper now, but still frequently cited.
http://www.jmlr.org/papers/volume12/collobert11a/collobert11a.pdf

Interpreting Deep Learning

Montavon et al. (2017) "Methods for Interpreting and Understanding Deep Neural Networks"
Brief Description: Methods for interpreting the importance of various nodes in a neural network in terms of making outcome decisions.
https://arxiv.org/pdf/1706.07979.pdf

Parallels Between Neuroscience & Deep Learning (N+CS)

Parallels Between the Human Visual System & Computer Vision

Kriegeskorte (2015) "Deep Neural Networks: A New Framework for Modeling Biological Vision and Brain Information Processing"
Brief Description: The brain is a massive recurrent convolutional neural network. A great review drawing parallels with deep learning and human vision (mostly N).
http://www.annualreviews.org/doi/abs/10.1146/annurev-vision-082114-035447

Human Memory & Learning in Neural Networks


Kumaran et al. (2016) "What Learning Systems do Intelligent Agents Need? Complementary Learning Systems Theory Updated"
Brief Description: How does gradual learning translate into the ability to generalize this information to new situations? Comparisons between the human hippocampus and machine learning are made, and catastrophic interference is discussed.
http://dx.doi.org/10.1016/j.tics.2016.05.004

Marblestone et al. (2016) "Towards an integration of deep learning and neuroscience"
Brief Description: How cognitive architectures in the brain relate to artificial neural networks. Also discusses the issue of how we learn as infants in the absence of a teaching signal.
doi: http://dx.doi.org/10.1101/058545

Buesing et al. (2011) "Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons"
Brief Description: Attempts to understand how neuronal population firing may represent distributions related to sensory input and memory, given temporally unreliable neuronal firing probabilities (e.g., those that result from refractory periods).
doi:10.1371/journal.pcbi.1002211

Pouget et al. (2013) "Probabilistic brains: knowns and unknowns"
Brief Description: Probabilistic inference for multisensory integration.
doi:10.1038/nn.3495

Hassabis et al. (2017) "Neuroscience-Inspired Artificial Intelligence"
Brief Description: From the team that developed AlphaGo/Zero, here is a perspectives piece on how neuroscience will continue to inspire AI moving forward.
http://www.cell.com/neuron/pdf/S0896-6273(17)30509-3.pdf