Generalized Linear Models (GLMs) II
Statistical modeling and analysis of neural data
NEU 560, Spring 2018, Lecture 10
Jonathan Pillow



Page 1

Generalized Linear Models (GLMs) II

Statistical modeling and analysis of neural data, NEU 560, Spring 2018

Lecture 10

Jonathan Pillow

1

Page 2

Summary:

1. "Linear-Gaussian" GLM: Y | X, k ~ N(Xk, σ²I), with ML estimate k̂ = (XᵀX)⁻¹XᵀY

2. Bernoulli GLM: y_t | x_t, k ~ Ber(f(x_t · k))

3. Poisson GLM: y_t | x_t, k ~ Poiss(f(x_t · k))

2
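The closed-form estimate on the slide above can be checked numerically. A minimal sketch of the linear-Gaussian case (the sizes, noise level, and simulated "true" filter are illustrative assumptions, not values from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

T, D = 500, 8                               # time bins, filter dimension (assumed)
k_true = rng.normal(size=D)                 # made-up "true" filter
X = rng.normal(size=(T, D))                 # design matrix of stimuli
Y = X @ k_true + 0.5 * rng.normal(size=T)   # Y | X, k ~ N(Xk, sigma^2 I)

# ML estimate from the slide: k_hat = (X^T X)^{-1} X^T Y
k_hat = np.linalg.solve(X.T @ X, X.T @ Y)

print(np.max(np.abs(k_hat - k_true)))       # small for this much data
```

`np.linalg.solve(X.T @ X, X.T @ Y)` applies the normal equations without forming an explicit inverse, which is the numerically preferred route.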

Page 3

Linear-Nonlinear-Poisson

[Diagram: stimulus → stimulus filter k → exponential nonlinearity f → conditional intensity (spike rate) λ(t) → Poisson spiking]

3
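The cascade in the diagram above can be simulated in a few lines. A sketch under assumed values (the decaying filter shape, the sizes, and the −2.0 offset are all made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

T, D = 1000, 10
k = np.exp(-np.arange(D) / 3.0)     # hypothetical decaying stimulus filter
X = rng.normal(size=(T, D))         # each row = stimulus history at time t

drive = X @ k                       # linear stage
rate = np.exp(drive - 2.0)          # exponential nonlinearity, lambda(t)
spikes = rng.poisson(rate)          # conditionally Poisson spike counts per bin
```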

Page 4

Fitting the nonlinearity

Filter k specifies a direction in stimulus space (i.e., a 1D subspace).

4

Page 5

Fitting the nonlinearity

1) Project onto subspace spanned by k

5

Page 6

Fitting the nonlinearity

1) Project onto subspace spanned by k

2) Take histogram of projected stimuli

[Plot: histogram of stimulus projections onto u_k]

6

Page 7

Fitting the nonlinearity

1) Project onto subspace spanned by k

2) Take histogram of projected stimuli

3) ML estimate of Poisson rate in each bin is (# spikes) / (# stimuli)

[Plot: STA, spike response, and histogram of stimulus projections onto u_k]

7
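Steps 1–3 above can be sketched directly; the "true" nonlinearity, the bin count, and the sizes below are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

T, D = 5000, 5
k = rng.normal(size=D)
k /= np.linalg.norm(k)                      # unit vector u_k spanning the 1D subspace
X = rng.normal(size=(T, D))                 # stimuli (one per row)
proj = X @ k                                # 1) project stimuli onto u_k
spikes = rng.poisson(np.exp(proj))          # counts from an assumed "true" f = exp

edges = np.linspace(proj.min(), proj.max(), 21)   # 2) histogram of projected stimuli
bin_idx = np.digitize(proj, edges[1:-1])          # bin index 0..19 per stimulus

# 3) ML Poisson rate per bin = (# spikes) / (# stimuli) falling in that bin
counts = np.bincount(bin_idx, minlength=20)
f_hat = np.bincount(bin_idx, weights=spikes, minlength=20) / np.maximum(counts, 1)
```

More bins make the estimate more flexible but noisier per bin, which is the tradeoff the derivation on the next slide makes precise.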

Page 8

Fitting the nonlinearity: derivation

Model: piecewise-constant nonlinearity, f(z) = λ_j for z in bin j
• piecewise-constant model of nonlinearity f
• more histogram bins ⇒ more flexible model

Let Y_j = stimuli and spike trains such that x_t · k falls in bin j,
and λ_j = Poisson firing rate in bin j.

Log-likelihood for bin j (dropping terms that don't depend on λ_j):
log P(Y_j | λ_j) = Σ_{t ∈ bin j} (y_t log λ_j − λ_j) = n_j log λ_j − N_j λ_j,
where n_j = # spikes and N_j = # stimuli in bin j.

Setting the derivative n_j/λ_j − N_j to zero gives the ML estimate:
λ̂_j = n_j / N_j = # spikes / # stim

8

Page 9

LNP (Linear-Nonlinear-Poisson) cascade model

Characterization procedure:

1. Fit filter k using maximum likelihood under an assumed nonlinearity.

2. Project stimuli onto k, compute spike rate (mean # spikes / stimulus) in each histogram bin.

[Diagram: stimulus → stimulus filter k → exponential nonlinearity f → λ(t) → Poisson spiking; inset: spike rate vs. projection onto u_k]

9

Page 10

LNP (Linear-Nonlinear-Poisson) cascade model

• output: Poisson process
• problem: assumes spiking depends only on the stimulus!

[Diagram: stimulus → stimulus filter k → exponential nonlinearity f → conditional intensity (spike rate) λ(t) → Poisson spiking]

10

Page 11

Poisson GLM with spike-history dependence

• output: no longer a Poisson process

(Truccolo et al. 2004)

[Diagram: stimulus → stimulus filter k → (+) → exponential nonlinearity f → conditional intensity (spike rate) → probabilistic spiking; emitted spikes feed back into the (+) through the post-spike filter h]

11
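A minimal simulation of the spike-history mechanism above (the filter shapes and sizes are illustrative assumptions; h is chosen negative to mimic refractory-like suppression):

```python
import numpy as np

rng = np.random.default_rng(3)

T, D, H = 2000, 5, 10
k = 0.3 * rng.normal(size=D)                    # hypothetical stimulus filter
h = -2.0 * np.exp(-np.arange(1, H + 1) / 2.0)   # hypothetical post-spike filter
X = rng.normal(size=(T, D))

y = np.zeros(T)
for t in range(T):
    hist = y[max(t - H, 0):t][::-1]                  # most recent spikes first
    lam = np.exp(X[t] @ k + hist @ h[:len(hist)])    # conditional intensity
    y[t] = rng.poisson(lam)                          # spiking depends on past spikes
```

Because λ_t depends on the past spikes, the marginal output is no longer a Poisson process, matching the bullet on the slide above.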

Page 12

Poisson GLM with spike-history dependence

• interpretation: "soft-threshold" integrate-and-fire model

[Plots: spike rate vs. filter output. Traditional IF: "hard threshold"; soft-threshold IF: graded increase of spike rate with filter output]

[Diagram: stimulus → stimulus filter k → (+) → exponential nonlinearity f → probabilistic spiking; spikes feed back through the post-spike filter h]

12

Page 13

GLM dynamic behaviors

• irregular spiking

[Plots: post-spike filter h(t); stimulus; filter outputs ("currents"); p(spike)]

13

Page 14

GLM dynamic behaviors

• regular spiking

[Plots: post-spike filter h(t); stimulus; filter outputs ("currents"); p(spike)]

(Weber & Pillow 2016)

14

Page 15

GLM dynamic behaviors

• bursting

[Plots: post-spike filter h(t); stimulus; filter outputs ("currents"); p(spike)]

15

Page 16

GLM dynamic behaviors

• adaptation

[Plots: post-spike filter h(t); stimulus; filter outputs ("currents"); p(spike)]

16

Page 17

GLM dynamic behaviors (from Izhikevich)

[Figure: panels A–P showing tonic spiking, phasic spiking, tonic bursting, phasic bursting, mixed mode, spike frequency adaptation, type I, type II, spike latency, resonator, integrator, rebound spike, rebound burst, threshold variability, bistability I, and bistability II. Each panel shows the stimulus, the Izhikevich neuron response, the GLM spikes, and the fitted GLM parameters.]

Figure 6 caption (Weber & Pillow 2017): Suite of dynamical behaviors of Izhikevich and GLM neurons. Each panel, top to bottom: stimulus (blue), Izhikevich neuron response (black), GLM responses on five trials (gray), stimulus filter (left, blue), and post-spike filter (right, red). Black line in each plot indicates a 50 ms scale bar for the stimulus and spike response. (Differing timescales reflect timescales used for each behavior in the original Izhikevich (2004) paper.) Stimulus filter and post-spike filter plots all have 100 ms duration.

17

Page 18

multi-neuron GLM

[Diagram: stimulus → stimulus filters for neuron 1 and neuron 2 → (+) for each neuron, with each neuron's own post-spike filter feeding back → exponential nonlinearity → probabilistic spiking]

18

Page 19

multi-neuron GLM

[Diagram: as on the previous slide, with coupling filters added so that each neuron's spikes also feed into the other neuron's summed input]

19

Page 20

GLM equivalent diagram:

[Diagram: the spike rate at each time bin t is computed from the preceding stimulus and spike history, unrolled across time]

20

Page 21

Example dataset

• stimulus = binary flicker
• parasol retinal ganglion cell spike responses

(Uzzell et al., J Neurophysiol 2004)

21

Page 22

Example dataset

• stimulus = binary flicker
• parasol retinal ganglion cell spike responses

(Uzzell et al., J Neurophysiol 2004)

22

Page 23

Stimulus-only GLM

[Diagram: design matrix X (rows = time bins, columns = stimulus time lags), filter k, and spike response Y; the model maps each row of X through k and the nonlinearity to a spike rate]

23

Page 24

Stimulus + Spike-History GLM

[Diagram: design matrix X now has a stimulus portion and a spike-history portion; filter k and spike response Y as before]

24

Page 25

Stimulus + History + 3-Neuron Coupling GLM

[Diagram: design matrix X with a stimulus portion plus spike-history portions for neurons 1–4; filter k and spike response Y as before]

25
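The design matrices on the last three slides differ only in which column blocks they include. A sketch of assembling one (the lag counts and the helper name `build_design` are assumptions for illustration):

```python
import numpy as np

def build_design(stim, spikes_all, n_stim_lags=5, n_hist_lags=3):
    """Columns: stimulus time lags, then spike-history lags for each neuron."""
    T = len(stim)
    # stimulus portion: lag-d column has stim[t - d] at row t (zeros before start)
    cols = [np.concatenate([np.zeros(d), stim[:T - d]]) for d in range(n_stim_lags)]
    # spike-history portions: own history first, then coupled neurons
    for y in spikes_all:
        cols += [np.concatenate([np.zeros(d), y[:T - d]])
                 for d in range(1, n_hist_lags + 1)]
    return np.column_stack(cols)

rng = np.random.default_rng(4)
stim = rng.normal(size=100)
spikes = [rng.poisson(0.5, size=100) for _ in range(4)]  # neuron 1 + 3 coupled
Xd = build_design(stim, spikes)
print(Xd.shape)  # (100, 5 + 4*3) = (100, 17)
```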

Page 26

Fitting: Maximum Likelihood

• find filters that maximize the log-conditional probability of the observed data (spike responses Y) given the stimuli X and the GLM parameters:

log P(Y | X) = Σ_t (y_t log λ_t − λ_t)

• log-likelihood is concave: no local maxima [Paninski 2004]
• use a smooth basis for the coupling filters

26
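Because the Poisson log-likelihood above is concave, a generic gradient-based optimizer finds the global ML solution. A sketch with f = exp (the sizes and parameter scales are assumptions):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

T, D = 2000, 6
w_true = 0.5 * rng.normal(size=D)           # made-up "true" parameters
X = rng.normal(size=(T, D))
y = rng.poisson(np.exp(X @ w_true))         # Poisson GLM data with f = exp

def negloglik(w):
    lam = np.exp(X @ w)
    return -(y @ (X @ w) - lam.sum())       # -(sum_t y_t log lam_t - lam_t)

def grad(w):
    return X.T @ (np.exp(X @ w) - y)        # analytic gradient

res = minimize(negloglik, np.zeros(D), jac=grad, method="BFGS")
w_hat = res.x
```

Note that for f = exp, y_t log λ_t simplifies to y_t (Xw)_t, which is what `negloglik` exploits.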

Page 27

convexity and concavity

• Concavity: having everywhere downward curvature
• Convexity: having everywhere upward curvature
• "f is concave" iff "−f is convex"
• maximizing a concave function = minimizing a convex function
• both preclude (non-global) local optima

[Plots: a concave function and a convex function]

27

Page 28

convexity: formal definition

• A function f is convex if, for any x1, x2 in its domain and any a in [0, 1]:
f(a x1 + (1 − a) x2) ≤ a f(x1) + (1 − a) f(x2)

• for twice-differentiable scalar functions: f''(x) ≥ 0 everywhere

• for vector functions: all eigenvalues of the Hessian are ≥ 0 (i.e., the Hessian is positive semidefinite)

properties of convex functions:
• affine maps: if f(x) is convex then f(Ax + b) is convex
• sums: a nonnegatively-weighted sum of convex functions is convex
• linear functions: both concave and convex

28

Page 29

log-concavity of GLM likelihood

Theorem (Paninski 2004)

The GLM log-likelihood is concave in the parameters {k, h}, for any stimulus X and any spike data Y, if the nonlinearity f satisfies:

• f is convex
• f is log-concave (i.e., log f(x) is a concave function)

Proof sketch. Write the log-likelihood as

log P(Y | X) = Σ_t y_t log f(z_t) − Σ_t f(z_t), where each z_t is linear in {k, h}.

• First term: log f is concave, and concavity is preserved under composition with a linear map, so each log f(z_t) is concave in the parameters; a weighted sum of concave functions with nonnegative weights y_t is concave.
• Second term: f is convex, so each f(z_t) is convex in the parameters; a sum of convex functions is convex, and the negative of a convex function is concave.

The sum of two concave functions is concave ⇒ the log-likelihood is concave!

29
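For the common choice f = exp, both conditions hold, and the Hessian of the log-likelihood Σ_t (y_t x_t·w − exp(x_t·w)) is −Xᵀ diag(exp(Xw)) X, which is negative semidefinite for every w, X, and Y. A quick numeric check (the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)

T, D = 200, 4
X = rng.normal(size=(T, D))

max_eig = -np.inf
for _ in range(5):                           # several random parameter points
    w = rng.normal(size=D)
    lam = np.exp(X @ w)
    H = -X.T @ (lam[:, None] * X)            # Hessian of the log-likelihood
    max_eig = max(max_eig, float(np.linalg.eigvalsh(H).max()))

print(max_eig)   # <= 0 up to round-off: the log-likelihood is concave everywhere
```

The y_t terms are linear in w and so contribute nothing to the Hessian, which is why concavity holds for any spike data.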

Page 30

log-concavity of GLM likelihood

Theorem (Paninski 2004)

The GLM log-likelihood is concave in the parameters {k,h}, for any stimulus X and any spike data Y, if the nonlinearity f satisfies:

• f is convex• f is log-concave (i.e., log f(x) is a concave function)

Examples of acceptable nonlinearities:

• f(x) = exp(x)
• f(x) = log(1 + exp(x))

• condition: f must grow at least linearly and at most exponentially

Why this matters: no restriction on choice of stimuli! Can easily find ML fits to GLM parameters for any stimuli + spikes

30