
Neurally Driven Prosthetics



Final presentation for a capstone neuroscience class.


Page 1: Neurally Driven Prosthetics

Neurally Driven Prosthetics: Pt. II

Study: Schwartz et al., primate control of a robotic arm

http://news.nationalgeographic.com/news/2008/05/080530-monkey-video-ap.html

Page 2: Neurally Driven Prosthetics

Signal is derived from M1

• Primary motor cortex produces stereotyped neuron ensemble firing patterns during a particular action

• Strong neural activity-behavior coupling in this area makes it ideal for a decoding algorithm

• Ensemble firing patterns are vectorial and are encoded by firing rate.

Page 3: Neurally Driven Prosthetics

The pairing between cortical ensemble activity and behavior is strong

• This allows engineers to be confident about designing a patterned output from a neural signal

• The figure shows how well a neuron’s activity predicts behavior: the machine can accurately predict what the monkey’s arm will do.

Page 4: Neurally Driven Prosthetics

Macaque Brain

Page 5: Neurally Driven Prosthetics

Other motor-related areas

• These occur in parallel processing networks and work in concert to initiate, guide/modify, and extinguish movements.

• They include the supplementary motor area (fine/complex movements), the cerebellum (coordination/error suppression), the basal ganglia (movement initiation/extinction), and possibly parietal association cortices involved in the internal representations that govern hand manipulation and grasping.

Page 6: Neurally Driven Prosthetics

The interface

• Researchers found that microelectrode arrays with tens of electrodes are sufficient as a signal source

• Hundreds are most likely needed to accommodate the full repertoire of limb movements.

Page 7: Neurally Driven Prosthetics

Interface is closed-loop

• Once the microelectrode array is implanted, it cannot be modified (a closed system).

• Manipulation of the robot arm is real-time, sampling neuronal recordings once every 30 ms.

• The time delay for arm movement is ~150 ms, comparable to the delay of a biological arm.

• Visual feedback substitutes for somatosensory feedback, allowing increased accuracy and agility through learning.
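As a rough illustration of this 30 ms closed-loop cycle, here is a minimal Python sketch. The callables (read_spike_counts, decode_velocity, send_joint_command) are hypothetical placeholders, not the actual system’s API; it is a sketch of the loop structure, not the study’s implementation.

```python
import time

BIN_MS = 30  # sampling interval for neuronal recordings (per the slide)

def control_loop(read_spike_counts, decode_velocity, send_joint_command):
    """Hypothetical closed-loop cycle: sample M1 activity every 30 ms,
    decode an endpoint velocity, and command the robotic arm.
    Visual feedback closes the loop through the monkey, not through code."""
    position = [0.0, 0.0, 0.0]  # current endpoint estimate (x, y, z)
    while True:
        t0 = time.monotonic()
        rates = read_spike_counts()        # firing rates for this 30 ms bin
        velocity = decode_velocity(rates)  # e.g. a population-vector decoder
        # Integrate velocity over the bin to update the endpoint position.
        position = [p + v * (BIN_MS / 1000.0) for p, v in zip(position, velocity)]
        send_joint_command(position)       # inverse kinematics happens downstream
        # Sleep out the remainder of the 30 ms bin.
        time.sleep(max(0.0, BIN_MS / 1000.0 - (time.monotonic() - t0)))
```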

Page 8: Neurally Driven Prosthetics

Predicted vs. Actual

Page 9: Neurally Driven Prosthetics

A little computation (don’t be scared..)

• A mathematical algorithm turns electrical signal patterns from the electrodes into a meaningful set of commands for the robotic arm.

• It is a linear, integrative function.

• The population vector algorithm (PVA) is used.

Page 10: Neurally Driven Prosthetics

PVA (Population Vector Algorithm)

• “relies on the directional tuning of each unit, characterized by a single preferred direction in which the unit fires maximally. The real-time population vector is essentially a vector sum of the preferred directions of the units in the recorded population, weighted by the instantaneous firing rates of the units” (Schwartz et al.)
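A minimal numerical sketch of that weighted vector sum, using NumPy. The preferred directions, firing rates, and baseline subtraction are made-up example values and a simplified weighting, not the study’s calibrated parameters.

```python
import numpy as np

# Hypothetical example: 4 units with known preferred directions (unit vectors).
preferred_dirs = np.array([
    [1.0, 0.0],   # unit 1 prefers +x
    [0.0, 1.0],   # unit 2 prefers +y
    [-1.0, 0.0],  # unit 3 prefers -x
    [0.0, -1.0],  # unit 4 prefers -y
])

# Instantaneous firing rates for the current bin (spikes/s), minus each unit's
# baseline rate so that below-baseline firing pulls away from its preferred direction.
rates = np.array([40.0, 25.0, 5.0, 10.0])
baseline = np.array([15.0, 15.0, 15.0, 15.0])

# Population vector: sum of preferred directions weighted by the rate deviations.
weights = rates - baseline
population_vector = weights @ preferred_dirs
print(population_vector)  # [35., 15.] -> points mostly toward +x, slightly +y
```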

Page 11: Neurally Driven Prosthetics

Visualizing PVA

• Neuronal assembly assignment: when the arm moves in one of 8 directions, the excitability of each neuron determines its calibration for that direction. Tuning!
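One common way to estimate a unit’s preferred direction from such 8-direction calibration data is a cosine-tuning fit; the sketch below is illustrative (the example rates are invented), not necessarily the exact calibration procedure used in the study.

```python
import numpy as np

# 8 calibration directions (unit vectors in the movement plane).
angles = np.arange(8) * (2 * np.pi / 8)
directions = np.column_stack([np.cos(angles), np.sin(angles)])

# Hypothetical mean firing rates of one unit for each direction (spikes/s).
rates = np.array([30.0, 42.0, 50.0, 44.0, 28.0, 18.0, 12.0, 20.0])

# Cosine tuning model: rate = b0 + bx*dx + by*dy, solved by least squares.
# The vector (bx, by) points along the unit's preferred direction.
X = np.column_stack([np.ones(8), directions])
b0, bx, by = np.linalg.lstsq(X, rates, rcond=None)[0]

preferred = np.array([bx, by]) / np.hypot(bx, by)
print("preferred direction:", preferred)  # ~[0, 1]: rates peak at the 90-degree target
```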

Page 12: Neurally Driven Prosthetics

Inverse Kinematics

• An additional algorithm is needed to calculate proper joint positioning.

• Since there are so many possible limb positions (shoulder flexion/extension, adduction/abduction, rotation, and elbow flexion/extension), an algorithm computes the most probable joint positioning using inverse kinematics equations, avoiding computational overload.

Page 13: Neurally Driven Prosthetics

Inverse Kinematics

• For the inverse kinematics computation, only the starting and ending points are needed.

• The endpoint in 3D space is computed in real time by integrating the decoded endpoint velocity to obtain the endpoint position, which is then converted into a joint-angle command for the robot.
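To make the endpoint-to-joint-angle step concrete, here is a closed-form inverse kinematics sketch for a planar 2-link arm. This is a simplification: the actual arm has more degrees of freedom (hence the “most probable posture” step above), and the link lengths here are arbitrary example values.

```python
import numpy as np

def two_link_ik(x, y, l1=0.3, l2=0.3):
    """Closed-form inverse kinematics for a planar 2-link arm.
    Returns (shoulder_angle, elbow_angle) in radians that place the endpoint
    at (x, y). A 2-link arm has at most two solutions; a redundant arm like
    the one in the study needs an extra rule to pick the most probable one."""
    r2 = x * x + y * y
    # Elbow angle from the law of cosines (clamped for numerical safety).
    cos_elbow = np.clip((r2 - l1**2 - l2**2) / (2 * l1 * l2), -1.0, 1.0)
    elbow = np.arccos(cos_elbow)  # one of the two mirror-image solutions
    # Shoulder angle: direction to target minus the offset from the bent elbow.
    shoulder = np.arctan2(y, x) - np.arctan2(l2 * np.sin(elbow),
                                             l1 + l2 * np.cos(elbow))
    return shoulder, elbow

# Example: joint angles that place the endpoint at (0.4, 0.2) m.
print(two_link_ik(0.4, 0.2))
```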

Page 14: Neurally Driven Prosthetics

Kinematic Data

Page 15: Neurally Driven Prosthetics

Kinematic Data

Page 16: Neurally Driven Prosthetics

Movement Quality

Different colors represent 4 different targets.
Thin grey lines represent the average over all trials.
Space-filling shapes represent the standard deviation.
Grey balls represent areas where assistance was provided.

Page 17: Neurally Driven Prosthetics

Misc. Methodology

• It takes about 1,000 trials over a period of 1-2 weeks before high accuracy (80-100%) is achieved.

• A training period was necessary, during which small automated velocity vectors were added (in the direction of the target) as assistance.

• The kinematics of robotic arm control mirror the bell-shaped velocity profile of a natural arm, but movements are much slower (3-5 s robotic vs. 1-2 s biological).
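For reference, one standard model that produces such a bell-shaped speed profile is the minimum-jerk trajectory; the sketch below is purely illustrative and is not the profile fit in the study.

```python
import numpy as np

def min_jerk_velocity(t, T, distance):
    """Speed of a minimum-jerk reach of the given distance and duration T.
    Yields the bell-shaped profile mentioned above: zero at the ends,
    peaking at mid-reach."""
    tau = np.clip(t / T, 0.0, 1.0)
    return distance / T * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)

t = np.linspace(0.0, 4.0, 9)  # a ~4 s robotic reach, sampled at 9 points
print(min_jerk_velocity(t, T=4.0, distance=0.25))  # rises, peaks, then falls
```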

Page 18: Neurally Driven Prosthetics

Hurdles

• Engineering: long-term, stable electrodes must be developed if this technology is to be used for prostheses.

• The sizable array of immobile recording, computing, and robotic control hardware must be sized down.

• The system is not autonomous; a trained specialist is needed to supervise it.

• Neuroscientific: feedback is purely visual (clumsiness during training is typical for visual-only guidance); for optimal interaction with a physical environment, somatosensory feedback, mainly from pressure sensors, is needed.

• Velliste et al. recorded from the primary motor cortex, but as mentioned before, many areas of the brain with unique properties produce signals that may be useful in guiding a prosthetic device.

Page 19: Neurally Driven Prosthetics

Areas of great promise

• Patients with paralysis or locked-in syndrome could feasibly interact with their environment again.

• It opens up and makes feasible the broader field of brain-machine interfaces (BMIs), moving beyond prostheses. Theoretically, as long as neural activity and behavior are coupled stereotypically, we can understand how the information is encoded; BMI engineers can decode it and feed the signal to a computer algorithm that produces an output.

Page 20: Neurally Driven Prosthetics

So maybe one day, we’ll finally say BUH-BYE to this…

Page 21: Neurally Driven Prosthetics

And Hello to this…

Other applications:
- Biomimetics (retinal and cochlear implants)
- Neural signal feedback therapy
- Gaming control

Page 22: Neurally Driven Prosthetics

But more likely this.

Page 23: Neurally Driven Prosthetics

Sources

Alexander GE, Crutcher MD. Neural representations of the target (goal) of visually guided arm movements in three motor areas of the monkey. J Neurophysiol. 1990 Jul;64(1):164-78.

Carmena JM, Lebedev MA, Crist RE, O'Doherty JE, Santucci DM, Dimitrov DF, Patil PG, Henriquez CS, Nicolelis MA. Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biol. 2003 Nov;1(2):E42. Epub 2003 Oct 13.

Chapin JK, Moxon KA, Markowitz RS, Nicolelis MA. Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex. Nat Neurosci. 1999 Jul;2(7):664-70.

Crutcher MD, Russo GS, Ye S, Backus DA. Target-, limb-, and context-dependent neural activity in the cingulate and supplementary motor areas of the monkey. Exp Brain Res. 2004 Oct;158(3):278-88. Epub 2004 Jul 29.

Dornay M, Sanger TD. Equilibrium point control of a monkey arm simulator by a fast learning tree structured artificial neural network. Biol Cybern. 1993;68(6):499-508.

Helms Tillery SI, Taylor DM, Schwartz AB. Training in cortical control of neuroprosthetic devices improves signal extraction from small neuronal ensembles. Rev Neurosci. 2003;14(1-2):107-19.

Liu X, Robertson E, Miall RC. Neuronal activity related to the visual representation of arm movements in the lateral cerebellar cortex. J Neurophysiol. 2003 Mar;89(3):1223-37. Epub 2002 Nov 20.

Serruya M, Hatsopoulos N, Fellows M, Paninski L, Donoghue J. Robustness of neuroprosthetic decoding algorithms. Biol Cybern. 2003 Mar;88(3):219-28.

Taylor DM, Tillery SI, Schwartz AB. Information conveyed through brain-control: cursor versus robot. IEEE Trans Neural Syst Rehabil Eng. 2003 Jun;11(2):195-9.

Tillery SI, Taylor DM. Signal acquisition and analysis for cortical control of neuroprosthetics. Curr Opin Neurobiol. 2004 Dec;14(6):758-62. Review.

Velliste M, Perel S, Spalding MC, Whitford AS, Schwartz AB. Cortical control of a prosthetic arm for self-feeding. Nature. 2008 May 28.

Wahnoun R, He J, Helms Tillery SI. Selection and parameterization of cortical neurons for neuroprosthetic control. J Neural Eng. 2006 Jun;3(2):162-71. Epub 2006 May 16.

Wessberg J, Nicolelis MA. Optimizing a linear algorithm for real-time robotic control using chronic cortical ensemble recordings in monkeys. J Cogn Neurosci. 2004 Jul-Aug;16(6):1022-35.

Wessberg J, Stambaugh CR, Kralik JD, Beck PD, Laubach M, Chapin JK, Kim J, Biggs SJ, Srinivasan MA, Nicolelis MA. Real-time prediction of hand trajectory by ensembles of cortical neurons in primates. Nature. 2000 Nov 16;408(6810):361-5.

Video: http://news.nationalgeographic.com/news/2008/05/080530-monkey-video-ap.html
Electrode figure: http://www.crunchgear.com/wp-content/uploads/2008/04/windowslivewriterjapansfirstonbraininterfacebeingresearch-9723microelectrode-thumb.jpg
Inverse kinematics figure: http://staff.aist.go.jp/eimei.oyama/IKPE.GIF
BMI figure: http://www.sms.mavt.ethz.ch/flow_man_machine
Cartoon: http://www.usabilitycorner.com/images/hci.JPG
Terminator figure: http://www.igargoyle.com/archives/t800arm.jpg
Macaque brain figure: http://www.nature.com/nrn/journal/v6/n3/thumbs/nrn1626-f6.jpg
Keyboard figure: www.logitech.com

“Predicted vs. Actual” kinematic trajectory figure: Wessberg et al.

All other figures are from Velliste et al.