Tutorial: Plasticity Revisited - Motivating New Algorithms Based On Recent Neuroscience Research
Tsvi Achler, MD/PhD
Approximate Outline and References for Tutorial
Department of Computer Science
University of Illinois at Urbana-Champaign, Urbana, IL 61801, U.S.A.
Outline
1. Plasticity is observed in many forms. We review experiments and controversies:
• Intrinsic 'membrane plasticity'
• Synaptic
• Homeostatic 'feedback plasticity'
• System: in combination, membrane and feedback plasticity can imitate synaptic plasticity
2. What does this mean for NN algorithms?
Plasticity: Intrinsic, Synaptic, Homeostatic, 'Systems'
Outline: NN Algorithms
Common computational Issues
• Explosion in connectivity
• Explosion in training
• How can nature solve these problems with the plasticity mechanisms outlined?
Algorithms: Synaptic Plasticity, Lateral Inhibition, Feedback Inhibition
1. Plasticity
Intrinsic ‘Membrane’ Plasticity
• Ion channels responsible for activity, spikes
• ‘Plastic’ ion channels found in membrane
• Voltage-sensitive channel types (Ca++, Na+, K+)
• Plasticity independent of synapse plasticity
Review: Daoudal, G. and D. Debanne (2003). "Long-Term Plasticity of Intrinsic Excitability: Learning Rules and Mechanisms." Learn. Mem. 10: 456-465.
Synaptic Plasticity Hypothesis
• Bulk of studies
• Synapse changes with activation
• Motivated by Hebb 1949
• Supported by Long Term Potentiation / Depression (LTP/LTD) experiments
Review: Malenka, R. C. and M. F. Bear (2004). "LTP and LTD: an embarrassment of riches." Neuron 44(1): 5-21.
LTP/LTD Experiment Protocol
• Establish ‘pre-synaptic’ cell
• Establish ‘post-synaptic’ cell
• Raise pre-synaptic activity to an amplitude A50 at which the post-synaptic cell fires 50% of the time
• Induction: high-frequency, high-voltage spike train on both pre & post electrodes
• Plasticity: any changes when A50 is applied
[Figure: pre-synaptic and post-synaptic electrodes in the brain; A50 applied, post-synaptic response measured]
Plasticity: change in post with A50
• LTP : increased activity with A50
• LTD : decreased activity with A50
• Can last minutes, hours, days – limited by how long the recording remains viable
Strongest Evidence
Systems w/minimal feedback:
• Motor, Musculature & tetanic stimulation
• Sensory/muscle junction of the Aplysia gill-siphon reflex
• Early Development: Retina → Ocular Dominance Columns
Variable Evidence
Cortex, Thalamus, Sensory Systems & Hippocampus
• Basic mechanisms still controversial
– 60 years and 13,000 papers in PubMed
• It is difficult to establish/control when LTP or LTD occurs
LTP vs LTD Criteria Are Variable
• Pre-Post spike timing: (Bi & Poo 1998; Markram et al. 1997)
– Pre-synaptic spike before post → LTP
– Post-synaptic spike before pre → LTD
• First spike in burst most important (Froemke & Dan 2002)
• Last spike most important (Wang et al. 2005)
• Frequency most important: high frequency → LTP (Sjöström et al. 2001; Tzounopoulos et al. 2004)
• Spikes are not necessary (Golding et al. 2002; Lisman & Spruston 2005)
Many factors affect LTP & LTD
• Voltage-sensitive channels, e.g. NMDA
• Cell-signaling channels, e.g. via Ca++
• Protein dependent components
• Fast/slow
• Synaptic tagging
Review: Malenka, R. C. and M. F. Bear (2004). "LTP and LTD: an embarrassment of riches." Neuron 44(1): 5-21.
Studies of Morphology Unclear
Synapse Morphology and density studies:
• Spine changes ≠ Function changes
• Many other causes of changes in spines:
– Estrus, Exercise, Hibernation, Epilepsy, Irradiation
Review: Yuste, R. and T. Bonhoeffer (2001). "Morphological changes in dendritic spines associated with long-term synaptic plasticity." Annu Rev Neurosci 24: 1071-89.
Many Components & Variability
• Indicates the system is complex – involving more than just the recorded pre-synaptic and post-synaptic cells
• Means NN learning algorithms are difficult to justify
• But the system regulates itself
Review of LTP & LTD variability: Froemke, R. C., Tsay, I. A., Raad, M., Long, J. D. and Y. Dan (2006). J Neurophysiol 95(3): 1620-9.
Homeostatic Plasticity
Self-Regulating Plasticity
Networks Adapt to:
Channel Blockers
Genetic Expression of Channels
• Establish baseline recording
• Bathe culture in a channel blocker (2 types) – either ↑ or ↓ firing frequency
• Observe system changes after ~1 day
• Washing out the blocker causes the reverse phenomena
Adaptation to Blockers
[Figure: culture dish; pre-synaptic and post-synaptic cells recorded by electrodes]
Homeostatic Adaptation to Blockers
Turrigiano & Nelson (2004)
Frequency × Strength = Baseline:
• ↑ Frequency → ↓ Synaptic Strength
• ↓ Frequency → ↑ Synaptic Strength
This displays a feedback-inhibition response.
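As a toy illustration of this regulation (not from the tutorial: the function, learning rate, and step count are invented for the sketch), a simple scaling rule drives activity back toward the baseline:

```python
# Minimal homeostatic-scaling sketch: adjust synaptic strength until
# frequency x strength returns to the baseline set point.

def homeostatic_scaling(strength, freq, baseline=1.0, rate=0.1, steps=100):
    """Scale synaptic strength toward the set point freq * strength = baseline."""
    for _ in range(steps):
        activity = freq * strength         # proxy for post-synaptic drive
        error = baseline - activity        # >0 under-active, <0 over-active
        strength += rate * error / freq    # move strength against the error
    return strength

# Blocker raises firing frequency -> synapses weaken:
print(homeostatic_scaling(strength=1.0, freq=2.0))   # -> ~0.5
# Blocker lowers firing frequency -> synapses strengthen:
print(homeostatic_scaling(strength=1.0, freq=0.5))   # -> ~2.0
```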
Homeostatic Adaptation to Expression
[Figure: three cells, each with different channels involved]
Marder & Goaillard (2006)
Cells with different numbers & types of channels show same electrical properties
Homeostatic Summary
• Adapts networks to a homeostatic baseline
• Utilizes feedback-inhibition (regulation)
Feedback Inhibition
[Figure: feedback inhibition between pre-synaptic and post-synaptic cells]
Feedback is found ubiquitously throughout the brain.
Feedback Throughout Brain
LaBerge, D. (1997) "Attention, Awareness, and the Triangular Circuit". Consciousness and Cognition, 6, 149-181
Thalamus & Cortex
Modified from Chen, Xiong & Shepherd (2000).
Regulatory Mechanisms Suggest Pre-Synaptic Feedback:
• Feedback loops
• Tri-synaptic connections
• Antidromic activation
• NO (nitric oxide)
• Homeostatic plasticity
Overwhelming Amount of Feedback Inhibition
Figure from Aroniadou-Anderjaska, Zhou, Priest, Ennis & Shipley 2000
Summary
• Homeostatic Plasticity requires and maintains Feedback Inhibition
Feedback Inhibition combined with Intrinsic Plasticity
Can be Indistinguishable from Synaptic Plasticity
‘Systems’ Plasticity
Pre & Post synaptic cells are never in isolation
Studies:
• In Vivo
• Brain slices
• Cultures: only viable with thousands of cells
[Figure: culture dish with pre-synaptic and post-synaptic electrodes among many cells]
Many cells are always present in plasticity experiments
Changes in neuron resting activity are tolerated
Feedback Inhibition Network
LTP protocol: find pre-synaptic and post-synaptic cells; increase pre-synaptic cell activity until the recorded post-synaptic cell fires 50% of the time; then learning is induced artificially by activating both neurons together.
Pre-synaptic cells connect to many post-synaptic cells, so induction can affect all connected post-synaptic cells – but this is rarely considered: only the two recorded cells and the synapse between them are considered.
With pre-synaptic inhibition, immeasurably small changes (∆↑, ∆↓) across all connected neurons can cause a big change in the recorded neuron.
Simulation: Up to 26-Cell Interaction
[Chart: normalized activity scale (0-1). Starting from baseline, a resting ∆ of only 0.01 in all neurons causes a big change (LTP or LTD) in the recorded neuron, while the changes in all the connected neurons remain immeasurable.]
Significance
Experiments cannot distinguish between synaptic plasticity and feedback inhibition:
• Membrane voltage Vm is allowed to change by ~6 mV
• A normalized change of 0.01 corresponds to a ∆Vm of ~0.3 mV
• Thus membrane effects are not likely to be seen
• Pre-synaptic cells connect to many more than 26 cells – the effect is much more pronounced in real networks
Regulatory Feedback Plasticity
• Feedback Inhibition + Intrinsic Plasticity are indistinguishable in current experiments from Synaptic Plasticity theory
• Why have ‘apparent’ synaptic plasticity?
• Feedback Inhibition is important for processing simultaneous patterns
2. Algorithms
Algorithms: Synaptic Plasticity, Lateral Inhibition, Feedback Inhibition
[Figure: network with inputs x1-x4 connected to outputs Y1-Y3 through weights w11-w43]
Challenges In Neural Network Understanding
Large Network Problems
• Lateral connections: connectivity explosion
• Limited cognitive intuition: what would a weight variable of, say, 0.8 between two representations mean?
[Figure: lateral weights lw12, lw13, lw23 between output nodes]
Millions of representations are possible → a connection is required to logically relate every pair of representations.
Lateral Connectivity
Can lead to an implausible number of connections and variables: every representation cannot be connected to all others in the brain.
Combinatorial Explosion in Connectivity
[Figure repeated: the same network with weights w11-w43 and lateral connections lw12, lw13, lw23]
Large Network Problems
• Weights: combinatorial training
• Lateral connections: connectivity explosion
Superposition Catastrophe
• Teach A B C … Z separately
• Test multiple simultaneous letters
Not taught with simultaneous patterns → will not recognize simultaneous patterns.
Teaching simultaneous patterns is a combinatorial problem.
[Figure: letters A D G E presented simultaneously]
Weights: Training Difficulty Given Simultaneous Patterns
[Figure: network with inputs x1-x4, outputs Y1-Y3, weights w11-w43]
• Teach A B C … Z separately
• Test multiple simultaneous letters
'Superposition Catastrophe' (Rosenblatt 1962)
One can try to avoid this by segmenting each pattern individually, but segmentation often requires recognition itself, or is not possible.
Composites Common
Segmentation is not trivial, and not possible in most modalities:
• Natural scenarios (cluttered rainforest)
• Scenes
• Noisy 'cocktail party' conversations
• Odorant or taste mixes
(segmentation itself requires recognition?)
If the image can't be segmented, the composite must be interpreted. Example: Chick and Frog presented simultaneously sum feature-wise, e.g. 0101 + 1001 = 1102.
Segmenting Composites
New scenario: learn individual patterns, then interpret their composites.
[Figure: network with inputs x1-x4, outputs Y1-Y3, weights w11-w43]
Challenges In Neural Network Understanding
Large Network Problems
• Weights: combinatorial training
• Lateral connections: connectivity explosion
• Feedback inhibition: avoids combinatorial issues, interprets composites
Feedback Inhibition
Every output inhibits only its own inputs
• Gain-control mechanism for each input
• Massive feedback to inputs
• Iteratively evaluates input use
• Avoids optimized weight parameters
[Figure: input-output feedback loop shown from a control-theory perspective and a neuroscience perspective]
[Figure: network with inputs x1, x2, inhibition nodes I1, I2, and outputs ya, yb]
Feedback Inhibition Equations
Variables:
• X_b: raw input activity at input b
• Q_b: feedback onto input b
• I_b: input b after feedback
• Y_a: output activity of node a
• n_a: number of connections of Y_a

Feedback (each input is inhibited by the sum of all outputs that use it):
Q_b(t) = \sum_{j \in N_b} Y_j(t)
In the example network: Q_1 = y_a + y_b, Q_2 = y_b.

Inhibition (shunting):
I_b(t) = X_b / Q_b(t)

Output update:
Y_a(t + \Delta t) = \frac{Y_a(t)}{n_a} \sum_{i \in N_a} I_i(t)

Repeat until steady state: no oscillations, no chaos.
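These update equations translate directly into a short program. Below is a minimal sketch, not the tutorial's own code: the function name, the binary connectivity matrix C, and the eps guard against division by zero are assumptions made for illustration.

```python
# Minimal sketch of the regulatory feedback update rule above.
# C[a][b] = 1 when output a uses input b.
import numpy as np

def feedback_inhibition(C, x, n_iter=200, eps=1e-9):
    """Iterate Q = C^T y,  I = x / Q,  y <- (y / n_a) * (C @ I)  to steady state."""
    C = np.asarray(C, dtype=float)    # shape: (outputs, inputs)
    x = np.asarray(x, dtype=float)    # raw input activities X_b
    n = C.sum(axis=1)                 # n_a: number of inputs each output uses
    y = np.ones(C.shape[0])           # start every output at 1
    for _ in range(n_iter):
        Q = C.T @ y                   # feedback onto each input
        I = x / (Q + eps)             # shunting inhibition of each input
        y = (y / n) * (C @ I)         # output update
    return y

# The example network: ya uses x1 only; yb uses x1 and x2.
C = [[1, 0],
     [1, 1]]
print(feedback_inhibition(C, [2, 1]))  # -> ~[1, 1]
```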
Feedback Inhibition
Simple Connectivity
[Figure: output nodes Y1-Y4 connected through input nodes I1-I4 to inputs x1-x4]
• A new node only connects to its own inputs (avoiding the source of connectivity problems)
• All links have the same strength (avoiding the source of training problems)
• Inputs have positive real values indicating intensity
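Since a new node is just a row of equal-strength links to its own inputs, nodes can be added without touching the rest of the network. A minimal sketch (the helper name add_node and its arguments are mine, building on the illustrative feedback_inhibition function above):

```python
import numpy as np

def add_node(C, inputs_used):
    """Append an output node that connects, with equal strength, only to its inputs."""
    C = np.asarray(C, dtype=float)
    row = np.zeros((1, C.shape[1]))
    row[0, list(inputs_used)] = 1.0   # nothing else is changed or re-learned
    return np.vstack([C, row])

C = [[1, 0]]                # a network that only knows 'P' (uses x1)
C = add_node(C, [0, 1])     # add 'R' (uses x1 and x2) without retraining
```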
Feedback Inhibition Allows Modular Combinations
[Figure: outputs Y1 'P' and Y2 'R' over inputs I1, I2; pattern 'P' = 10, 'R' = 11]
Feedback Inhibition Interprets Composite Patterns
Supports non-binary inputs; inputs can simultaneously support both outputs.
Behaves as if there is an inhibitory connection from x2 to y1, yet there is no direct connection between x2 & y1.

Network configuration steady states:
Inputs x1, x2 → Outputs y1, y2
1, 0 → 1, 0   ('P')
1, 1 → 0, 1   ('R')
2, 1 → 1, 1   (P & R)
2, 2 → 0, 2   (2 Rs)

General solution:
if x1 ≥ x2:  y1 = x1 − x2,  y2 = x2
if x1 ≤ x2:  y1 = 0,  y2 = (x1 + x2)/2
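A short check (not in the slides) that this table follows from the steady-state conditions of the equations above: any nonzero fixed point forces each active output's average feedback-corrected input to 1.

```latex
% Two-node network: y_1 uses x_1; y_2 uses x_1 and x_2.
\begin{aligned}
y_1 &= y_1 I_1                       &&\Rightarrow\ I_1 = 1 \quad (\text{if } y_1 > 0)\\
y_2 &= \tfrac{1}{2}\, y_2 (I_1 + I_2) &&\Rightarrow\ I_2 = 1\\
I_1 &= \frac{x_1}{y_1 + y_2} = 1      &&\Rightarrow\ y_1 + y_2 = x_1\\
I_2 &= \frac{x_2}{y_2} = 1            &&\Rightarrow\ y_2 = x_2,\quad y_1 = x_1 - x_2.
\end{aligned}
% If x_1 < x_2 this would force y_1 < 0; instead y_1 -> 0 and the y_2
% update gives y_2 = (x_1 + x_2)/2, matching the second row of the table.
```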
Algorithm: Iterative Evaluation
[Figure: outputs Y1, Y2 over inhibition nodes I1, I2 and inputs x1, x2]
Feedback Inhibition Algorithm: How it Works
[Figures: the algorithm alternates forward passes (inputs activate outputs) and back passes (outputs inhibit their inputs); nodes marked active (1) or inactive (0); input 11 presented]
• Initially both outputs become active
• I1 gets twice as much inhibition as I2 (both Y1 and Y2 feed back onto I1; only Y2 feeds back onto I2)
• This affects Y1 more than Y2, since Y1 relies on I1 alone
• This separation continues iteratively
• Until the most encompassing representation predominates: input 11 matches 'R' (11) rather than 'P' (10)
[Graph of dynamics: over simulation time T, Y1's activity decays toward 0 while Y2's settles at 1, reaching steady state]
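The same run can be reproduced with the illustrative feedback_inhibition sketch given after the equations:

```python
# 'P' uses input 1 only; 'R' uses inputs 1 and 2 (as in the figures above).
C = [[1, 0],   # Y1 = 'P'
     [1, 1]]   # Y2 = 'R'
print(feedback_inhibition(C, [1, 1]))  # -> ~[0, 1]: 'R' predominates
print(feedback_inhibition(C, [1, 0]))  # -> ~[1, 0]: 'P' alone remains
```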
Demonstration: Applying Learned Information to New Scenarios
• Nonlinear: mathematical analysis difficult – demonstrated via examples
• Teach patterns separately
• Test novel pattern combinations
• Requires decomposition of composite
• Letter patterns are used for intuition
• Learn A B C … Z separately
Teach single patterns only:
A = 01001 …, B = 11000 …, C = 01011 …, D = 10101 …, E = 11011 …, and so on (26 nodes)
[Figure: features connected to letter nodes by modular combination]
This defines the network; nothing is changed or re-learned further.
Comparison networks are trained & tested with the same patterns:
– Neural Networks (NN)*, representing synaptic plasticity
– Lateral Inhibition (winner-take-all with ranking of winners)
* Waikato Environment for Knowledge Analysis (WEKA) repository tool for the most recent and best algorithms
Tests: Increasingly Complex
• 26 patterns presented one at a time – all methods recognize 100%
• Choose 2 letters, present simultaneously – either union (logical 'or') of features or add features (325 combinations)
• Choose 4 letters, present simultaneously – either add or 'or' features (14,950 combinations); include repeats in the add case (i.e. 'A+A+A+A'), giving 456,976 combinations
For example: A = 01001 …, B = 11000 …; union A|B = 11001 … or addition A+B = 12001 … is presented to the networks.
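A small sketch of how such composite test vectors and combination counts can be generated (variable names are illustrative; the letter patterns are truncated to the five features shown on the slides):

```python
# Build 'or' and 'add' composites from binary letter patterns, and count
# the test combinations quoted above.
from math import comb

A = [0, 1, 0, 0, 1]                          # truncated 'A' pattern
B = [1, 1, 0, 0, 0]                          # truncated 'B' pattern

union = [max(a, b) for a, b in zip(A, B)]    # A|B -> [1, 1, 0, 0, 1]
added = [a + b for a, b in zip(A, B)]        # A+B -> [1, 2, 0, 0, 1]

print(comb(26, 2))   # 325 two-letter combinations
print(comb(26, 4))   # 14,950 four-letter combinations
print(26 ** 4)       # 456,976 quadruples when repeats are allowed
```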
Two Patterns Simultaneously
[Chart: % of the 325 combinations in which 0/2, 1/2, or 2/2 letters are correctly classified, Feedback Inhibition vs Lateral Inhibition. Train 26 nodes, test with 2 patterns: do the 2 top nodes match?]
Simultaneous Patterns: Union of Four Patterns
For example: A (01001 …) | C (01011 …) | D (10101 …) | E (11011 …) = A|C|D|E (11111 …), presented to the network.
[Chart: % of the 14,950 combinations in which 0/4 … 4/4 letters are correctly classified, Feedback Inhibition vs Lateral Inhibition. Same 26 nodes, test with 4 patterns: do the 4 top nodes match?]
Union of Five Patterns:
[Chart: % of the 65,780 combinations in which 0/5 … 5/5 letters are correctly classified, Feedback Inhibition vs Lateral Inhibition. Same 26 nodes, test with 5 patterns: do the 5 top nodes match?]
Pattern Addition
For example: A (01001 …) + C (01011 …) + D (10101 …) + E (11011 …) = A+C+D+E (23124 …), presented to the network.
Addition improves feedback inhibition performance further.
Addition of Four Patterns:
[Chart: % of the 14,950 combinations in which 0/4 … 4/4 letters are correctly classified (e.g. X C O M, K S A V), comparing Pre-Synaptic (Feedback) Inhibition, Lateral Inhibition, and Synaptic Plasticity. Same 26 nodes, test with 4 patterns: do the 4 top nodes match?]
Addition of Eight Patterns:
[Chart: % of the 1,562,275 combinations in which 0/8 … 8/8 letters are correctly classified (e.g. A G B L C D X E), Feedback Inhibition vs Lateral Inhibition. Same 26 nodes, test with 8 patterns: do the 8 top nodes match?]
With Addition the Feedback Algorithm Can Count
Repeated patterns are reflected in the value of the corresponding nodes. For example:
A (01001 …) + B (11000 …) + B (11000 …) + C (01011 …) = A+B+B+C (24012 …)
Nodes: A=1, B=2, C=1, D→Z=0
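This counting behavior can be checked with the illustrative feedback_inhibition sketch from the equations section, under the assumption (mine, for illustration) that the five features shown are the complete patterns:

```python
C = [[0, 1, 0, 0, 1],   # A = 01001
     [1, 1, 0, 0, 0],   # B = 11000
     [0, 1, 0, 1, 1]]   # C = 01011
x = [2, 4, 0, 1, 2]     # A + B + B + C = 24012
print(feedback_inhibition(C, x))  # -> ~[1, 2, 1]: the counts are recovered
```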
100% correct over all 456,976 combinations.
Tested on Random Patterns
• 50 randomly generated patterns
• From 512 features
• 4 presented at a time
• 6,250,000 combinations (including repeats)
• 100% correct, including count
(The computer starts getting slow.)
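A sketch of such a test harness under the same illustrative assumptions (random binary patterns; only a single draw from the 6,250,000 ordered quadruples is checked here):

```python
import numpy as np

rng = np.random.default_rng(0)
C = (rng.random((50, 512)) < 0.5).astype(float)   # 50 random binary patterns

picks = rng.integers(0, 50, size=4)               # 4 patterns, repeats allowed
x = C[picks].sum(axis=0)                          # additive composite
y = feedback_inhibition(C, x, n_iter=1000)        # illustrative sketch from above
counts = np.bincount(picks, minlength=50)
print(np.argsort(y)[-4:], sorted(picks))  # top-activity nodes should be the picks
print(np.round(y[picks], 2), counts[picks])  # activities ~ pattern counts
```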
A+B (12001 …): this vector is 'A' & 'B' together.
A+C+D+E (23124 …): this vector is 'A', 'C', 'D' & 'E' together.
Insight: What if Conventional Algorithms are Trained for this Task?
• Teach pairs: 325 combinations
• Teach triples: 2,600 combinations
• Teach quadruples: 14,950 combinations
• Training complexity increases combinatorially
[Figure: network with weights w11-w43; simultaneous letters such as M P L K S A V]
With 26 letters, training on all combinations is not practical.
Furthermore, ABCD can be misinterpreted as AB & CD, or ABC & D.
Training Difficulty Given Simultaneous Patterns
[Figure: network with weights w11-w43; letters A D G E presented simultaneously]
Known as the 'Superposition Catastrophe' (Rosenblatt 1962; Rachkovskij & Kussul 2001).
Feedback inhibition inference seems to avoid this problem.
Binding Problem: Simultaneous Representations
Chunking features: computer algorithms have similar problems with simpler representations.
[Figure: outputs y1 'Wheels', y2 'Barbell', y3 'Car Chassis' over inputs x1, x2, x3]
Given the inputs, all patterns are matched unless the network is explicitly trained otherwise. However, it is a binding error to call this a barbell.
Simultaneous representations cause the binding problem.
Binding Comparison
[Chart: steady-state vector activities (0-1) for y1 'Wheels', y2 'Barbell', y3 'Car Chassis' under Feedback Inhibition vs Lateral Inhibition, inputs x1, x2, x3]
Binding: Network-Wide Solution
Inputs x1, x2, x3 → Outputs y1, y2, y3
1, 0, 0 → 1, 0, 0   (Wheels)
1, 1, 0 → 0, 1, 0   (Barbell)
1, 1, 1 → 1, 0, 1   (Car: wheels + chassis, not Barbell)
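These steady states can be reproduced with the illustrative feedback_inhibition sketch, using connectivity read off the figure (my assumption: wheels use x1; barbell x1 and x2; car chassis x2 and x3):

```python
C = [[1, 0, 0],   # y1 'Wheels'
     [1, 1, 0],   # y2 'Barbell'
     [0, 1, 1]]   # y3 'Car Chassis'
print(feedback_inhibition(C, [1, 1, 1]))  # -> ~[1, 0, 1]: wheels + chassis
print(feedback_inhibition(C, [1, 1, 0]))  # -> ~[0, 1, 0]: barbell
```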
Network Under Dynamic Control
• Recognition is inseparable from attention
• Feedback: an automatic way to access inputs
• 'Symbolic' control via bias
Symbolic Effect of Bias
Is the barbell present? Interested in y2: bias y2 = 0.15.
Inputs (1, 1, 1) → Outputs (0.02, 0.98, 0.71): with the bias, the network reports the barbell (y2).
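The slides do not spell out how the bias enters the dynamics; one plausible guess, shown purely as an assumption, is to add a small constant to the favored output on every iteration of the illustrative sketch:

```python
import numpy as np

def feedback_inhibition_biased(C, x, bias, n_iter=500, eps=1e-9):
    """Regulatory feedback sketch with a hypothetical additive output bias."""
    C = np.asarray(C, dtype=float)
    x = np.asarray(x, dtype=float)
    n = C.sum(axis=1)
    y = np.ones(C.shape[0])
    for _ in range(n_iter):
        Q = C.T @ y                     # feedback onto inputs
        I = x / (Q + eps)               # shunting inhibition
        y = (y / n) * (C @ I) + bias    # hypothetical bias term
    return y

bias = np.array([0.0, 0.15, 0.0])       # "interested in y2 (barbell)"
print(feedback_inhibition_biased([[1, 0, 0], [1, 1, 0], [0, 1, 1]],
                                 [1, 1, 1], bias))
# Per the slides, the biased network shifts toward the barbell reading,
# e.g. outputs near (0.02, 0.98, 0.71).
```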
Summary
• Feedback inhibition combined with intrinsic plasticity generates a ‘systems’ plasticity that looks like synaptic plasticity
• Feedback inhibition gives algorithms more flexibility with simultaneous patterns
• Brain processing and learning are still unclear: a paradigm shift is likely needed
Acknowledgements
Eyal Amir
Cyrus Omar, Dervis Vural, Vivek Srikumar
Intelligence Community Postdoc Program & National Geospatial-Intelligence Agency
HM1582-06--BAA-0001