Neural Basis of Cognition
Lecture 7
Learning and Memory: Part III
LTP/LTD
• In the last lecture, we briefly explored a candidate biochemical pathway for the induction of LTP/LTD
• Questions:
– Can we model the induction of LTP/LTD?
– Is this pathway plausible? That is, if we model this pathway, will our model neuron exhibit LTP/LTD as observed in real neurons?
– If a neural network is equipped with such rules, can it “learn”?
LTP/LTD
• Can we model the induction of LTP/LTD? Yes.
• Doing so will require some background.
A Mathematical Detour
• Next, we are going to consider the Hodgkin–Huxley model of the neuron. In order to do that, we will first review several topics:
– Ordinary differential equations
– Equilibrium potential
– Electrical circuits
• These tools will give us the ability to understand the basics of the Hodgkin–Huxley model
Differential Equations
• Given a function f(x), the average change in the function over an interval h starting at a point x is “rise over run,” that is, the change in f(x) divided by the change in x:
– [f(x+h) – f(x)] / h
Differential Equations
• As we decrease h more and more until it approaches 0, the secant line becomes a line tangent to f(x) at the point x.
Differential Equations
• If we do not choose the point x at which we want to find the tangent line of f(x) and instead leave x as a variable, we can find the general equation for the slope of the tangent line of f(x) at any point x; this equation is the derivative of f(x), denoted f’(x) or df(x)/dx
Differential Equations
• Example: f(x) = x^2
[f(x+h) – f(x)] / h
= [(x+h)^2 – x^2] / h
= [(x^2 + 2xh + h^2) – x^2] / h
= (2xh + h^2) / h
= 2x + h
lim (h→0) 2x + h = 2x
So, the derivative of x^2 is 2x: f’(x) = 2x
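The limit above can be checked numerically: shrinking h in the difference quotient drives the estimate toward 2x. A minimal Python sketch (the sample point and step sizes are arbitrary illustrative choices):

```python
# Numerically verify that the derivative of f(x) = x^2 is 2x by
# shrinking h in the difference quotient [f(x+h) - f(x)] / h.

def difference_quotient(f, x, h):
    """Average rate of change of f over [x, x+h]: "rise over run"."""
    return (f(x + h) - f(x)) / h

f = lambda x: x ** 2
x = 3.0
for h in [1.0, 0.1, 0.001]:
    # As h -> 0, the estimate approaches f'(3) = 2 * 3 = 6
    print(h, difference_quotient(f, x, h))
```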
Differential Equations
• What does the derivative mean? The derivative of a function describes how quickly the function changes as its input changes.
• If f(x) = x^2, then f’(x) = 2x; this means that at any point x, f(x) is increasing at a rate equal to twice that x.
• Therefore, if we observe that some quantity changes at a rate of 2x, we know that the function giving the value of that quantity at any point x has derivative 2x, and so the quantity itself is described by x^2 (up to an additive constant).
Differential Equations
• The process of finding a function from its derivative is called antidifferentiation or, more commonly, integration.
• When describing physical systems, it is often easier to observe how a system changes based on its current state and to write that down as an equation, called a differential equation; integrating this equation leads to an equation that can describe the state of the system at any point in time.
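As a concrete illustration, a differential equation can also be integrated numerically. A minimal Euler's-method sketch recovers y = x^2 from its derivative dy/dx = 2x (the step size and interval are arbitrary choices):

```python
# Integrate the differential equation dy/dx = 2x with Euler's method,
# starting from y(0) = 0; the result approximates y = x^2 up to an
# error that shrinks with the step size dx.

def euler(dydx, y0, x_end, dx):
    """Step y forward from x = 0 to x_end using y += dy/dx * dx."""
    x, y = 0.0, y0
    while x < x_end:
        y += dydx(x) * dx
        x += dx
    return y

y = euler(lambda x: 2 * x, 0.0, 2.0, 1e-4)
print(y)  # close to 2^2 = 4
```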
Equilibrium potential
• The concentrations of an ion inside and outside of a cell are maintained by a combination of electrical and chemical gradients
• The Nernst equation can be used to calculate the equilibrium potential of an ion of charge z across a membrane. This potential is determined using the concentrations of the ion inside and outside the cell when it is at rest:
– E = (RT/zF) ln([outside]/[inside])
• R = universal gas constant, 8.314 Joules / (K mol)
• T = temperature in Kelvin
• z = charge of the ion
• F = Faraday’s constant, 96485 Coulombs/mol
• When the membrane potential is equal to E, there is no net flux of this ion across the membrane
Equilibrium Potential
• Using the Nernst equation, we can calculate the equilibrium potentials of Na+ and K+ in a neuron:
– If, at resting potential, K+ has concentrations 5 mM outside and 140 mM inside the cell,
• E_K = (RT/F) ln(5/140) = –0.089 V = –89 mV
– If, at resting potential, Na+ has concentrations 12 mM inside and 140 mM outside the cell,
• E_Na = (RT/F) ln(140/12) = 0.066 V = 66 mV
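These values can be reproduced directly from the Nernst equation. A minimal Python sketch, assuming body temperature T = 310 K (the slide does not state the temperature it used):

```python
import math

# Nernst equation: E = (RT / zF) ln([outside] / [inside]),
# evaluated for the K+ and Na+ concentrations on this slide.

R = 8.314    # universal gas constant, J / (K mol)
F = 96485.0  # Faraday's constant, C / mol
T = 310.0    # temperature in Kelvin (assumed: body temperature)

def nernst(z, outside_mM, inside_mM):
    """Equilibrium potential in volts for an ion of charge z."""
    return (R * T) / (z * F) * math.log(outside_mM / inside_mM)

E_K = nernst(z=1, outside_mM=5.0, inside_mM=140.0)    # about -0.089 V
E_Na = nernst(z=1, outside_mM=140.0, inside_mM=12.0)  # about +0.066 V
print(E_K, E_Na)
```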
Electrical Circuits
• An electrical circuit is a closed loop through which current (electricity) runs.
– “The voltage between two points is a short name for the electrical force that would drive an electric current between those points.”
– “Electric current means, depending on the context, a flow of electric charge (a phenomenon) or the rate of flow of electric charge (a quantity). This flowing electric charge is typically carried by moving electrons, in a conductor such as wire….”
Electrical Circuits
• “A resistor is a two-terminal electronic component that produces a voltage across its terminals that is proportional to the electric current passing through it.”
• “A capacitor is a passive electronic component consisting of a pair of conductors separated by a dielectric (insulator). When a voltage exists across the conductors, an electric field is present in the dielectric. This field stores energy.”
– “An ideal capacitor is wholly characterized by a constant capacitance C, defined as the ratio of charge ±Q on each conductor to the voltage V between them: C = Q/V”
Hodgkin–Huxley Model
The Hodgkin–Huxley model is based on an “equivalent circuit” for the neuron. The membrane is modeled as a capacitor, the voltage-gated channels as resistors, and the electrochemical gradient across the membrane for each ion as a voltage source.
In 1952, Hodgkin, Huxley, and Katz, based on their experimental work with the squid giant axon, published the model describing how action potentials in neurons are initiated and propagated. Hodgkin and Huxley received the 1963 Nobel Prize in Physiology or Medicine for this work. Their model has since been significantly expanded.
Hodgkin–Huxley Model
• The current through each of the resistors (i.e. the current through each of the types of channels) is proportional to the difference between the voltage drop across the membrane and the relevant ion’s reversal potential, (V_m – E_k), where k is the channel type:
i_k = –g_k (V_m – E_k), so
I_m = –Σ_k g_k (V_m – E_k)
• The proportionality factor g_k is the conductance of the channel (the conductance of the resistor in the circuit). However, the conductance is generally not a constant and instead changes nonlinearly with the potential of the neuron.
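The sum I_m = –Σ_k g_k (V_m – E_k) can be sketched directly in code. The conductance and reversal-potential values below are illustrative placeholders, not Hodgkin and Huxley's fitted parameters:

```python
# Total membrane current as the sum of per-channel currents
# i_k = -g_k * (V_m - E_k), one term per channel type k.

def membrane_current(V_m, channels):
    """I_m = -sum_k g_k (V_m - E_k); channels maps name -> (g_k, E_k)."""
    return -sum(g * (V_m - E) for g, E in channels.values())

channels = {               # (conductance, reversal potential) -- illustrative
    "K":    (36.0, -89.0),
    "Na":   (120.0, 66.0),
    "leak": (0.3, -54.0),
}
print(membrane_current(-65.0, channels))
```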
Hodgkin–Huxley Model
• Recall that the capacitance of a capacitor (which is a constant) is described by the equation C = Q/V; rearrange and differentiate with respect to time:
C_m V_m = Q
C_m dV_m/dt = dQ/dt
and the change in charge over time is current, so
C_m dV_m/dt = I_m
and I_m was defined above, so
C_m dV_m/dt = –Σ_k g_k (V_m – E_k)
• This is the fundamental equation describing the Hodgkin–Huxley equivalent circuit.
• Further complexity arises from the fact that g_k is also dependent on voltage, which can make this differential equation more difficult to solve.
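With the conductances g_k frozen at constant values (the full model makes them voltage-dependent), the fundamental equation becomes a linear ODE that can be integrated with Euler's method. A minimal sketch; all parameter values are illustrative:

```python
# Euler integration of the equivalent-circuit equation
#   C_m dV_m/dt = -sum_k g_k (V_m - E_k)
# with constant conductances. With g_k fixed, V_m relaxes to the
# conductance-weighted average of the reversal potentials.

C_m = 1.0                    # membrane capacitance (illustrative units)
channels = [(36.0, -89.0),   # (g_k, E_k) for a K+-like channel
            (120.0, 66.0)]   # (g_k, E_k) for a Na+-like channel

def simulate(V0, t_end, dt):
    """Step V_m forward from t = 0 to t_end; returns the final V_m."""
    V = V0
    for _ in range(int(t_end / dt)):
        dVdt = -sum(g * (V - E) for g, E in channels) / C_m
        V += dVdt * dt
    return V

V_final = simulate(V0=-65.0, t_end=1.0, dt=1e-4)
print(V_final)  # near (36*(-89) + 120*66) / (36 + 120)
```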
LTP/LTD
• Is this pathway plausible? Yes.
LTP/LTD
• “Calcium Time Course as a Signal for Spike-Timing-Dependent Plasticity” by Rubin et al. 2005
– Two-compartment model, for the dendrite and the soma
– “The mechanism for plasticity in our model involves a biophysically plausible calcium detection system that responds to calcium and then changes the strength of the synapse accordingly. In the model, three detector agents (P, A, V) respond to the instantaneous calcium level in the spine compartment. The interactions of these three agents, together with two others (D, B) that they influence, act to track the calcium time course in the spine (see Fig. 3A). More specifically, different calcium time courses lead to different time courses of P and D, which compete to influence a plasticity variable W. This variable W is used as a measure of the sign and magnitude of synaptic strength changes from baseline. Note that this scheme is significantly different from a detection of peak calcium levels, in which the change in W would be determined by how large spine calcium becomes during an appropriate set of spikes. The interactions between agents within our detector system qualitatively resemble the pathways influencing the regulation of Ca2+/calmodulin-dependent protein kinase II (CaMKII).”
Rubin, J. E., R. C. Gerkin, et al. (2005). “Calcium time course as a signal for spike-timing-dependent plasticity.” J Neurophysiol 93(5): 2600–13.
LTP/LTD
Rubin, J. E., R. C. Gerkin, et al. (2005). “Calcium time course as a signal for spike-timing-dependent plasticity.” J Neurophysiol 93(5): 2600–13.
“Calcium has been proposed as a postsynaptic signal underlying synaptic spike-timing-dependent plasticity (STDP). We examine this hypothesis with computational modeling based on experimental results from hippocampal cultures… in which pairs and triplets of pre- and postsynaptic spikes induce potentiation and depression in a temporally asymmetric way. Specifically, we present a set of model biochemical detectors, based on plausible molecular pathways, which make direct use of the time course of the calcium signal to reproduce these experimental STDP results…. Simulations of our model are also shown to be consistent with classical LTP and LTD induced by several presynaptic stimulation paradigms.”
LTP/LTD
• If a neural network is equipped with such rules, can it “learn”? Yes.
LTP/LTD
• “Learning Real-World Stimuli in a Neural Network with Spike-Driven Synaptic Dynamics” by Brader, Senn, and Fusi, 2007
– “We present a model of spike-driven synaptic plasticity inspired by experimental observations and motivated by the desire to build an electronic hardware device that can learn to classify complex stimuli in a semisupervised fashion. During training, patterns of activity are sequentially imposed on the input neurons, and an additional instructor signal drives the output neurons toward the desired activity. The network is made of integrate-and-fire neurons with constant leak and a floor. The synapses are bistable, and they are modified by the arrival of presynaptic spikes. The sign of the change is determined by both the depolarization and the state of a variable that integrates the postsynaptic action potentials. Following the training phase, the instructor signal is removed, and the output neurons are driven purely by the activity of the input neurons weighted by the plastic synapses. In the absence of stimulation, the synapses preserve their internal state indefinitely. Memories are also very robust to the disruptive action of spontaneous activity. A network of 2000 input neurons is shown to be able to classify correctly a large number (thousands) of highly overlapping patterns (300 classes of preprocessed Latex characters, 30 patterns per class, and a subset of the NIST characters data set) and to generalize with performances that are better than or comparable to those of artificial neural networks. Finally we show that the synaptic dynamics is compatible with many of the experimental observations on the induction of long-term modifications (spike-timing-dependent plasticity and its dependence on both the postsynaptic depolarization and the frequency of pre- and postsynaptic neurons).”
Brader, J. M., W. Senn, et al. (2007). “Learning real-world stimuli in a neural network with spike-driven synaptic dynamics.” Neural Comput 19(11): 2881–912.
LTP/LTD
“A network of 2000 input neurons is shown to be able to classify correctly a large number (thousands) of highly overlapping patterns (300 classes of preprocessed Latex characters, 30 patterns per class, and a subset of the NIST characters data set) and to generalize with performances that are better than or comparable to those of artificial neural networks.”
Brader, J. M., W. Senn, et al. (2007). “Learning real-world stimuli in a neural network with spike-driven synaptic dynamics.” Neural Comput 19(11): 2881–912.
LTP/LTD
“Figure 6: The full Latex data set containing 293 classes. (a) Percentage of nonclassified patterns in the training set as a function of the number of classes for different numbers of output units per class. Results are shown for 1 (∗), 2, 5, 10, and 40 (+) outputs per class using the abstract rule (points) and for 1, 2, and 5 outputs per class using the spike-driven network (squares, triangles, and circles, respectively). In all cases, the percentage of misclassified patterns is less than 0.1%. (b) Percentage of nonclassified patterns as a function of the number of output units per class for different numbers of classes (abstract rule). Note the logarithmic scale. (c) Percentage of nonclassified patterns as a function of number of classes for generalization on a test set (abstract rule). (d) Same as for c but showing percentage of misclassified patterns.”
Brader, J. M., W. Senn, et al. (2007). “Learning real-world stimuli in a neural network with spike-driven synaptic dynamics.” Neural Comput 19(11): 2881–912.