{"title": "Signal Detection in Noisy Weakly-Active Dendrites", "book": "Advances in Neural Information Processing Systems", "page_first": 132, "page_last": 138, "abstract": null, "full_text": "Signal Detection in Noisy Weakly-Active \n\nDendrites \n\nAmit Manwani and Christof Koch \n\n{quixote,koch}@klab.caltech.edu \nComputation and Neural Systems Program \n\nCalifornia Institute of Technology \n\nPasadena, CA 91125 \n\nAbstract \n\nHere we derive measures quantifying the information loss of a synaptic \nsignal due to the presence of neuronal noise sources, as it electrotonically \npropagates along a weakly-active dendrite. We model the dendrite as an \ninfinite linear cable, with noise sources distributed along its length. The \nnoise sources we consider are thermal noise, channel noise arising from \nthe stochastic nature of voltage-dependent ionic channels (K+ and Na+) \nand synaptic noise due to spontaneous background activity. We assess the \nefficacy of information transfer using a signal detection paradigm where \nthe objective is to detect the presence/absence of a presynaptic spike from \nthe post-synaptic membrane voltage. This allows us to analytically assess \nthe role of each of these noise sources in information transfer. For our \nchoice of parameters, we find that the synaptic noise is the dominant \nnoise source which limits the maximum length over which information \nbe reliably transmitted. \n\n1 Introduction \n\nThis is a continuation of our efforts (Manwani and Koch, 1998) to understand the informa(cid:173)\ntion capacity ofa neuronal link (in terms of the specific nature of neural \"hardware\") by a \nsystematic study of information processing at different biophysical stages in a model of a \nsingle neuron. Here we investigate how the presence of neuronal noise sources influences \nthe information transmission capabilities of a simplified model of a weakly-active dendrite. 
\nThe noise sources we include are, thermal noise, channel noise arising from the stochastic \nnature of voltage-dependent channels (K+ and Na+) and synaptic noise due to spontaneous \nbackground activity. We characterize the noise sources using analytical expressions of their \ncurrent power spectral densities and compare their magnitudes for dendritic parameters re(cid:173)\nported in literature (Mainen and Sejnowski, 1998). To assess the role of these noise sources \non dendritic integration, we consider a simplified scenario and model the dendrite as a lin-\n\n\fSignal Detection in Noisy Weakly-Active Dendrites \n\n,(y'L \n\nlsynapse \n\nCable \n\n133 \n\n__ .... ~I Optimal \nDetector \n\nSpike Pe \n\nNo spike \n\n\\ / Measurement \nv \n\nt t t t t t t t t t t t t t t t t t t t t t f \n\ny \n\nNoise Sources \n\nx \n\nFigure 1: Schematic diagram of a simplified dendritic channel. The dendrite is modeled a weakly(cid:173)\nactive I-D cable with noise sources distributed along its length. Loss of signal fidelity as it propagates \nfrom a synaptic location (input) y to a measurement (output) location x is studied using a signal \ndetection task. The objective is to optimally detect the presence of the synaptic input I (y, t) (in the \nfonn ofa unitary synaptic event) on the basis of the noisy voltage wavefonn Vm(x, t), filtered by the \ncable's Green's function and corrupted by the noise sources along the cable. The probability of error, \nPe is used to quantify task perfonnance. \n\near, infinite, one-dimensional cable with distributed current noises. When the noise sources \nare weak so that the corresponding voltage fluctuations are small, the membrane voltage \nsatisfies a linear stochastic differential equation satisfied. Using linear cable theory, we ex(cid:173)\npress the power spectral density of the voltage noise in terms of the Green's function of an \ninfinite cable and the current noise spectra. 
We use these results to quantify the efficacy of information transfer under a \"signal detection\" paradigm 1, where the objective is to detect the presence/absence of a presynaptic spike (in the form of an epsc) from the post-synaptic membrane voltage along the dendrite. The formalism used in this paper is summarized in Figure 1. \n\n2 Neuronal Noise Sources \n\nIn this section we consider some current noise sources present in nerve membranes which distort a synaptic signal as it propagates along a dendrite. An excellent treatment of membrane noise is given in DeFelice (1981) and we refer the reader to it for details. For a linear one-dimensional cable, it is convenient to express quantities in specific length units. Thus, we express all conductances in units of S/μm and current power spectra in units of A²/Hz μm. \n\nA. Thermal Noise \nThermal noise arises due to the random thermal agitation of electrical charges in a conductor and represents a fundamental lower limit of noise in a system. A conductor of resistance R is equivalent to a noiseless resistor R in series with a voltage noise source vth(t) of spectral density SVth(f) = 2kTR (V²/Hz), or a noiseless resistor R in parallel with a current noise source Ith(t) of spectral density SIth(f) = 2kT/R (A²/Hz), where k is the Boltzmann constant and T is the absolute temperature of the conductor 2. The transverse resistance rm (units of Ω μm) of a nerve membrane is due to the combined resistance of the lipid bilayer and the resting conductances of various voltage-gated, ligand-gated and leak channels embedded in the lipid matrix. Thus, the current noise due to rm has power \n\n1 For the sake of brevity, we do not discuss the corresponding signal estimation paradigm as in Manwani and Koch (1998).
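As a small numerical illustration (ours, not part of the original paper), the thermal current-noise density 2kT/rm can be evaluated per unit dendritic length. The conversion of the specific membrane resistance Rm and the dendritic diameter d to rm = Rm/(πd), and the values Rm = 40 kΩcm², d = 0.75 μm and T = 295 K, are assumptions taken from the parameters quoted later in the Figure 3 caption:

```python
import math

def thermal_current_psd(r_m, T=295.0):
    # Double-sided thermal current-noise PSD: S_Ith(f) = 2kT / r_m,
    # white (frequency-independent). r_m in Ohm*um, T in kelvin.
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return 2.0 * k_B * T / r_m

# Assumed conversion: Rm = 40 kOhm*cm^2 -> Ohm*um^2, then r_m = Rm / (pi * d)
Rm_ohm_um2 = 40e3 * 1e8            # 40 kOhm*cm^2 expressed in Ohm*um^2
r_m = Rm_ohm_um2 / (math.pi * 0.75)  # transverse resistance per unit length
S_th = thermal_current_psd(r_m)      # A^2/(Hz*um)
```

The resulting value (of order 10⁻³³ A²/Hz μm) is consistent in order of magnitude with the logarithmic axis range plotted in Figure 3a.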
\n\n2Since the power spectra of real signals are even functions of frequency, we choose the double(cid:173)\n\nsided convention for all power spectral densities. \n\n\f134 \n\nspectral density, \n\nA. Manwani and C. Koch \n\n(1) \n\nB. Channel Noise \nNeuronal membranes contain microscopic voltage-gated and ligand-gated channels which \nopen and close randomly. These random fluctuations in the number of channels is another \nsource of membrane noise. We restrict ourselves to voltage-gated K+ and Na+ channels, \nalthough the following can be used to characterize noise due to other types of ionic channels \nas well. In the classical Hodgkin-Huxley formalism (Koch, 1998), a K+ channel consists \nof four identical two-state sub-units (denoted by n) which can either be open or closed. \nThe K+ channel conducts only when all the sub-units are in their open states. Since the \nsub-units are identical, the channel can be in one of five states; from the state in which \nall the sub-units are closed to the open state in which all sub-units are open. Fluctuations \nin the number of open channels cause a random K+ current IK of power spectral density \n(DeFelice, 1981) \n\nSIK(f) = 1]K'YK(Vm - EK) noo f=t \n\n2 \n\n2 4 ~ (4) ( \n\ni \n\n)i 4-i \n\n1 - noo noo 1 + 411'2 j2(8n/i)2' \n\n28n /i \n\n(2) \n\nwhere 1]K, 'YK and EK denote the K+ channel density (per unit length), the K+ single \nchannel conductance and the K+ reversal potential respectively. Here we assume that the \nmembrane voltage has been clamped to a value Vm . noo and 8n are the steady-state open \nprobability and relaxation time constant of a single K+ sub-unit respectively and are in \ngeneral non-linear functions of V m (Koch, 1998). 
When Vm is close to the resting potential Vrest (usually between −70 to −65 mV), n∞ << 1 and one can simplify SIK(f) as \n\nSIK(f) ≈ ηK γK² (Vrest − EK)² n∞⁴ (1 − n∞)⁴ (2θn/4) / (1 + 4π²f²(θn/4)²).  (3) \n\nSimilarly, the Hodgkin-Huxley Na+ channel is characterized by three identical activation sub-units (denoted by m) and an inactivation sub-unit (denoted by h). The Na+ channel conducts only when all the m sub-units are open and the h sub-unit is not inactivated. Thus, the Na+ channel can be in one of eight states, from the state corresponding to all m sub-units closed and the h sub-unit inactivated to the open state with all m sub-units open and the h sub-unit not inactivated. m∞ (resp. h∞) and θm (resp. θh) are the corresponding steady-state open probability and relaxation time constant of a single Na+ m (resp. h) sub-unit respectively. For Vm ≈ Vrest, m∞ << 1, h∞ ≈ 1 and \n\nSINa(f) ≈ ηNa γNa² (Vrest − ENa)² m∞³ (1 − m∞)³ h∞² (2θm/3) / (1 + 4π²f²(θm/3)²),  (4) \n\nwhere ηNa, γNa and ENa denote the Na+ channel density, the Na+ single channel conductance and the sodium reversal potential respectively. \n\nC. Synaptic Noise \nIn addition to voltage-gated ionic channels, dendrites are also awash in ligand-gated synaptic receptors. We restrict our attention to fast voltage-independent (AMPA-like) synapses. A commonly used function to represent the postsynaptic conductance change in response to a presynaptic spike is the alpha function (Koch, 1998) \n\ngα(t) = gpeak e (t/tpeak) e^(−t/tpeak), 0 ≤ t < ∞,  (5) \n\nwhere gpeak denotes the peak conductance change and tpeak the time-to-peak of the conductance change. We shall assume that for a spike train s(t) = Σj δ(t − tj), the postsynaptic conductance is given by gSyn(t) = Σj gα(t − tj).
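The alpha-function conductance of eq. 5 and the superposed spike-train conductance can be sketched directly (our illustration; by construction gα rises from zero and peaks at exactly gpeak when t = tpeak):

```python
import math

def g_alpha(t, g_peak, t_peak):
    # Eq. 5: alpha-function conductance; zero for t < 0,
    # reaches g_peak at t = t_peak, then decays exponentially.
    if t < 0.0:
        return 0.0
    return g_peak * math.e * (t / t_peak) * math.exp(-t / t_peak)

def g_syn(t, spike_times, g_peak, t_peak):
    # Superposition over a spike train (neglects saturation, as in the text).
    return sum(g_alpha(t - tj, g_peak, t_peak) for tj in spike_times)
```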
This ignores inter-spike interaction and synaptic saturation. The synaptic current is given by iSyn(t) = gSyn(t)(Vm − ESyn), where ESyn is the synaptic reversal potential. If the spike train can be modeled as a homogeneous Poisson process with mean firing rate λn, the power spectrum of iSyn(t) can be computed using Campbell's theorem (Papoulis, 1991) \n\nSISyn(f) = ηSyn λn (Vm − ESyn)² |Gα(f)|²,  (6) \n\nwhere ηSyn denotes the synaptic density and Gα(f) = ∫0..∞ gα(t) exp(−j2πft) dt is the Fourier transform of gα(t). Substituting for gα(t) gives \n\nSISyn(f) = ηSyn λn (e gpeak tpeak (Vm − ESyn))² / (1 + 4π²f²tpeak²)².  (7) \n\nFigure 2: Schematic diagram of the equivalent electrical circuit of a linear dendritic cable. The dendrite is modeled as an infinite ladder network. ri (units of Ω/μm) denotes the longitudinal cytoplasmic resistance; cm (units of F/μm) and gL (units of S/μm) denote the transverse membrane capacitance and conductance (due to leak channels with reversal potential EL) respectively. The membrane also contains active channels (K+, Na+) with conductances and reversal potentials denoted by (gK, gNa) and (EK, ENa) respectively, and fast voltage-independent (AMPA-like) synapses with conductance gSyn and reversal potential ESyn. \n\n3 Noise in Linear Cables \n\nThe linear infinite cable corresponding to a dendrite is modeled by the ladder network shown in Figure 2. The membrane voltage Vm(x, t) satisfies the differential equation (Tuckwell, 1988) \n\n∂²Vm/∂x² = ri [ cm ∂Vm/∂t + gK(Vm − EK) + gNa(Vm − ENa) + gSyn(Vm − ESyn) + gL(Vm − EL) ].  (8) \n\nSince the ionic conductances are random and nonlinearly related to Vm, eq. 8 is a non-linear stochastic differential equation.
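Eq. 7 gives the synaptic current-noise spectrum in closed form; a sketch (our illustration, with the epsc parameters assumed from the Figure 4 caption and a placeholder rate λn):

```python
import math

def syn_current_psd(f, eta_syn, lam_n, Vm, E_syn, g_peak, t_peak):
    # Eq. 7: Campbell's theorem applied to a Poisson train of
    # alpha-function conductance events (double-sided convention).
    amp = eta_syn * lam_n * (math.e * g_peak * t_peak * (Vm - E_syn))**2
    return amp / (1.0 + (2.0 * math.pi * f * t_peak)**2)**2
```

Because the alpha function's transform falls as 1/f², its squared magnitude gives the spectrum a 1/f⁴ roll-off at high frequencies, steeper than the 1/f² of the channel-noise Lorentzians.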
If the voltage fluctuations (denoted by V) around the resting potential Vrest are small, one can express the conductances as small deviations (denoted by g) from their corresponding resting values and transform eq. 8 to \n\n−λ² ∂²V(x, t)/∂x² + τ ∂V(x, t)/∂t + (1 + δ)V(x, t) = In/G,  (9) \n\nwhere λ² = 1/(ri G) and τ = cm/G denote the length and time constant of the membrane respectively. G is the passive membrane conductance and is given by the sum of the resting values of all the conductances. δ = (gK + gNa + gSyn)/G represents the random changes in the membrane conductance due to synaptic and channel stochasticity; In = gK(EK − Vrest) + gNa(ENa − Vrest) + gSyn(ESyn − Vrest) + Ith denotes the total effective current noise due to the different noise sources. \n\nFigure 3: (a) Comparison of current spectra SI(f) of the four noise sources we consider. Synaptic noise is the dominant source of noise and thermal noise, the smallest. (b) Voltage noise spectrum of a 1-D infinite cable due to the current noise sources. SVth(f) is also shown for comparison. Summary of the parameters used (adopted from Mainen and Sejnowski, 1998): Rm = 40 kΩcm², cm = 0.75 μF/cm², ri = 200 Ωcm, d (dend. dia.) = 0.75 μm, ηK = 2.3 μm⁻¹, ηNa = 3 μm⁻¹, ηSyn = 0.1 μm⁻¹, EK = −95 mV, ENa = 50 mV, ESyn = 0 mV, EL = Vrest = −70 mV, γK = γNa = 20 pS. \n\nIn order to derive analytical closed-form solutions to eq.
9, we further assume that δ << 1 3, which reduces it to the familiar one-dimensional cable equation with noisy current input (Tuckwell, 1988). For resting initial conditions (no charge stored on the membrane at t = 0), V is linearly related to In and can be obtained by convolving In with the Green's function g(x, y, t) of the cable for the appropriate boundary conditions. It has been shown that V(x, t) is an asymptotically wide-sense stationary process (Tuckwell and Walsh, 1983) and its power spectrum Sv(x, f) can be expressed in terms of the power spectrum of In, Sn(f), as \n\nSv(x, f) = (Sn(f)/G²) ∫−∞..∞ |G(x, x′, f)|² dx′,  (10) \n\nwhere G(x, x′, f) is the Fourier transform of g(x, x′, t). For an infinite cable \n\ng(X, X′, T) = (1/√(4πT)) e^(−(X−X′)²/4T) e^(−T), −∞ < X, X′ < ∞, 0 ≤ T < ∞,  (11) \n\nwhere X = x/λ, X′ = x′/λ and T = t/τ are the corresponding dimensionless variables. Substituting for g(x, x′, t) we obtain \n\nSv(f) = (Sn(f)/(2λG²)) sin((1/2) tan⁻¹(2πfτ)) / (2πfτ (1 + (2πfτ)²)^(1/4)).  (12) \n\nSince the noise sources are independent, Sn(f) = SIth(f) + SIK(f) + SINa(f) + SISyn(f). Thus, eq. 12 allows us to compute the relative contribution of each of the noise sources to the voltage noise. The current and voltage noise spectra for biophysically relevant parameter values (Mainen and Sejnowski, 1998) are shown in Figure 3. \n\n3 Using self-consistency, we find the assumption to be satisfied in our case. In general, it needs to be verified on a case-by-case basis. \n\n4 Signal Detection \n\nThe framework and notation used here are identical to that in Manwani and Koch (1998) and so we refer the reader to it for details.
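Before proceeding, the cable spectrum of eq. 12 is easy to evaluate numerically; a sketch (our illustration, in arbitrary units with λ, τ and G taken as free parameters):

```python
import math

def voltage_psd(f, S_n, lam, G, tau):
    # Eq. 12: voltage-noise PSD of an infinite 1-D cable driven by
    # distributed current noise of PSD S_n. lam = length constant,
    # tau = time constant, G = membrane conductance per unit length.
    u = 2.0 * math.pi * f * tau
    if u == 0.0:
        shape = 0.5  # limit of sin(0.5*atan(u)) / (u * (1 + u*u)**0.25)
    else:
        shape = math.sin(0.5 * math.atan(u)) / (u * (1.0 + u * u)**0.25)
    return S_n / (2.0 * lam * G**2) * shape
```

At f = 0 this reduces to Sn/(4λG²), and at high frequencies it falls off as f^(−3/2), i.e. more slowly than the 1/f² of a single-compartment RC filter, because distant cable locations contribute less filtered noise.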
The goal in the signal detection task is to optimally decide between the two hypotheses \n\nH0 : y(t) = n(t), 0 ≤ t ≤ T (noise), \nH1 : y(t) = g(t) * s(t) + n(t), 0 ≤ t ≤ T (signal + noise),  (13) \n\nwhere n(t), g(t) and s(t) denote the dendritic voltage noise, the Green's function of the cable (a function of the distance between the input and measurement locations) and the epsc waveform (due to a presynaptic spike) respectively. The decision strategy which minimizes the probability of error Pe = P0 Pf + P1 Pm, where P0 and P1 = (1 − P0) are the prior probabilities of H0 and H1 respectively, is the likelihood ratio test \n\nΛ(y) ≷ L0 (decide H1 if Λ(y) ≥ L0, H0 otherwise),  (14) \n\nwhere Λ(y) = P[y|H1]/P[y|H0] and L0 = P0/(1 − P0). Pf and Pm denote the false alarm and miss probability respectively. Since n(t) arises due to the effect of several independent noise sources, by invoking the Central Limit theorem, we can assume that n(t) is Gaussian, for which eq. 14 reduces to r ≷ η, where r = ∫0..∞ y(t) hd(−t) dt is a correlation between y(t) and the matched filter hd(t), given in the Fourier domain as Hd(f) = e^(−j2πfT) G*(f) S*(f)/Sn(f). G(f) and S(f) are the Fourier transforms of g(t) and s(t) respectively and Sn(f) is the noise power spectrum. The conditional means and variances of the Gaussian variable r under H0 and H1 are μ0 = 0, μ1 = ∫−∞..∞ |G(f)S(f)|²/Sn(f) df and σ0² = σ1² = σ² = μ1 respectively. The error probabilities are given by Pf = ∫η..∞ P[r|H0] dr and Pm = ∫−∞..η P[r|H1] dr. The optimal value of the threshold η depends on σ and the prior probability P0. For equi-probable hypotheses (P0 = 1 − P0 = 0.5), the optimal η = (μ0 + μ1)/2 = σ²/2 and Pe = 0.5 Erfc[σ/(2√2)]. One can also regard the overall decision system as an effective binary channel. Let M and D be binary variables which take values in the set {H0, H1} and denote the input and output of the dendritic channel respectively.
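The error probability for the equi-probable case, and the binary-channel information it implies, can be sketched as (our illustration; function names are ours):

```python
import math

def p_error(sigma):
    # Pe = 0.5 * Erfc(sigma / (2*sqrt(2))) for equi-probable hypotheses,
    # where sigma**2 = mu_1 is the matched-filter output SNR term.
    return 0.5 * math.erfc(sigma / (2.0 * math.sqrt(2.0)))

def binary_entropy(p):
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def mutual_info_bits(sigma):
    # I(M; D) = 1 - H(Pe) bits for equi-probable hypotheses.
    return 1.0 - binary_entropy(p_error(sigma))
```

At σ = 0 (no signal energy reaching the measurement site) Pe = 0.5 and the channel conveys zero bits; as σ grows, Pe falls toward zero and I(M; D) approaches one bit per decision.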
Thus, the system performance can equivalently be assessed by computing the mutual information between M and D, I(M; D) = H(P1(1 − Pm) + P0 Pf) − P1 H(Pm) − P0 H(Pf) (Cover and Thomas, 1991), where H(x) is the binary entropy function. For equi-probable hypotheses, I(M; D) = 1 − H(Pe) bits. It is clear from the plots for Pe and I(M; D) (Figure 4) as a function of the distance between the synaptic (input) and the measurement (output) location that an epsc can be detected with almost certainty at short distances, after which there is a rapid decrease in detectability with distance. Thus, we find that membrane noise may limit the maximum length of a dendrite over which information can be transmitted reliably. \n\n5 Conclusions \n\nIn this study we have investigated how neuronal noise sources might influence and limit the ability of one-dimensional cable structures to propagate information. When extended to realistic dendritic geometries, this approach can help address questions such as: is the length of the apical dendrite in a neocortical pyramidal cell limited by considerations of signal-to-noise? which synaptic locations on a dendritic tree (if any) are better at transmitting information? what is the functional significance of active dendrites (Yuste and Tank, 1996)? and so on. Given the recent interest in dendritic properties, it seems timely to apply an information-theoretic approach to study dendritic integration. In an attempt to experimentally verify
\n\\ \n' \n\\ \n\\ \\. \n, \n\" \"' .. \n........ -:. .. .:-.:.- .. -\n\n\\ \n\nl(jlm) \n\n138 \n\nA. Manwani and C. Koch \n\n0. 51-----.-----=::::==:::::::::::::;;~ \n\n,;':: \"::.;--~=:---~-:.\"!'-\n\nl(jlm) \n\n1500 \n\n~~---~~~--~~-1~~~---~1~ \n\nFigure 4: Infonnation loss in signal detection. (a) Probability of Error (Pe ) and (b) Mutual infonna(cid:173)\ntion (I(M;D\u00bb for an infinite cable as a function of distance from the synaptic input location. Almost \nperfect detection occurs for small distances but perfonnance degrades steeply over larger distances \nas the signal-to-noise ratio drops below some threshold. This suggests that dendritic lengths may be \nultimately limited by signal-to-noise considerations. Epsc. parameters: gpeak= 0.1 nS, tpeak = 1.5 \nmsec and ESyn = 0 mY. N syn is the number of synchronous synapses which activate in response to \na pre-synaptic action potential. \n\nthe validity of our results, we are currently engaged in a quantitative comparison using \nneocortical pyramidal cells (Manwani et ai, 1998). \n\nAcknowledgements \n\nThis research was supported by NSF, NIMH and the Sloan Center for Theoretical Neuro(cid:173)\nscience. We thank Idan Segev, Elad Schneidman, Moo London, YosefYarom and Fabrizio \nGabbiani for illuminating discussions. \n\nReferences \n\nDeFelice, LJ. (1981) Membrane Noise. New York: Plenum Press. \n\nCover, T.M., and Thomas, J.A. (1991) Elements of Information Theory. New York: Wiley. \n\nKoch, C. (1998) Biophysics of Computation: Information Processing in Single Neurons. Oxford \nUniversity Press. \n\nMainen, Z.F. and Sejnowski, TJ. (1998) \"Modeling active dendritic processes in pyramidal neurons,\" \nIn: Methods in Neuronal Modeling: From Ions to Networks, Koch, C. and Segev, I., eds., Cambridge: \nMIT Press. \n\nManwani, A. and Koch, C. (1998) \"Synaptic transmission: An infonnation-theoretic perspective,\" \nIn: Kearns, M., Jordan, M. 
and Solla, S., eds., Advances in Neural Information Processing Systems,\" \nCambridge: MIT Press. \n\nManwani, A., Segev, I., Yarom, Y and Koch, C. (1998) \"Neuronal noise sources in membrane patches \nand linear cables,\" In: Soc. Neurosci. Abstr. \n\nPapoulis, A. (1991) Probability, Random Variables and Stochastic Processes. New York: McGraw(cid:173)\nHill. \n\nTuckweII, H.C. (1988) Introduction to Theoretical Neurobiology: I. New York: Cambridge Univer(cid:173)\nsity Press. \n\nTuckwell, H.C. and Walsh, J.B. (1983) \"Random currents through nerve membranes I. Unifonn \npoisson or white noise current in one-dimensional cables,\" Bioi. Cybem. 49:99-110. \n\nYuste, R. and Tank, D. W. (1996) \"Dendritic integration in mammalian neurons, a century after Cajal,\" \n\n\f", "award": [], "sourceid": 1633, "authors": [{"given_name": "Amit", "family_name": "Manwani", "institution": null}, {"given_name": "Christof", "family_name": "Koch", "institution": null}]}