IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, VOL. 24, NO. 3, MARCH 2013


Synaptic Variability in a Cortical Neuromorphic Circuit

Mohammad Mahvash and Alice C. Parker, Fellow, IEEE

Abstract— Variable behavior has been observed in several mechanisms found in biological neurons, resulting in changes in neural behavior that might be useful to capture in neuromorphic circuits. This paper presents a neuromorphic cortical neuron with synaptic neurotransmitter-release variability, designed to be used in neural networks as part of the Biomimetic Real-Time Cortex project. This neuron has been designed and simulated using carbon nanotube (CNT) transistors, one of several nanotechnologies under consideration to meet the challenges of scale presented by the cortex. Some research results suggest that some instances of variability are stochastic, while others indicate that some instances are chaotic. In this paper, both possible sources of variability are considered by embedding either Gaussian noise or a chaotic signal into the neuromorphic neuron or synapse circuit and observing the simulation results. In order to embed chaotic behavior into the neuromorphic circuit, a chaotic signal generator circuit is presented, implemented with CNT transistors that could be embedded in the electronic neural circuit, and simulated using CNT SPICE models. The circuit uses a chaotic piecewise linear 1-D map implemented by switched-current circuits. The simulation results presented in this paper illustrate that neurotransmitter-release variability plays a beneficial role in the reliability of spike generation. In an examination of this reliability, the precision of spike timing in the CNT circuit simulations is found to depend on stimulus (postsynaptic potential) transients. Postsynaptic potentials with low or no neurotransmitter-release variability produce imprecise spike trains, whereas postsynaptic potentials with high neurotransmitter-release variability produce spike trains with reproducible timing.
Index Terms— Carbon nanotube, chaotic signal, cortical neuron, noisy neuron, reliability of spike, stochastic neuron, synaptic variability.

I. INTRODUCTION

VARIABILITY is a prominent feature of biological behavior, playing a central role in the behavior of the neurons in the nervous system [1]. While the purpose of such variability is not completely understood, several neuroscience researchers have indicated that variability might offer distinct advantages. Manwani and Koch [2] provide

Manuscript received September 25, 2011; revised September 13, 2012; accepted November 23, 2012. Date of publication January 9, 2013; date of current version January 30, 2013. This work was supported in part by the WiSE Program at USC, the Viterbi School of Engineering at USC, and the National Science Foundation under Grant 0726815. M. Mahvash is with DirecTV, El Segundo, CA 90245 USA (e-mail: [email protected]). A. C. Parker is with the Ming Hsieh Department of Electrical Engineering, University of Southern California, Los Angeles, CA 90089 USA (e-mail: [email protected]). Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TNNLS.2012.2231879

arguments that indicate variability is helpful, while others suggest that unreliability in cortical transmission due to variability is an energy-saving feature, with multiple pathways increasing reliability. Mainen and Sejnowski studied one of the advantages of variability, demonstrating that variability could play a constructive role leading to increased reliability of neuronal firing in single neurons [3]. At the neuronal level, variability could enhance sensitivity to weak signals, a phenomenon known as stochastic resonance [4]. Variability could lead to structural plasticity and other forms of learning by causing a postsynaptic neuron to fire. This firing could cause active synapses to be strengthened (spike-timing-dependent plasticity), so that subsequently the neuron could fire even with typical noise-free ion-channel behavior in the axon hillock. In terms of system behavior, variability could assist in moving a neural network out of a local minimum. Any artificial neural system designed to be brainlike, or biomimetic, might be enhanced by some variability in behavior. Artificial neurons that spontaneously fire without sufficient postsynaptic potential (PSP) could trigger artificial neural network activity that is unanticipated but useful.

Inside the central nervous system, the two main mechanisms that possess intrinsic variability are the neuronal firing mechanism and the synapse. Of the two main sources of intrinsic variability, namely synaptic variability and ion-channel variability, this paper focuses on synaptic variability. There are many variable mechanisms in central neurons, including synaptic mechanisms. Researchers have observed synapse variability in postsynaptic responses. It is unclear whether synapse responses are random, or whether an underlying unknown process in synapses only appears to be random. There is evidence that the source of this variability is noisy behavior on the part of a neural mechanism [5], [6].
However, some researchers believe that synapse variability arises from a complex deterministic process [7], [8]. This process could be chaotic. Chaotic behavior is deterministic, highly nonlinear behavior that can be characterized with nonlinear mathematics. King studied the diversity of nonlinear characteristics of the neuron and synapse and proposed several chaotic models for neural processes [9]. Chaotic prediction in a neural network is a different application of chaotic behavior [10]. In this paper, we consider both possible sources of variability, i.e., noise and chaos.

During synaptic processing of presynaptic action potentials, several steps generate variability, such as the spontaneous opening of intracellular Ca2+ channels, synaptic Ca2+ channel noise, spontaneous triggering of a vesicle release pathway, spontaneous fusion of a vesicle with the membrane,

2162–237X/$31.00 © 2013 IEEE


and neurotransmitter-release variability [11]–[15]. We will focus on the stochastic release of neurotransmitters in this paper. Neurotransmitters are the means by which signaling occurs between the presynaptic neuron and the postsynaptic neuron. Katz and colleagues pioneered the study of synaptic transmitter release [16], [17]. They demonstrated that synaptic transmitter release is stochastic, so that PSPs are made up of a fluctuating number of basic units, or quanta. Variability in the number of neurotransmitter molecules released per vesicle (∼2000) arises owing to variations in vesicle size [18] and vesicular neurotransmitter concentration [19].

Our overarching goal for the Biomimetic Real-Time Cortex (BioRC) project is to demonstrate complex neural networks that possess memory and learning capability. To this end, we believe that the behavior of such networks would be enhanced by the addition of variability. This paper describes the design of a carbon nanotube (CNT) neuromorphic cortical neuron with neurotransmitter-release variability. Neurotransmitter-release variability is modeled at the circuit level using CNT circuit elements. We include a choice of two different types of signal variability in the circuit: a signal with Gaussian noise, and a chaotic signal. These signals are simulated as if they were generated internally in a synapse circuit to vary the neurotransmitter release in an unpredictable manner. Variation in neurotransmitter concentration in the synaptic cleft causes a change in the peak magnitude and duration of the PSP. For Gaussian noise, we include a file in our SPICE simulation [20] consisting of random voltage samples that control neurotransmitter release volume. In actual circuits, device variability due to thermal effects, when amplified, could be used as a source of variability [21].
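The random-voltage-sample file mentioned above can be produced offline; a minimal Python sketch, assuming a simple two-column time/voltage format suitable for a SPICE piecewise-linear (PWL) source (the filename, sample count, and σ here are illustrative, not the paper's exact settings):

```python
import random

def write_gaussian_pwl(path, mean_v=0.85, sigma_v=0.1,
                       n_samples=100, step_s=10e-12):
    """Write (time, voltage) pairs usable by a SPICE PWL voltage source.

    mean_v and sigma_v are in volts (850 mV mean as in Section IV);
    step_s is the 10 ps sample period. The file format is an assumption.
    """
    rng = random.Random(42)  # fixed seed so each trial is reproducible
    with open(path, "w") as f:
        for i in range(n_samples):
            v = rng.gauss(mean_v, sigma_v)
            f.write(f"{i * step_s:.3e} {v:.6f}\n")

write_gaussian_pwl("neuro_release_noise.txt")
```

Varying the seed yields the independent sample sets used for trial-to-trial comparisons.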
For chaotic signals, we present a chaotic signal generator circuit design and simulation using CNT transistor SPICE models, the output of which would likewise control neurotransmitter release. The circuit uses a chaotic piecewise linear 1-D map implemented with switched-current circuits that can operate at high frequencies to generate a chaotic output current.

The scale of neuromorphic (neuromimetic) circuits that will model portions of the brain requires implementation in the most advanced technologies, scaled as aggressively as possible to accommodate the sheer numbers of synapses present. Deep submicrometer CMOS and nanotechnologies pose unique challenges for circuit construction. For this reason, we began to investigate nanotechnologies, in particular CNTs [22] and memristors [23]. The use of memristors in neuromorphic synapses and their role in modeling memory has been demonstrated (e.g., [24]–[26]). In this paper, we focus on CNTs, but continue to investigate other nanotechnologies.

CNTs may support the scale and interconnection density of a synthetic cortex. They are extremely small (a few nanometers in diameter), current flow is largely ballistic (like the flow of electrons in free space), capacitances are in attofarads, and rise and fall times are in picoseconds. Channel resistance is primarily due to the quantum resistance at the junction between the nanotubes and metallic connections, owing to the differences in the energy levels of the electrons; this creates a challenge for analog circuit design, since resistance cannot be adjusted easily. Current flow between the drain and the source

is typically increased by using parallel nanotubes. Appropriate interfaces can convert to biological signal levels and delays. CNTs have been shown to form dendritic structures, which may facilitate their use in synthetic neurons [27]. Finally, nanotubes have been shown to induce minimal immune system reactions in living tissue, making prosthetic devices with CNTs desirable [28].

We designed and simulated a CNT transistor circuit model of a neural synapse that captures, in a coarse manner, the actions of neurotransmitters, ion-channel mechanisms, and temporal summation of PSPs. We have focused on excitatory PSPs (EPSPs) first, and have chosen economy of size over exact replication of waveforms, to facilitate scaling to cortical-sized biomimetic structures [22]. We have constructed a voltage adder [29] to implement some dendritic computations, and have simulated this circuit using CNT SPICE models. This adder is tunable to support sublinear to superlinear summation of PSPs. We have constructed a small portion of a dendritic arbor, and shown how action potentials impinging on the presynaptic terminals of the arbor produce output PSPs that are a function of the EPSPs invoked at each synapse of the arbor. We have constructed a CNT synapse in the lab [22] and were one of the first groups to demonstrate this use of nanotechnology.

Single-walled CNTs avoid most of the fundamental scaling limitations of silicon, making them suitable for a synthetic cortex [30]. Liu et al. have demonstrated directional growth of high-density single-walled CNTs on a- and r-plane sapphire substrates over large areas [31]. This technique may enable registration-free fabrication of nanotube devices and lead to integrable and scalable nanotube systems, including synthetic cortex circuits. They have developed a novel nanotube-on-insulator (NOI) approach, and a way to transfer these nanotube arrays to flexible substrates.
Efforts have been made in recent years on modeling CNT field-effect transistors (CNFETs) [32], [33] and CNT interconnects [34], [35] to evaluate their potential performance at the device level. Most of the models reported to date use a single lumped gate capacitance and an ideal ballistic model to evaluate dynamic performance [36], [37]. To evaluate CNFET circuit performance with improved accuracy, a CNFET device model with a more complete circuit-compatible structure, including the typical device nonidealities, was constructed [38]. This publication presents a circuit-compatible compact SPICE model for short-channel-length (5–100 nm) quasi-ballistic CNFETs. The model includes practical device nonidealities, e.g., the quantum confinement effects in both the circumferential and channel-length directions, acoustic/optical phonon scattering in the channel region, and resistive source/drain, as well as the real-time dynamic response with a transcapacitance array. The model is valid for CNFETs over a wide diameter range and for various chiralities, as long as the CNT is semiconducting.

Our experiments focus primarily on single ion channels with variability. However, in our neurons, all ion channels are lumped into a single mechanism, so variability in an ion channel in our neurons reflects the collective variability in all ion channels in that neuron. Further, our notion of time is


scaled up because of the inherent performance of the CNTs in our simulations, where rise times are in picoseconds. However, there would be significant slowing, probably to biological rates, if neurons with 10 000 synapses were simulated with axons fanning out to 10 000 other postsynaptic neurons. The circuits are wave-shaping circuits with resistance-capacitance time constants that are adjustable to some extent through adjustment of device and circuit characteristics. Slowing down the input and output signals would necessitate some adjustment of parameters, but is not considered a major impediment to achieving biological timing.

While noise is inherent in electronic circuits, our purpose in injecting noise and chaotic signals into our simulated circuits is to demonstrate that noise can affect the outcome of neural network circuits in positive as well as negative ways. Our variability is carefully controlled to correlate with the types of variability observed in biological neurons. Inherently noisy circuits, on the other hand, would not offer the same control over the location and type of variability present in neuromorphic circuits.

In this paper, we include neurotransmitter-release variability in a neuromorphic neural circuit. One approach to including variability in a neuronal circuit would be to implement a deterministic Hodgkin–Huxley model with added white noise at the circuit level [39]. Such a detailed model, while neuromimetic, would make simulation of multiple neurons prohibitively expensive. We use circuits built in Parker’s BioRC group (e.g., [40]–[42]) because they are designed to expand for greater control over specific mechanisms and to incorporate additional mechanisms such as neurotransmitter regulation, while approximating the details of ion-channel behavior.
Also, our approach is to include variability in transistors in the circuit that correspond to biological functions affected by variability; therefore our approach is more neuromimetic than Chen’s [39] from the point of view of mechanisms that correspond directly to transistors in the electronic circuits. Many researchers have built neuromorphic circuits with no variability (e.g., [43]–[48]). We have reported earlier on our neuromorphic circuits with variable synaptic mechanisms [49]. To date, we are not aware of biomimetic synaptic circuits that have variable behavior apart from Chen’s [39].

We are particularly interested in demonstrating the impact of synaptic-release variability on spike-timing regularity. The nervous system sends information using trains of action potentials. Neuroscientists have long debated how spike trains carry information. There are two main ways in which spike trains may carry information: rate coding and temporal coding. In temporal coding, the exact timing of the spikes is important and carries information. Temporal coding therefore involves precise patterns of spikes and, in this case, the reliability of spike timing has an important impact on the information content of the spike train. Mainen and Sejnowski studied the reliability of spike timing in rat neocortical slices [3]. They applied two types of inputs to the neuron, i.e., inputs without noise and inputs with low noise. The timing of spikes drifted from one trial to the next. By comparing the trial-to-trial results for the two types of inputs, they demonstrated that the precision of the spike timing depends


Fig. 1. System block diagram of the cortical neuron model with a pyramidal neuron cartoon [41].

on the level of noise in the input. Inputs with low noise generate spike trains with reproducible timing. Mainen and Sejnowski used cortical neurons in their study, but this argument also applies to synaptic transmission in sensory pathways [50]. Reliability of spike timing was studied by Cecchi et al. using the leaky integrate-and-fire model [51]. Overall, several researchers have demonstrated this phenomenon in theory, simulation, and experiments [52]–[54]. These research findings have encouraged us to examine spike-timing characteristics with neurotransmitter-release variability in our neuromorphic circuit simulations.

The research results described in this paper demonstrate how neurotransmitter-release variability could play a constructive role leading to increased reliability of neuronal firing in single neurons. A PSP with no neurotransmitter-release variability is applied to an axon hillock SPICE simulation model, which leads to a train of spikes in the simulation. Ion-channel variability in the axon hillock is also included. Trial-to-trial results are studied. Then neurotransmitter-release variability is included in the synapse circuit and a PSP with neurotransmitter-release variability is applied to the axon hillock. Trial-to-trial results are compared with the previous results. The timing of the spikes drifted from one trial to the next, and the amount of drift showed that spikes were reliable and reproducible in timing. The PSP with variability produced more reliable spike timing than the PSP with no variability.

II. NEUROMORPHIC CORTICAL NEURON CIRCUIT

The cortical neuron model, shown in Fig. 1, consists of three types of submodules: 1) the excitatory and inhibitory synapses; 2) the dendritic arbor; and 3) the axon hillock. We include variability in the excitatory and inhibitory synapse circuits to model neurotransmitter-release variability.

Fig. 2 shows a BioRC CNT excitatory synapse circuit [42]. This circuit models cell potentials and neurotransmitter


Fig. 3. Block diagram of the chaotic generator circuit.

Fig. 4. Delay and scaling block at the transistor level.

Fig. 2. Carbon nanotube excitatory synapse where R represents reuptake [42].

concentrations with voltages, with a correspondence between circuit elements and biological mechanisms. The pull-up transistor in the neurotransmitter section modulates the neurotransmitter concentration in the synaptic cleft (the voltage at the synaptic cleft node). The voltage at the gate labeled “Neurotransmitter conc” controls the neurotransmitter release. This causes a change in the EPSP peak amplitude, directly altering the synapse strength. Once the neurotransmitters are released from the presynaptic terminal and bound in the postsynaptic terminal, they are cleared from the synaptic cleft by reuptake mechanisms R [55]. We used three transistors in series for the reuptake mechanisms. Because electrons flow through the nanotubes ballistically (almost as if in free space), the only resistance is the quantum resistance at the junction to a metal connection. Adding nanotubes in series with corresponding metal connections increases the circuit resistance to mimic scaled biological time constants, where picoseconds in our circuits correspond to milliseconds in biological neurons.

In the circuit with no variability, the voltage applied to the gate labeled “Neurotransmitter conc” could be a fixed biasing voltage, or the voltage could vary as the result of some retrograde process in the synapse arising in the postsynaptic neuron. In the case described in this paper, the gate voltage is either an analog noise signal or a chaotic signal that makes the neurotransmitter release variable. This variable input causes the peak amplitude of the EPSP to vary, varying the synapse strength stochastically or chaotically. Similarly, we produce variability in a BioRC inhibitory synapse.

III. CHAOTIC SIGNAL GENERATOR CIRCUIT

Besides noise as a source of variability, there is another source: chaotic variability. Chaotic behavior is highly nonlinear behavior that can be characterized with nonlinear mathematics.
If the system’s dynamics are highly sensitive to the initial conditions and the initial state of the neural circuitry varies at the beginning of each trial, this leads to different neuronal and behavioral responses. We designed a chaotic signal generator using CNT transistors, simulated the signal generator, and applied the resultant chaotic signal to the axon hillock and the neurotransmitter release control of the synapse.
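The sensitivity to initial conditions described above can be illustrated with the logistic map, one of the chaotic maps commonly used in random-number generation; a small Python sketch, purely illustrative and not part of the circuit itself:

```python
def logistic(x, r=4.0):
    """Logistic map x -> r*x*(1-x); chaotic on [0, 1] for r = 4."""
    return r * x * (1.0 - x)

# Two trajectories whose initial conditions differ by only 1e-9
a, b = 0.2, 0.2 + 1e-9
max_sep = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_sep = max(max_sep, abs(a - b))

# The tiny initial difference is amplified to a macroscopic one
assert max_sep > 0.01
```

Two trials that start in almost identical states thus quickly produce unrelated trajectories, which is the behavior the generator below exploits.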

The focus of the variable-signal-generation circuitry in this paper is on the chaotic signal generator circuit, since it is novel, having evolved from a circuit discussed in the literature. Gaussian noise generation circuits are more common [56] and hence will not be discussed here.

Logistic maps and sawtooth maps are among the chaotic maps commonly used in random number generator designs. We use a chaotic piecewise linear (PWL) 1-D map because this map can be implemented simply using switched-capacitor or switched-current circuits [57]. The chaotic map used in the design is described by the following recurrence relationship:

    x_{n+1} = f(x_n) = { a x_n + b,  x_n < 0
                       { a x_n − b,  x_n ≥ 0          (1)

where x_i is the i-th sample of the generated sequence, and a and b are real constants. a must be in the range [1, 2] to keep the output x_n in the range [−b, b].

Several authors have designed chaotic signal generators in CMOS technology [58], [59] using the diagram shown in Fig. 3. We implemented an existing design with switched-current circuits that can operate at high frequencies [60]. We used their design and then converted the circuit to CNT technology. The design has two blocks: the delay and scaling block, and the PWL function block.

Fig. 4 shows the delay and scaling block at the transistor level. The circuit is in fact a current mirror with three branches. We adjust the geometric factors of the transistors to control the ratio between the currents in the different branches as follows:

    I(M5) = 2 I(M3)          (2)
    I(M6) = I(M4)            (3)
    I(M8) = a · I(M3)        (4)
    I(M9) = a · I(M7).       (5)
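A direct iteration of the map (1) in Python (currents in μA, with the a = 1.9 and b = 10 μA values used later in the design) shows the orbit staying within [−b, b], consistent with the condition on a:

```python
def pwl_map(x, a=1.9, b=10.0):
    """One step of the piecewise-linear chaotic map (1); units: microamps."""
    return a * x + b if x < 0 else a * x - b

x = 4.8  # arbitrary initial current inside [-b, b]
orbit = [x]
for _ in range(1000):
    x = pwl_map(x)
    orbit.append(x)

# With a in [1, 2], the output remains in [-b, b]
assert all(-10.0 <= v <= 10.0 for v in orbit)
```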


Fig. 5. PWL function at the transistor level.


When the input current Iin is injected, we have the following equations:

    I(M6) = I(M4) = Iin + I(M3)                   (6)
    I(M7) = I(M5) − I(M6) = −Iin + I(M3)          (7)
    I(M9) = a · I(M7) = −a · Iin + a · I(M3)      (8)
    Iout = I(M8) − I(M9) = a · Iin.               (9)

Therefore, the output current is the input current scaled by a factor of a. Transistors M10 and M12 are two switches that work with nonoverlapping clocks, and M11 and M13 are two variable capacitors. This part of the circuit produces a delay from the input current to the output current. We must make the number of tubes in these transistors large enough that they behave like ideal switches.

In this design, all transistors must be in the saturation region; otherwise, the current mirrors do not work properly. This condition is harder to satisfy in CNT technology than in the same design in complementary metal-oxide-semiconductor (CMOS) technology, because Vdd in CNT technology (900 mV for the technology we simulated [38]) is smaller than Vdd in CMOS, and the drain-source voltages for transistors in series are lower. This condition must also hold for any input current. In this design, the input current ranges from −10 to 10 μA. That changes the drain voltage of M3, but the source and gate voltages of M3 are fixed. Therefore, M3 is at risk of entering the triode region. To make sure that M3 always stays in the saturation region, it is better to adjust the gate-source voltage of M3 very close to the threshold voltage. Other transistors such as M5, M6, M8, and M9 are at risk of leaving saturation too, which requires fine-tuning for the design to work properly. It is not clear how nanotube transistors’ characteristics vary over time, or between different fabrication runs, so robust behavior must be investigated further.

Fig. 5 shows the PWL function at the transistor level. In this design, M14, M15, M16, and M17 form two current mirrors.
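The branch-current bookkeeping of (2)–(9) can be checked numerically; a small Python sketch (the bias current I(M3) is arbitrary):

```python
def delay_scale_output(i_in, i_m3=5.0, a=1.9):
    """Trace the mirror branch currents of Fig. 4 per (2)-(9); microamps."""
    i_m6 = i_in + i_m3        # (6): I(M6) = I(M4) = Iin + I(M3)
    i_m5 = 2.0 * i_m3         # (2)
    i_m7 = i_m5 - i_m6        # (7): I(M7) = -Iin + I(M3)
    i_m8 = a * i_m3           # (4)
    i_m9 = a * i_m7           # (5), (8)
    return i_m8 - i_m9        # (9): Iout = a * Iin

# The bias current cancels: Iout = a * Iin over the -10 to 10 uA range
for i_in in (-10.0, -2.5, 0.0, 7.0, 10.0):
    assert abs(delay_scale_output(i_in) - 1.9 * i_in) < 1e-9
```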

Fig. 6. Output versus input current for the PWL function when Iin > 0.

We adjusted the sizes of the transistors to obtain current b in M16 and M17 (b = 10 μA). When the input current to this block is positive, the drain voltage of M18 and M19 increases. This node is also the input of the first of three cascaded inverters, so the inverter outputs are 0, 1, and 0, respectively. Thus, for Iin > 0, M19, M21, M22, and M25 are ON, and M18, M20, M23, and M24 are OFF:

    I(M25) + I(M19) = I(M17)                        (10)
    Iout = −I(M25) = I(M19) − I(M17) = Iin − b.     (11)

When the input current is negative, the drain voltage of M18 and M19 decreases. Therefore, M18, M20, M23, and M24 are ON, and M19, M21, M22, and M25 are OFF:

    I(M24) + I(M18) = I(M16)                        (12)
    Iout = I(M24) = −I(M18) + I(M16) = Iin + b.     (13)

The output current is the sum of the input current and the constant current b.

We first simulated the circuit for the PWL function separately. For the input current, we applied a current source sweeping from 1 to 19 μA (we defined a = 1.9 and b = 10 μA). The output current is supposed to change from −9 to 9 μA based on the equation Iout = Iin − 10 μA (green curve in Fig. 6). As shown in the figure, for input currents of 1 to 12 μA, the simulation result (blue curve) follows the equation. However, when the input current goes above 12 μA, the output current becomes positive and does not follow the expected linear change. The reason is that, for an input current of more than 10 μA, the output current switches from negative to positive and the direction of the current in M25 (Fig. 5) is reversed. The current flows from node 3 to node 6 and, in effect, the source and drain of transistor M25 are swapped. The voltage at node 3 increases and the current in M17 rises above the constant current of 10 μA, which forces the circuit to operate nonlinearly.

The same problem occurs for negative input currents. When the input current changes from −1 to −10 μA, the

Fig. 7. Block diagram of the improved chaotic generator circuit.

Fig. 8. Improved delay and scaling block at the transistor level.

Fig. 9. Output versus input current for the PWL function shown in Fig. 7.

output current falls linearly from 9 to 0 μA. However, when the input current goes below −10 μA, the output current switches from positive to negative and its change is no longer linear.

By limiting the input current to the PWL function block from [−19 μA, 19 μA] to [−10 μA, 10 μA], as shown in Fig. 7, we can remove the nonlinear part of the result so that the circuit operates linearly. We divide the output current of the scaled delay block by 2, send half to the PWL function block, and add the other half to the output of the PWL function block. Since the PWL function block is linear with a gain of 1, it does not matter whether we send all of the current to its input or send half to the input and add the other half at the output. At the transistor level, the scaled delay circuit changes as shown in Fig. 8. We halved the widths of M8 and M9 to obtain half the current and then added M26 and M27, which are the same size as M8 and M9. There is no change in the PWL function block at the transistor level.

We simulated the PWL function block for input currents from −9.5 to 9.5 μA. As shown in Fig. 9, for positive input currents the output current follows Iout = Iin − 10 μA, and for negative input currents it follows Iout = Iin + 10 μA, as expected.

The chaotic generator circuit generates a current between −9.5 and 9.5 μA at Iout1 (shown in Fig. 8) in a chaotic manner based on (1). In order to use this chaotic current as an input to the neuron circuit, in a separate step we convert the chaotic current simulation results to chaotic voltages for the neural simulation and then adjust the voltage level and the frequency of the samples.
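The improved arrangement can be checked behaviorally in Python: half of the scaled current a·x feeds an ideal PWL block (Iin − b for Iin ≥ 0, Iin + b otherwise) and the other half is added at the output, which reproduces the map (1) while keeping the PWL input inside the ±9.5 μA range that was simulated (a sketch of the ideal transfer functions, not the transistor-level circuit):

```python
def pwl(i_in, b=10.0):
    """Ideal PWL-block transfer of (10)-(13); valid for |i_in| <= b."""
    return i_in - b if i_in >= 0 else i_in + b

def improved_step(x, a=1.9, b=10.0):
    """One iteration of the improved generator of Figs. 7 and 8."""
    half = 0.5 * a * x              # half of the scaled delay-block output
    assert abs(half) <= 9.5 + 1e-9  # PWL input stays in its linear range
    return pwl(half, b) + half

x = 4.8
for _ in range(500):
    nxt = improved_step(x)
    # equivalent to the direct map (1): a*x - b for x >= 0, a*x + b otherwise
    direct = 1.9 * x - 10.0 if x >= 0 else 1.9 * x + 10.0
    assert abs(nxt - direct) < 1e-9
    x = nxt
```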
After the adjustment, the chaotic voltage is described by the following recurrence relationship:

    v_{n+1} = { 1.9 v_n + 0.5 Vpp − 0.9 Vmid,  v_n < Vmid
              { 1.9 v_n − 0.5 Vpp − 0.9 Vmid,  v_n ≥ Vmid      (14)

where Vpp is the peak-to-peak voltage of the chaotic signal and Vmid is its mid-point voltage.
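Iterating (14) in Python with the parameter values of Fig. 10 (Vpp = 400 mV, Vmid = 300 mV, Vinit = 480 mV) confirms that the samples stay within Vmid ± Vpp/2:

```python
def next_v(v, v_pp=0.4, v_mid=0.3):
    """One step of the chaotic voltage recurrence (14); units: volts."""
    if v < v_mid:
        return 1.9 * v + 0.5 * v_pp - 0.9 * v_mid
    return 1.9 * v - 0.5 * v_pp - 0.9 * v_mid

v = 0.48  # Vinit of Fig. 10
samples = [v]
for _ in range(500):
    v = next_v(v)
    samples.append(v)

# Output bounded to [Vmid - Vpp/2, Vmid + Vpp/2] = [0.1 V, 0.5 V]
assert all(0.1 - 1e-9 <= s <= 0.5 + 1e-9 for s in samples)
```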

Fig. 10. Chaotic voltage with Vpp = 400 mV, Vmid = 300 mV, Vinit = 480 mV, and period of each sample = 10 ns.

Our goal is to embed the chaotic generator circuit in the neuron circuit as a single circuit. However, for the simulation results in this project, we exported the chaotic current results of the circuit to MATLAB and, after adjusting the results based on (14), we included the chaotic samples in our SPICE simulation. For the remainder of this paper, we characterize the chaotic voltage applied to the neuron circuit by three parameters, i.e., Vpp, Vmid, and Vinit (the initial condition), as well as the period of each sample. For example, Fig. 10 shows a chaotic voltage with Vpp = 400 mV, Vmid = 300 mV, Vinit = 480 mV, and a sample period of 10 ns.

IV. EXPERIMENTS WITH THE CORTICAL NEUROMORPHIC CIRCUIT

The experimental CNT cortical neuron consists of three excitatory synapses with different strengths, one inhibitory synapse, a dendritic arbor, and the axon-hillock circuits. This neuron was simulated in SPICE. We performed two types of experiments with our cortical neuron. First, we looked at the role of synaptic variability in determining the probability


Fig. 11. Input spike, PSPs, dendritic output, and output spike for the neuron with no variability included.

Fig. 12. Experimental probability of firing when a Gaussian voltage is included for neurotransmitter release control, resulting in synaptic variability. Only one synapse has a spike as input. SPICE simulation results.

of spiking. Then, we examined the role of synaptic variability and ion-channel variability in spike-timing reliability. We have scaled our circuits so that Vdd = 0.9 V represents approximately the peak-to-peak voltage of the action potential, biologically about 100 mV. The PSPs range in the hundreds of millivolts for our circuits, but in biological neurons they are more likely to range in the tens of millivolts. These circuit voltages can be scaled to biological levels for prosthetic use and to lower power consumption, with some adjustment of the circuits for subthreshold operation.

A. Synaptic Variability

We performed several experiments for neurotransmitter-release variability. First we simulated the neuron with no variability. A spike was applied to the four synapses at the same time. The neurotransmitter concentration voltage controls for excitatory synapses 1–3 are 850, 700, and 550 mV, respectively. Therefore, the three excitatory synapses had different strengths (different peak EPSPs). The neurotransmitter concentration voltage control for the inhibitory synapse is 700 mV. Fig. 11 shows the result. As shown in the figure, the EPSP for excitatory synapse 1 has the highest peak, since this is the strongest synapse, and the PSPs for the other synapses follow in order of their strength. The EPSP peak for the second spike is slightly higher than the first EPSP peak, and so on. The reason is that the first spike is applied when the neuron is at the resting potential, whereas the second and third spikes are applied before the neuron returns to the resting potential, so there is temporal summation. The output of the dendrite is a summation of all PSPs and, when it crosses the threshold voltage (170 mV), the neuron fires. As shown in Fig. 11, the neuron fires when all three excitatory PSPs peak. If we apply a spike to only one synapse instead of applying it to all synapses, the dendritic

output is not sufficient to fire the neuron. Therefore, we included neurotransmitter-release variability and calculated the probability of firing. For example, we applied a spike to the first synapse and, instead of the 850-mV fixed biasing voltage for the neurotransmitter control voltage, we included a Gaussian voltage with mean μ = 850 mV, a standard deviation ranging from σ = 0 to 500 mV, and a sample period of 10 ps. We assume all other synapses have no spikes as inputs. The probability of firing is shown in Fig. 12. We changed the standard deviation from 0 to 500 mV with a step size of 25 mV. For each standard deviation, we ran the SPICE and MATLAB experiments 100 times with 100 different Gaussian samples and counted how many times the neuron fired. The probability is calculated by dividing the number of experiments in which the neuron fires by 100. When σ = 0 mV, meaning there is no synaptic variability, the PSP from one synapse is not strong enough to fire the neuron, and the probability is therefore zero. When we allowed synaptic variability and increased the standard deviation, the probability increased. Since synapse 1 is stronger, the PSP generated from synapse 1 is closer to the threshold voltage, and therefore the probability that variability in neurotransmitter release for synapse 1 causes the neuron to fire is higher. Synapse 3 generates a PSP that is much smaller than the threshold, and even strong synaptic variability cannot help the neuron fire without spikes at other synapses in the dendritic arbor. If we assume the PSP signal is Gaussian, we can calculate the probability of firing using the Gaussian cumulative distribution function and obtain similar results by curve matching; this assumption is not necessarily true. The following function gives the probability of firing for different PSP values, where each PSP value is a Gaussian random number. Fig. 13 shows the result of plotting this function versus σ in MATLAB:

p(V_PSP > V_th) = 0.5 − 0.5 erf((V_th − μ)/√(2σ²))   (15)
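Equation (15) can be cross-checked numerically. The short sketch below (with illustrative voltage values, not the paper's exact circuit parameters) compares the closed-form firing probability against a Monte Carlo estimate analogous to the 100-trial experiments described above:

```python
import math
import random

def p_fire(mu, sigma, v_th=0.170):
    """Eq. (15): probability that a Gaussian PSP exceeds the threshold."""
    if sigma == 0.0:
        return 1.0 if mu > v_th else 0.0
    return 0.5 - 0.5 * math.erf((v_th - mu) / math.sqrt(2.0 * sigma ** 2))

def p_fire_mc(mu, sigma, v_th=0.170, trials=100_000, seed=1):
    """Monte Carlo estimate: draw Gaussian PSP samples, count crossings."""
    rng = random.Random(seed)
    fires = sum(rng.gauss(mu, sigma) > v_th for _ in range(trials))
    return fires / trials

mu = 0.150  # illustrative sub-threshold mean PSP (threshold is 170 mV)
print(p_fire(mu, 0.0))  # -> 0.0: no variability, no firing
for sigma in (0.025, 0.100, 0.500):
    print(round(p_fire(mu, sigma), 3), round(p_fire_mc(mu, sigma), 3))
# As sigma grows, the firing probability approaches (but never exceeds)
# the 0.5 ceiling for a sub-threshold mean, matching the text.
```

The Monte Carlo estimate tracks the closed form to within sampling error, mirroring how the SPICE experiments track the MATLAB curve.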




Fig. 13. Calculated probability of firing when a Gaussian voltage is included for neurotransmitter-release control, resulting in synaptic variability. Only one synapse has a spike as input. MATLAB simulation results based on (15).


Fig. 15. Probability of firing when neurotransmitter-release variability is included in one synapse. (a) SPICE simulation results. (b) MATLAB simulation results.

Fig. 14. Experimental probability of firing when a chaotic signal is included for neurotransmitter-release control, resulting in synaptic variability. SPICE simulation results.
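The chaotic release control in Fig. 14 is parameterized by Vmid, Vpp, and the initial condition Vinit, and is driven by a chaotic piecewise-linear 1-D map. As a behavioral sketch only, the code below uses a tent map as a stand-in (an assumption; the switched-current circuit implements its own piecewise-linear map) to illustrate the sensitivity to initial conditions:

```python
def chaotic_control(v_mid, v_pp, v_init, n):
    """Generate n control-voltage samples from a piecewise-linear 1-D map.

    A tent map on [0, 1] stands in for the switched-current circuit's
    actual map (an assumption). Samples swing v_pp peak-to-peak around
    v_mid, starting from the initial condition v_init.
    """
    lo = v_mid - v_pp / 2.0
    x = (v_init - lo) / v_pp  # map the initial voltage into [0, 1]
    samples = []
    for _ in range(n):
        x = 1.999 * min(x, 1.0 - x)  # tent map; slope < 2 keeps it bounded
        samples.append(lo + x * v_pp)  # rescale back to a voltage
    return samples

# Two nearby initial conditions (1 mV apart) diverge after a few samples,
# which is the sensitivity to initial conditions noted for high Vpp:
a = chaotic_control(v_mid=0.700, v_pp=0.400, v_init=0.600, n=30)
b = chaotic_control(v_mid=0.700, v_pp=0.400, v_init=0.601, n=30)
print(max(abs(x - y) for x, y in zip(a, b)))  # divergence reaches circuit scale
```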

where μ is the mean of the PSP and σ is the standard deviation of the PSP signal; we substitute the nonvariable PSP values for the three synapses as the means of the PSPs. Based on this function, we can conclude that the maximum probability is 0.5 if we include a strong Gaussian neurotransmitter-release control (large standard deviation). We did a similar experiment, but instead of Gaussian release control, we used a chaotic control mechanism. Vpp changes from 0 to 1000 mV, Vmid is the same as the fixed biasing voltage for the neurotransmitter control voltage, and Vinit changes from Vmid − Vpp/2 to Vmid + Vpp/2. As shown in Fig. 14, traces with the same marking are the results for one synapse with different initial conditions. When Vpp is low, the results for different initial conditions are almost the same; however, for higher Vpp, when the PSP is close to the threshold, the results differ, meaning that the effect of the initial condition on the probability of firing increases for PSPs close to the threshold voltage.

We applied a spike to all four synapses (one inhibitory) and included neurotransmitter-release variability in just one

of them each time. We included Gaussian neurotransmitter-release variability with μ = 850 mV and σ ranging from 0 to 200 mV in the first synapse and calculated the probability of firing. The mean of the Gaussian signal is the same as the fixed biasing voltage in the no-variability experiment because the Gaussian signal varies symmetrically around this voltage, meaning that the probability of being above 850 mV is the same as the probability of being below it (50%). The dendritic output has a variable amplitude, and therefore whether the neuron fires depends on the dendritic output amplitude. We calculated the probability of firing and repeated the same experiments for variability in synapses 2 and 3 and in the inhibitory synapse. Fig. 15 shows the probability of firing when the synaptic variability is included in the excitatory and inhibitory synapses. All probabilities are more than 50% because, when no variability is included, the peak of the dendritic output (173.8 mV) is slightly above the threshold voltage (170 mV) and the neuron fires, as shown in Fig. 11; the probability is therefore 1. When synaptic variability is included, the variability can push the output of the dendrite below the threshold, so the probability drops from 1 toward 0.5. The probability of firing is lowest when the variability is in the inhibitory synapse, meaning that the inhibitory synapse



Fig. 16. Probability of firing when neurotransmitter-release variability is included in two synapses. MATLAB simulation results.
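The trend in Fig. 16, namely a larger effect for a higher correlation factor ρ, follows because correlated Gaussian variability signals sum to a larger standard deviation. A numerical check of this relation, with illustrative σ values:

```python
import math
import random

def sum_std(s1, s2, rho):
    """Standard deviation of the sum of two correlated Gaussian signals."""
    return math.sqrt(s1 ** 2 + s2 ** 2 + 2.0 * rho * s1 * s2)

def sampled_sum_std(s1, s2, rho, n=100_000, seed=7):
    """Empirical check: build correlated pairs from a shared Gaussian
    component, sum them, and measure the standard deviation."""
    rng = random.Random(seed)
    sums = []
    for _ in range(n):
        shared = rng.gauss(0.0, 1.0)
        x = s1 * (math.sqrt(rho) * shared + math.sqrt(1.0 - rho) * rng.gauss(0.0, 1.0))
        y = s2 * (math.sqrt(rho) * shared + math.sqrt(1.0 - rho) * rng.gauss(0.0, 1.0))
        sums.append(x + y)
    mean = sum(sums) / n
    return math.sqrt(sum((s - mean) ** 2 for s in sums) / n)

for rho in (0.0, 0.5, 1.0):
    print(round(sum_std(0.1, 0.1, rho), 4), round(sampled_sum_std(0.1, 0.1, rho), 4))
# Higher correlation -> larger standard deviation of the summed signal,
# i.e., a stronger combined variability, consistent with Fig. 16.
```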

is more sensitive to neurotransmitter-release variability than the excitatory synapse. Also, by comparing the probabilities for the three synapses, we can conclude that the neuron is more sensitive to neurotransmitter-release variability included in a weak synapse than to variability included in a strong synapse.

We did another experiment similar to the previous one but included the variability in two synapses: synapse 1 and synapse 2. The Gaussian signals included in the two synapses had the same standard deviation and could be correlated. Fig. 16 shows the probability of firing versus standard deviation in the MATLAB simulation. As shown in Fig. 16, for a higher correlation factor (ρ), the probability is lower, meaning that, when the variability of the synapses is correlated, the variability is more effective and results in larger changes in the firing probability. We can verify this statement by calculating the standard deviation of a Gaussian signal that is the sum of two Gaussian signals:

σ_PSP = √(σ_PSP1² + σ_PSP2² + 2ρσ_PSP1σ_PSP2).   (16)

As shown in this equation, a higher correlation factor means a higher standard deviation and thus a stronger Gaussian signal.

B. Reliability of Spike Timing

We did a second experiment, without neurotransmitter-release variability, to demonstrate that spike trains can be reproduced reliably with the addition of noise. We included ion-channel variability in the axon hillock. Ion-channel variability is another type of variability [61]; it refers to the stochastic opening and closing of ion channels in the axon hillock. Ion-channel variability affects the firing of the neuron directly, while synaptic variability is generally a subtler influence affecting the variable release of neurotransmitters. Synaptic variability affects the firing of the neuron as well, but


Fig. 17. PSP and output spike for three experiments.

in a more indirect manner, since there are likely to be thousands of synapses in each cortical neuron we model, each with (possibly correlated) variable behavior.

We applied a constant total PSP to the axon hillock. The amplitude of the total PSP (175 mV) was fixed over time and was slightly above the threshold voltage (170 mV). For ion-channel variability, we included a Gaussian control of the ion channels with μ = 450 mV and σ = 360 mV. The result is shown in Fig. 17. Since the total PSP is above the threshold, the neuron keeps spiking. The refractory period is 10 ps, meaning that, after each spike occurs, the neuron will not fire for 10 ps. Since the spike times depend on the levels of ion-channel variability, the spikes for the three experiments (red, blue, and green) have different spike times.

We characterize spike-timing reliability with a quantity called "Reliability." We ran the experiment 100 times to obtain a raster plot. This quantity is the number of spikes among the 100 trials of the raster plot that occurred at approximately the same time, divided by the total number of spikes (100); it therefore lies between 0 and 1. For example, when the neuron fires with perfect spike-timing reliability, all spikes in the 100 trials of the raster plot occur at the same time, and the reliability is 1. Fig. 18 shows the reliability of spike timing for a constant PSP versus the standard deviation of the ion-channel variability. For ion-channel variability, we included Gaussian variability with μ = 450 mV and σ from 0 to 0.5 V. When there is no ion-channel variability (σ = 0), the reliability is 1. When the ion-channel variability increases, the reliability decreases, as expected, and the spike times change randomly depending on the ion-channel variability level. Fig. 19 shows the reliability of spike timing for a constant PSP versus the PSP amplitude.
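The "Reliability" quantity described above can be sketched as a short script. This is a simplified version: one spike time per trial is assumed, and the 1-ps bin standing in for "approximately the same time" is an assumed tolerance:

```python
import random
from collections import Counter

def reliability(spike_times, bin_width=1e-12):
    """Fraction of trials whose spike lands in the most-populated time
    bin; 1.0 means every trial fired at the same time."""
    bins = Counter(round(t / bin_width) for t in spike_times)
    return max(bins.values()) / len(spike_times)

# 100 trials firing at exactly the same time -> perfect reliability:
print(reliability([50e-12] * 100))  # -> 1.0

# 100 trials with 5-ps Gaussian jitter -> spikes scatter across bins:
rng = random.Random(3)
jittered = [50e-12 + rng.gauss(0.0, 5e-12) for _ in range(100)]
print(reliability(jittered) < 0.5)  # -> True
```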
When the PSP amplitude increases, it overcomes the impact of ion-channel variability on the spike times, and therefore the reliability increases. Now we include neurotransmitter-release variability (Gaussian with μ = 0.6 V and σ = 2.8 V, and a chaotic signal with Vpp = 3.9 V and Vmid = 0.6 V) in the synapse circuit. The PSP amplitude changes randomly. Fig. 20 shows a raster plot of the spike times over 100 trials for three types of PSP: a constant PSP, a PSP with Gaussian synaptic variability included, and a PSP with chaotic synaptic variability included. The reliability of spike timing for the noiseless PSP is 0.105, for



Fig. 18. Reliability of spike timing for constant PSP versus standard deviation of ion-channel variability.

Fig. 19. Reliability of spike timing for constant PSP versus PSP amplitude.

the noisy PSP (Gaussian) is 0.412, and for the chaotic PSP is 0.547, meaning that, when a variable PSP is applied to the axon, the spike times of the neuron are more reliable. This situation occurs when the synaptic variability is stronger than the ion-channel variability (in Fig. 20, σ (ion-channel variability) = 0.2 V, σ (Gaussian synaptic variability) = 2.8 V, and Vpp (chaotic synaptic variability) = 3.9 V). Fig. 21 shows the reliability of spike timing for a variable (Gaussian) PSP versus the standard deviation of the synaptic variability for three different standard deviations of ion-channel variability. When the synaptic-variability standard deviation increases, the reliability increases. This is because, in order to have higher reliability, the synaptic variability must be stronger than the ion-channel variability, as explained below. Fig. 22 shows the reliability of spike timing for a variable (chaotic) PSP versus the peak-to-peak voltage of the chaotic synaptic variability. When the PSP is constant and has an amplitude around the threshold voltage, ion-channel variability defines the spike times

Fig. 20. Raster plot for spike times for constant, variable (Gaussian) PSP, and variable (chaotic) PSP (stimulus) from 100 trials.

across the trials, and the neuron therefore generates unreliable, irreproducible spikes. If the PSP amplitude is much higher than the threshold, the neuron fires at the same time in all trials, and low ion-channel variability cannot change the spike times. The same holds for a PSP amplitude much lower than the threshold: ion-channel variability has no considerable impact on spike timing. However, when the PSP amplitude is around the threshold, the spike times are defined mainly by the ion-channel variability, and the neuron therefore generates unreliable spikes across the trials. Now a strong synaptic variability (Gaussian or chaotic) is included in the synapse, which makes the PSP amplitude strongly variable. The variability in the PSP is frozen, meaning that the same variable PSP waveform is used repeatedly across all trials. For example, in the raster plot shown in Fig. 20, the same Gaussian signal (for Gaussian synaptic variability) was used for all 100 trials. In this case, high synaptic variability moves the PSP far from the threshold, either well above or well below it, and, since the waveform is frozen, the neuron generates reliable spikes. Therefore, a PSP with synaptic variability generates reliable spiking provided that the amplitude of the synaptic variability is large enough to overcome the ion-channel variability.

Fig. 21. Reliability of spike timing for variable PSP (Gaussian) versus standard deviation of synaptic variability for three different standard deviations of ion-channel variability (Gaussian).

Fig. 22. Reliability of spike timing for variable PSP (chaotic) versus peak-to-peak voltage of synaptic variability for three different standard deviations of ion-channel variability (Gaussian).

Fig. 23. Reliability of spike timing for variable PSP versus standard deviation of synaptic variability for three different standard deviations of ion-channel variability. MATLAB simulation.

Since the cortical neuron described here is a generic structure with few synaptic inputs, it is not possible to compare it with the experimental literature on neural variability. The next steps in this research will include the design and testing of specific cortical neural networks with and without variability; these circuits will be compared with the experimental literature.

We can obtain results similar to the experimental results in Fig. 21 by a probabilistic calculation on the Gaussian distribution. As mentioned earlier, in order to obtain reliable spike timing, the PSP amplitude must be much higher or much lower than the threshold voltage. If we assume the variable PSP is a Gaussian signal with mean equal to the threshold voltage, the probability of the PSP being far from the threshold is

p(σ, Δ) = 1 − p(|V_PSP − V_th| < Δ) = 1 − erf(Δ/√(2σ²))   (17)

where σ is the standard deviation of the PSP and Δ is a quantity that determines how far the PSP amplitude is from the threshold voltage; it changes on the basis of the standard deviation of the ion-channel variability. We plot this function by varying the standard deviation of the PSP from 0 to 6 V for three different Δ values, as shown in Fig. 23. The results are approximately the same as the results in Fig. 21.

V. CONCLUSION

One main source of intrinsic variability in the nervous system, namely, neurotransmitter-release variability, was modeled in a cortical neuromorphic circuit in this paper, and simulation results were shown. Also, a chaotic signal generator circuit in CNT technology was presented. Spike-timing reliability was analyzed for two types of postsynaptic potential, i.e., a constant PSP and a variable PSP (due to neurotransmitter-release variability). When synaptic variability is much stronger than ion-channel variability, variability in the PSP causes the neuron to fire with more reliable spike times. The design was simulated using CNT SPICE models and MATLAB.



REFERENCES

[1] A. A. Faisal, L. P. J. Selen, and D. M. Wolpert, "Noise in the nervous system," Nature Rev. Neurosci., vol. 9, pp. 292–303, Apr. 2008.
[2] A. Manwani and C. Koch, "Detecting and estimating signals over noisy and unreliable synapses: Information-theoretic analysis," Neural Comput., vol. 13, no. 1, pp. 1–33, 2001.
[3] Z. F. Mainen and T. J. Sejnowski, "Reliability of spike timing in neocortical neurons," Science, vol. 268, pp. 1503–1506, Jun. 1995.
[4] M. D. McDonnell and D. Abbott, "What is stochastic resonance? Definitions, misconceptions, debates, and its relevance to biology," PLOS Comput. Biol., vol. 5, no. 5, p. e1000348, 2009.
[5] W. H. Calvin and C. F. Stevens, "Synaptic noise as a source of variability in the interval between action potentials," Science, vol. 155, pp. 842–844, Mar. 1967.
[6] W. H. Calvin and C. F. Stevens, "Synaptic noise and other sources of randomness in motoneuron interspike intervals," J. Neurophys., vol. 31, pp. 574–587, Jul. 1968.
[7] I. C. Kleppe and H. P. C. Robinson, "Correlation entropy of synaptic input-output dynamics," Phys. Rev. E, Stat. Nonlin. Soft Matter Phys., vol. 74, no. 4, pp. 041909–041915, Oct. 2006.
[8] P. Faure, D. Kaplan, and H. Korn, "Synaptic efficacy and the transmission of complex firing patterns between neurons," J. Neurophys., vol. 84, pp. 3010–3025, Jul. 2000.
[9] C. C. King, "Fractal and chaotic dynamics in nervous systems," Progr. Neurobiol., vol. 36, no. 4, pp. 279–308, 1991.
[10] D. Li, M. Han, and J. Wang, "Chaotic time series prediction based on a novel robust echo state network," IEEE Trans. Neural Netw. Learn. Syst., vol. 23, no. 5, pp. 787–799, May 2012.
[11] A. Destexhe, M. Rudolph, J. M. Fellous, and T. J. Sejnowski, "Fluctuating synaptic conductances recreate in vivo-like activity in neocortical neurons," Neurosci., vol. 107, no. 1, pp. 13–24, Nov. 2001.
[12] J. M. Fellous, M. Rudolph, A. Destexhe, and T. J. Sejnowski, "Synaptic background noise controls the input-output characteristics of single cells in an in vitro model of in vivo activity," Neurosci., vol. 122, no. 3, pp. 811–829, 2003.
[13] R. Conti, Y. Tan, and I. Llano, "Action potential-evoked and ryanodine-sensitive spontaneous Ca2+ transients at the presynaptic terminal of a developing CNS inhibitory synapse," J. Neurosci., vol. 24, no. 31, pp. 6946–6957, Aug. 2004.
[14] S. Q. Wang, L. S. Song, E. G. Lakatta, and H. Cheng, "Ca2+ signalling between single L-type Ca2+ channels and ryanodine receptors in heart cells," Nature, vol. 410, pp. 592–596, Mar. 2001.
[15] X. Lou, V. Scheuss, and R. Schneggenburger, "Allosteric modulation of the presynaptic Ca2+ sensor for vesicle fusion," Nature, vol. 435, pp. 497–501, May 2005.
[16] B. Katz, The Release of Neural Transmitter Substances. Springfield, IL: Thomas, 1969, pp. 1–60.
[17] J. Del Castillo and B. Katz, "Quantal components of the end-plate potential," J. Physiol., vol. 124, pp. 560–573, Jan. 1954.
[18] D. Sulzer and R. Edwards, "Vesicles: Equal in neurotransmitter concentration but not in volume," Neuron, vol. 28, no. 1, pp. 5–7, 2000.
[19] X.-S. Wu, L. Xue, R. Mohan, K. Paradiso, K. D. Gillis, and L.-G. Wu, "The origin of quantal size variation: Vesicular glutamate concentration plays a significant role," J. Neurosci., vol. 27, no. 11, pp. 3046–3056, 2007.
[20] L. W. Nagel and R. A. Rohrer, "Computer analysis of nonlinear circuits, excluding radiation," IEEE J. Solid-State Circuits, vol. 6, no. 4, pp. 166–182, Aug. 1971.
[21] J. Alspector, B. Gupta, and R. B. Allen, "Performance of a stochastic learning microchip," in Neural Information Processing Systems. San Mateo, CA: Morgan Kaufmann, 1988, pp. 748–760.
[22] J. Joshi, J. Zhang, C. Wang, C.-C. Hsu, A. C. Parker, C. Zhou, and U. Ravishankar, "A biomimetic fabricated carbon nanotube synapse for prosthetic applications," in Proc. Life Sci. Syst. Appl. Workshop, Apr. 2011, pp. 139–142.
[23] M. Mahvash and A. C. Parker, "A memristor SPICE model for designing memristor circuits," in Proc. 53rd IEEE Int. Midwest Symp. Circuits Syst., Aug. 2010, pp. 989–992.
[24] G. S. Snider, "Spike-timing-dependent learning in memristive nanodevices," in Proc. IEEE Int. Symp. Nanoscale Archit., Anaheim, CA, Jun. 2008, pp. 85–92.
[25] Y. V. Pershin and M. Di Ventra, "Experimental demonstration of associative memory with memristive neural networks," Neural Netw., vol. 23, no. 7, pp. 881–886, Sep. 2010.
[26] Y. V. Pershin and M. Di Ventra, "Memory effects in complex materials and nanoscale systems," Adv. Phys., vol. 60, no. 2, pp. 145–227, 2011.
[27] A. Cao, X. Zhang, C. Xu, J. Liang, D. Wu, and B. Wei, "Carbon nanotube dendrites: Availability and their growth model," Mater. Res. Bull., vol. 36, nos. 13–14, pp. 2519–2523, Nov. 2001.
[28] A. Bianco, K. Kostarelos, C. D. Partidos, and M. Prato, "Biomedical applications of functionalised carbon nanotubes," Chem. Commun., no. 5, pp. 571–577, 2005.
[29] H. Chaoui, "CMOS analogue adder," Electron. Lett., vol. 31, no. 3, pp. 180–181, Feb. 1995.
[30] A. C. Parker, A. K. Friesz, and A. Pakdaman, "Toward a nanoscale artificial cortex," in Proc. Int. Conf. Comput. Nanotechnol., Jun. 2006, pp. 227–241.
[31] X. Liu, S. Han, and C. Zhou, "A novel nanotube-on-insulator (NOI) approach toward nanotube devices," Nano Lett., vol. 6, no. 1, pp. 34–39, 2006.
[32] K. Natori, Y. Kimura, and T. Shimizu, "Characteristics of a carbon nanotube field-effect transistor analyzed as a ballistic nanowire field-effect transistor," J. Appl. Phys., vol. 97, no. 3, pp. 034306-1–034306-7, 2005.
[33] J. Guo, M. Lundstrom, and S. Datta, "Performance projections for ballistic carbon nanotube field-effect transistors," Appl. Phys. Lett., vol. 80, no. 17, pp. 3192–3194, Apr. 2002.
[34] P. Burke, "Carbon nanotube devices for GHz to THz applications," in Proc. Int. Semicond. Device Res. Symp., 2003, pp. 314–315.
[35] A. Naeemi, R. Sarvari, and J. D. Meindl, "Performance comparison between carbon nanotube and copper interconnects for gigascale integration (GSI)," IEEE Electron Device Lett., vol. 26, no. 2, pp. 84–86, Feb. 2005.
[36] A. Raychowdhury, S. Mukhopadhyay, and K. Roy, "A circuit-compatible model of ballistic carbon nanotube field-effect transistors," IEEE Trans. Comput.-Aided Design Integr. Circuits Syst., vol. 23, no. 10, pp. 1411–1420, Oct. 2004.
[37] C. Dwyer, M. Cheung, and D. Sorin, "Semi-empirical SPICE models for carbon nanotube FET logic," in Proc. 4th IEEE Conf. Nanotechnol., Aug. 2004, pp. 386–388.
[38] J. Deng and H.-S. P. Wong, "A circuit-compatible SPICE model for enhancement mode carbon nanotube field effect transistors," in Proc. Int. Conf. Simul. Semicond. Devices Process., Monterey, CA, Sep. 2006, pp. 166–169.
[39] H. Chen, S. Saïghi, L. Buhry, and S. Renaud, "Real-time simulation of biologically-realistic stochastic neurons in VLSI," IEEE Trans. Neural Netw., vol. 21, no. 9, pp. 1511–1517, Sep. 2010.
[40] A. C. Parker, J. Joshi, C.-C. Hsu, and N. A. D. Singh, "A carbon nanotube implementation of temporal and spatial dendritic computations," in Proc. 51st IEEE Midwest Symp. Circuits Syst., Aug. 2008, pp. 818–821.
[41] J. Joshi, C. Hsu, A. C. Parker, and P. Deshmukh, "A carbon nanotube cortical neuron with excitatory and inhibitory dendritic computations," in Proc. IEEE/NIH Life Sci. Syst. Appl. Workshop, Apr. 2009, pp. 133–136.
[42] J. Joshi, A. C. Parker, and C.-C. Hsu, "A carbon nanotube cortical neuron with spike-timing-dependent plasticity," in Proc. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., Sep. 2009, pp. 1651–1654.
[43] K. Boahen, "Neuromorphic microchips," Sci. Amer., vol. 292, pp. 56–63, May 2005.
[44] K. A. Zaghloul and K. Boahen, "Optic nerve signals in a neuromorphic chip II: Testing and results," IEEE Trans. Biomed. Eng., vol. 51, no. 4, pp. 667–675, Apr. 2004.
[45] E. Farquhar and P. Hasler, "Bio-physically inspired silicon neuron," IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 52, no. 3, pp. 477–488, Mar. 2005.
[46] K. M. Hynna and K. Boahen, "Neuronal ion-channel dynamics in silicon," in Proc. IEEE Int. Symp. Circuits Syst., May 2006, pp. 21–24.
[47] B. Liu and J. F. Frenzel, "A CMOS neuron for VLSI implementation of pulsed neural networks," in Proc. IEEE 28th Annu. Conf. Ind. Electron. Soc., Sevilla, Spain, Nov. 2002, pp. 3182–3185.
[48] H.-Y. Hsieh and K.-T. Tang, "VLSI implementation of a bio-inspired olfactory spiking neural network," IEEE Trans. Neural Netw. Learn. Syst., vol. 23, no. 7, pp. 1065–1073, Jul. 2012.
[49] M. Mahvash and A. C. Parker, "Modelling intrinsic ion-channel and synaptic variability in a cortical neuromorphic circuit," in Proc. IEEE Biomed. Circuits Syst. Conf., Nov. 2011, pp. 69–72.
[50] P. Kara, P. Reinagel, and R. C. Reid, "Low response variability in simultaneously recorded retinal, thalamic, and cortical neurons," Neuron, vol. 27, no. 3, pp. 635–646, Sep. 2000.


[51] G. A. Cecchi, M. Sigman, J. M. Alonso, L. Martinez, D. R. Chialvo, and M. O. Magnasco, "Noise in neurons is message dependent," Proc. Nat. Acad. Sci. USA, vol. 97, no. 10, pp. 5557–5561, 2000.
[52] R. F. Galan, G. B. Ermentrout, and N. N. Urban, "Optimal time scale for spike-time reliability: Theory, simulations and experiments," J. Neurophysiol., vol. 99, no. 1, pp. 277–283, 2008.
[53] D. A. Butts, C. Weng, J. Jin, C. I. Yeh, N. A. Lesica, J. M. Alonso, and G. B. Stanley, "Temporal precision in the neural code and the timescales of natural vision," Nature, vol. 449, pp. 92–95, Sep. 2007.
[54] G. B. Ermentrout, R. F. Galán, and N. N. Urban, "Reliability, synchrony and noise," Trends Neurosci., vol. 31, no. 8, pp. 428–434, 2008.
[55] S. Hardin, "Science shepherd biology," in Science Shepherd. Scott Hardin, MD: Ohana Life Press, 2010.
[56] G. Evans, J. Goes, A. Stieger-Garcao, M. D. Ortigueira, N. Paulino, and J. S. Lopes, "Low-voltage low-power CMOS analogue circuits for Gaussian and uniform noise generation," in Proc. Int. Symp. Circuits Syst., 2003, pp. 145–148.
[57] T. Stojanovski and L. Kocarev, "Chaos based random number generators part I: Analysis," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 48, no. 3, pp. 281–288, Mar. 2001.
[58] T. Stojanovski and L. Kocarev, "Chaos based random number generators part II: Practical realization," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 48, no. 3, pp. 382–385, Mar. 2001.
[59] C. Wang, J. Huang, H. Cheng, and R. Hu, "Switched-current 3-bit CMOS 4.0 MHz wideband random signal generator," IEEE J. Solid-State Circuits, vol. 40, no. 6, pp. 1360–1365, Jun. 2005.
[60] M. Delgado-Restituto, F. Medeiro, and A. Rodriguez-Vazquez, "Nonlinear switched-current CMOS IC for random signal generation," Electron. Lett., vol. 29, no. 25, pp. 2190–2191, Dec. 1993.
[61] M. Mahvash, "Emulating variability in the behaviour of artificial central neurons," Ph.D. dissertation, Faculty USC Graduate School, Univ. Southern California, Los Angeles, May 2012.


Mohammad Mahvash received the B.Sc. degree from the Isfahan University of Technology, Isfahan, Iran, and the M.Sc. degree from the Sharif University of Technology, Tehran, Iran, in 2000 and 2002, respectively, both in electrical engineering, and the M.Sc. and Ph.D. degrees in electrical engineering from the University of Southern California, Los Angeles, in 2008 and 2012, respectively. He is currently with DirecTV, El Segundo, CA, as a Principal Broadcast Systems Engineer.

Alice C. Parker (F'11) received the B.S.E.E. and Ph.D. degrees from North Carolina State University, Raleigh, and the M.S.E.E. degree from Stanford University, Stanford, CA. She is currently a Professor of electrical engineering with the Viterbi School of Engineering, University of Southern California, Los Angeles, where she is involved in research and teaching and was the Vice Provost of Research and Graduate Studies and the Dean of Graduate Studies. She was previously a Faculty Member with Carnegie Mellon University. She was involved in research on design automation and behavioral synthesis. She is currently involved in research on electronics implementing structural plasticity, dendritic computations and plasticity, retinal electronic circuits, and binocular vision, and has collaborated to produce a working carbon nanotube synapse circuit. Her current interests include the development of neural circuits using nanotechnological models of circuit elements. Dr. Parker was a recipient of the ASEE Sharon Keillor Award and the Northrop Grumman Teaching Award.
