
International Journal of Neural Systems, Vol. 24, No. 5 (2014) 1403002 (3 pages)
© World Scientific Publishing Company
DOI: 10.1142/S0129065714030026

INTRODUCTION


Published Online 15 May 2014

This is the third in a series of special issues on Spiking Neural Networks (SNNs) in the International Journal of Neural Systems. The first was on Advances in Spiking Neural Networks and Applications (issue 20(6), December 2010) and the second on Spiking Neural Networks and Applications (issue 22(4), August 2012). Both were well received among researchers. Consequently, we received requests and encouragement from various research groups to continue this effort with a call for another special issue, initially aimed at application-specific SNN models, architectures, learning algorithms, and their implementations. The interest demonstrated by many authors once again proved that SNNs are increasingly receiving attention as computationally powerful and biologically more plausible models of neural information processing.

In this Special Issue, we expressly called for a broad range of topics related to application-specific spiking neural networks. The contributions received describe recent developments and progress, which can be roughly classified into areas such as learning rules, dynamics of cortex models, approximation with SNNs, SNN ensembles, fast implementation of SNNs, and applications of SNNs to optimization. Each included paper received 5–8 reviews and went through two full rounds of review.

A wide range of SNN models close to the biological metaphor has been reported in the literature, and the choice among them depends greatly on the specific application. An important issue associated with all of these models is that learning in, and application of, SNNs is typically more computationally intensive than for classical neural networks. A designer therefore has to weigh the tradeoffs among various factors when developing an SNN-based application.
The SNN model, architecture and learning algorithm must be rationalized for a particular application in order to optimize computational cost. Tremendous effort has been devoted to these issues by different groups of researchers, as is evident from the current literature, but there is still much to explore about SNN models, architectures, and learning for SNN-based applications.

The first paper in this special issue, authored by J. Friedrich, R. Urbanczik and W. Senn, is on code-specific learning rules for populations of spiking neurons. It proposes a generalization of reinforcement learning to different neuronal codes and to multi-valued decision making. Population coding is widely regarded as a key mechanism for reliable decisions, and the proposed learning rules are either weakly or tightly population code-specific. The population code is extended from binary decision making to multiple choice alternatives and continuous action selection. The single-neuron code is likewise generalized from binary coding features to multi-valued and continuous-valued features; specifically, a spike/no-spike code, a spike-count code and a spike-latency code are considered. The learning rules were applied to basic classification and regression tasks, and it is demonstrated how the code-specific rules enable a spiking neural network to select continuous actions in the realistic paradigm of Morris' water maze, boosting performance. Both code-specific learning rules improve in performance with increasing population size, in contrast to standard reinforcement learning. Exploration in the action space greatly increases the speed of learning compared to exploration in the neuron or weight space.

Developing a cortex model has long been desired by researchers. The second paper introduces a cortex model represented by a multi-layer, multi-column network of spiking neurons, which simulates vertical and horizontal inhibition with short-term dynamics.
This research was conducted by B. Strack, K. M. Jacobs and K. J. Cios. The SNN comprises four different types of neurons with known firing patterns, and the model incorporates short-term synaptic plasticity to modify the synaptic strengths. The model has features that are close to biology in terms of (1) biologically accurate laminar and columnar flows of activity, (2) normal function of low-threshold spiking (LTS) and fast spiking (FS) neurons, and (3) the ability to generate different stages of epileptiform activity. Incorporating these unique characteristics allowed examining properties of the cortex that had not previously been modeled computationally; in particular, the model allows modeling of lesioned or malformed cortex.

Sparse approximation is a hypothesized coding strategy in which a population of sensory neurons encodes a stimulus using a minimum number of active neurons. The third paper, authored by S. Shapero, M. Zhu, P. Hasler and C. Rozell, presents optimal sparse approximation using integrate-and-fire neurons. It introduces the spiking locally competitive algorithm (LCA), an SNN of integrate-and-fire neurons that calculates sparse approximations. The firing intensity of the spiking LCA is equivalent to an analog dynamical system that converges on a sparse approximation in finite time, and the spike rate is an unbiased estimate of that intensity with a bounded maximum variance of the estimation error. A network of 128 neurons encoding image patches demonstrates convergence to nearly optimal encodings within 20 ms. When more biophysically realistic neuron parameters are used, the gain function encourages additional sparsity in the encoding relative both to ideal neurons and to digital solvers.

The brain is characterized by diverse processing capabilities, ranging from simple pattern recognition, memory or decision making to filtering for image processing. Understanding the mechanisms of such a varied range of cortical operations remains a fundamental problem in neuroscience. The fourth paper, presented by J. L. Rosselló, V. Canals, A. Oliver and A. Morro, studies the role of synchronized and chaotic spiking neural ensembles in neural information processing.
It investigates processes related to chaotic and synchronized states through an in silico implementation of stochastic spiking neural networks (SSNN). The study reveals that chaotic neural ensembles are excellent transmission and convolution systems, while synchronized cells (representing the ordered states of the brain) are associated with nonlinear computations; mixed synchronized and chaotic states further demonstrate, as evident from the experimental investigations, fast complex pattern-recognition processes. These features substantiate the role of neural synchrony in pattern recognition and in the speed of the biological process. The work suggests that some high-level adaptive mechanisms of the brain, such as Hebbian and non-Hebbian learning rules, can be understood as processes devoted to generating the appropriate clustering of both synchronized and chaotic ensembles. It also shows the possibility of implementing the stochastic neural model in hardware.

Despite the success of numerous models of spiking neural networks, accuracy, dynamics and implementation remain major issues in the spiking neural network paradigm. The fifth paper, by Z. Wang, L. Guo and M. Adjouadi, presents a generalized leaky integrate-and-fire (GLIF) neuron model with a variable leaking resistor and a bias current in order to reproduce accurately the membrane voltage dynamics of a biological neuron; the authors also investigate the possibility of a fast implementation of GLIF. The accuracy of the GLIF model is ensured by adjusting its parameters to the statistical properties of Hodgkin–Huxley model outputs, while the speed is enhanced by introducing a generalized exponential moving average method that converts the parameterized kernel functions into pre-calculated lookup tables based on an analytic solution of the dynamic equations of the GLIF model.
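To make the integrate-and-fire family of models concrete for readers new to the area, the basic leaky integrate-and-fire (LIF) dynamics underlying such work can be sketched as follows. This is a minimal illustrative sketch, not the GLIF model of the paper (which adds a variable leaking resistor and bias current fitted to Hodgkin–Huxley statistics); all parameter values here are hypothetical.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: dv/dt = (-(v - v_rest) + R*I) / tau,
# with a spike emitted and the voltage reset whenever v crosses v_thresh.
# Illustrative sketch only; parameter values are hypothetical (mV, ms, MOhm, nA).

def simulate_lif(current, dt=0.1, tau=10.0, R=1.0,
                 v_rest=-65.0, v_reset=-70.0, v_thresh=-50.0):
    """Integrate the membrane equation over an input-current trace; return spike times (ms)."""
    # Precomputing the per-step decay factor from the analytic solution of the
    # linear dynamics echoes, in spirit, the lookup-table idea: the exponential
    # is evaluated once, not at every time step.
    decay = 2.718281828459045 ** (-dt / tau)
    v = v_rest
    spikes = []
    for step, I in enumerate(current):
        v_inf = v_rest + R * I            # steady-state voltage for this input level
        v = v_inf + (v - v_inf) * decay   # exact update for piecewise-constant input
        if v >= v_thresh:
            spikes.append(step * dt)      # record spike time in ms
            v = v_reset                   # reset membrane after the spike
    return spikes

# A constant suprathreshold current produces regular spiking;
# a zero current leaves the membrane at rest and produces none.
spike_times = simulate_lif(current=[20.0] * 1000)  # 100 ms of constant input
print(len(spike_times), "spikes")
```

For a suprathreshold input, the steady-state voltage `v_inf` lies above threshold, so the neuron fires periodically at a rate set by `tau` and the distance from reset to threshold; this rate-versus-current relation is the gain function referred to above.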
A comparison of the GLIF model's performance with that of the NLIF model shows that the proposed model provides much better calculation accuracy in simulating biological neuron activity.

Membrane systems (also called P systems) are computing models abstracted from the function and structure of the living cell and from the cooperation of cells in tissues, organs, and other populations of cells. Spiking neural P systems (SNPS) are a class of distributed and parallel computing models that incorporate the idea of spiking neurons into P systems. To solve optimization problems, P systems are used to organize the evolutionary operators of heuristic approaches; such methods are named membrane-inspired evolutionary algorithms. The last paper, authored by G. Zhang, H. Rong, F. Neri and M. J. Pérez-Jiménez, presents an application of SNPS to solving combinatorial optimization problems. It proposes an extended spiking neural P system (ESNPS), obtained by introducing probabilistic selection of evolution rules and multi-neuron output, and a family of ESNPS called the optimization spiking neural P system (OSNPS). Extensive experiments on knapsack problems are reported to demonstrate the viability and effectiveness of the proposed approach. Despite the fact that this work is a first attempt in this direction, the results appear promising and competitive when compared with ad hoc optimization algorithms.

In conclusion, this special issue intends to provide its readers with current research trends, problems and research directions in the emerging areas of SNNs. A comprehensive review in Ref. 1 provides a state-of-the-art account of the development of SNNs and insight into their evolution. Applications of SNNs are still largely restricted to benchmark problems. Recent research has highlighted that astrocytes play a key role in the interaction and modulation of neuronal activity.2,3 The brain is a highly modular structure; to exploit modularity, spiking activity must be able to propagate from one module to another.4,5 It is also known that increasing the strengths of recurrent connections shifts neural dynamics into a powerful computational regime.6 Future research must account for new biologically inspired information processing paradigms and move toward much larger scale, more biologically realistic implementations.

The guest editors would like to thank the editor-in-chief, Prof. H. Adeli, for his continuous support and help with this special issue and with the series of special issues. We also acknowledge the expert help of the anonymous reviewers of this special issue.

References

1. S. Ghosh-Dastidar and H. Adeli, Spiking neural networks, Int. J. Neural Syst. 19(4) (2009) 295–308.
2. C. Henneberger, T. Papouin, S. H. R. Oliet and D. A. Rusakov, Long-term potentiation depends on release of D-serine from astrocytes, Nature 463 (2010) 232–236.
3. M. Halassa and P. G. Haydon, Integrated brain circuits: Astrocytic networks modulate neuronal activity and behavior, Annu. Rev. Physiol. 72 (2010) 335–355.
4. A. Kumar, S. Rotter and A. Aertsen, Spiking activity propagation in neuronal networks: Reconciling different perspectives on neural coding, Nat. Rev. Neurosci. 11 (2010) 615–627.
5. S. F. Owen, S. N. Tuncdemir, P. L. Bader, N. N. Tirko, G. Fishell and R. W. Tsien, Oxytocin enhances hippocampal spike transmission by modulating fast-spiking interneurons, Nature 500 (2013) 458–462.
6. V. Goudar and D. V. Buonomano, Useful dynamic regimes emerge in recurrent networks, Nat. Neurosci. 17 (2014) 487–489.

Guest Editors
Nazmul Siddique
Bernard Widrow
Liam Maguire
