EBookClubs

Read Books & Download eBooks Full Online


Book Improving Associative Memory in a Network of Spiking Neurons

Download or read book Improving Associative Memory in a Network of Spiking Neurons written by Russell I. Hunter and published by . This book was released in 2011. Available in PDF, EPUB and Kindle. Book excerpt: In this thesis we use computational neural network models to examine the dynamics and functionality of the CA3 region of the mammalian hippocampus. The emphasis of the project is to investigate how the dynamic control structures provided by inhibitory circuitry and cellular modification may affect the CA3 region during the recall of previously stored information. The CA3 region is commonly thought to work as a recurrent auto-associative neural network because of its neurophysiological characteristics, such as recurrent collaterals, strong and sparse synapses from external inputs, and plasticity between coactive cells. Associative memory models have been developed using various configurations of mathematical artificial neural networks, first developed over 40 years ago. Within these models, information is stored via changes in the strength of connections between simplified two-state model neurons. These memories can be recalled when a noisy or partial cue is instantiated on the net. The type of information such models can store is quite limited, owing to the simplicity of the hard-limiting nodes with their binary activation threshold. We build a much more biologically plausible model with complex spiking cell models and realistic synaptic properties between cells. This model is based upon some of the many details we now know of the neuronal circuitry of the CA3 region. We implemented the model in computer software using Neuron and Matlab and tested it by running simulations of storage and recall in the network. By building this model we gain new insights into how different types of neurons, and the complex circuits they form, actually work.
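The two-state associative nets the excerpt refers to can be sketched in a few lines. The following is a minimal illustration of a Willshaw-style binary auto-associative memory with a clipped Hebbian rule and recall from a partial cue; it is a generic textbook construction, not the author's CA3 model, and the network sizes and thresholding scheme are assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 100, 5, 10  # neurons, stored patterns, active units per pattern

# Store M sparse binary patterns with a clipped Hebbian rule (Willshaw-style):
# a connection is switched on whenever its pre- and post-units are coactive.
patterns = np.zeros((M, N), dtype=int)
for p in patterns:
    p[rng.choice(N, K, replace=False)] = 1

W = np.zeros((N, N), dtype=int)
for p in patterns:
    W |= np.outer(p, p)
np.fill_diagonal(W, 0)  # no self-connections

# Recall from a partial cue: keep half of a pattern's active units, then
# threshold each unit's summed input at the number of active cue units.
target = patterns[0]
cue = target.copy()
cue[np.flatnonzero(target)[K // 2:]] = 0  # drop half the active units

dendritic_sum = W @ cue
recalled = ((dendritic_sum >= cue.sum()) | (cue == 1)).astype(int)
```

With sparse patterns the thresholded sum reactivates the stored pattern's remaining units while spurious units, which rarely connect to every cue unit, stay silent.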
The mammalian brain consists of complex resistive-capacitive electrical circuitry formed by the interconnection of large numbers of neurons. A principal cell type in the cortex is the pyramidal cell, the main information processor in our neural networks. Pyramidal cells are surrounded by diverse populations of interneurons, proportionally far fewer in number, which form connections with pyramidal cells and with other inhibitory cells. By building detailed computational models of recurrent neural circuitry we explore how these microcircuits of interneurons control the flow of information through pyramidal cells and regulate the efficacy of the network. We also explore the effect of cellular modification due to neuronal activity, and of incorporating spatially dependent connectivity, on the network during recall of previously stored information. In particular we implement a spiking neural network proposed by Sommer and Wennekers (2001). We consider methods for improving associative memory recall inspired by the work of Graham and Willshaw (1995), who applied mathematical transforms to an artificial neural network to improve its recall quality. The networks tested contain either 100 or 1000 pyramidal cells, with 10% connectivity, a partial cue instantiated, and a global pseudo-inhibition. We investigate three methods. Firstly, applying localised disynaptic inhibition, which proportionalises the excitatory postsynaptic potentials and provides a fast-acting reversal potential; this should reduce the variability in signal propagation between cells and provide further inhibition to help synchronise the network activity.
Secondly, implementing a persistent sodium channel in the cell body, which acts to non-linearise the activation threshold: beyond a given membrane potential the amplitude of the excitatory postsynaptic potential (EPSP) is boosted, pushing cells that receive slightly more excitation (most likely high units) over the firing threshold. Finally, implementing spatial characteristics of the dendritic tree allows a greater probability that a modified synapse exists after 10% random connectivity has been applied throughout the network. We apply spatial characteristics by scaling the conductance weights of excitatory synapses, simulating the loss of potential in synapses found in the outer dendritic regions due to increased resistance. To further increase the biological plausibility of the network we remove the pseudo-inhibition and apply realistic basket cell models in differing configurations for a global inhibitory circuit. The networks are configured with: a single basket cell providing feedback inhibition; 10% basket cells providing feedback inhibition, where 10 pyramidal cells connect to each basket cell; and finally 100% basket cells providing feedback inhibition. These networks are compared and contrasted for recall quality and effect on network behaviour. We have found promising results from applying biologically plausible recall strategies and network configurations, which suggests that the roles of inhibition and cellular dynamics are pivotal in learning and memory.
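As a rough illustration of the third method, one can scale each excitatory synaptic conductance by its distance from the soma. This is a hypothetical sketch, not the thesis code; the exponential attenuation and the length constant are assumptions standing in for the resistive losses in outer dendritic regions described above:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical illustration: attenuate each excitatory synaptic conductance
# according to its path distance from the soma, standing in for the increased
# axial resistance of outer dendritic regions.
n_synapses = 20
base_g = np.full(n_synapses, 0.5e-3)            # uniform peak conductance (uS)
distance = rng.uniform(0.0, 400.0, n_synapses)  # path distance to soma (um)
length_constant = 200.0                         # assumed attenuation length (um)

scaled_g = base_g * np.exp(-distance / length_constant)
```

Distal synapses end up with smaller effective conductances than proximal ones, so the somatic impact of a synapse depends on where it lands on the dendritic tree.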

Book A Spiking Bidirectional Associative Memory Neural Network

Download or read book A Spiking Bidirectional Associative Memory Neural Network written by Melissa Johnson and published by . This book was released in 2021. Available in PDF, EPUB and Kindle. Book excerpt: Spiking neural networks (SNNs) are a more biologically realistic model of the brain than traditional analog neural networks and should therefore be better for modelling certain functions of the human brain. This thesis uses the concept of deriving an SNN from an accepted non-spiking neural network via analysis and modification of the transmission function. We investigate this process to determine if and how the modifications can be made to minimize loss of information during the transition from non-spiking to spiking, while retaining the positive features and functionality of the non-spiking network. By comparing combinations of spiking neuron models and networks against each other, we determined that replacing the transmission function with a similar neural model is the easiest way to create a spiking neural network that works comparatively well. This similarity between transmission function and neuron model allows for easier parameter selection, a key component in getting a functioning SNN. The parameters all play different roles, but for the most part, parameters that speed up spiking, such as large resistance values or small rheobases, generally help the accuracy of the network. But the network is still incomplete as a spiking neural network, since this conversion is often performed only after learning has been completed in analog form. The neuron model and subsequent network developed here are the initial steps in creating a bidirectional SNN that handles hetero-associative and auto-associative recall and can be switched easily between spiking and non-spiking with minimal to no loss of data.
By tying everything to the transmission function (both the non-spiking learning rule, which in our case uses the transmission function, and the neural model of the SNN), we are able to create a functioning SNN. Without this similarity, we find that creating an SNN is much more complicated and requires much more work in parameter optimization.
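The role the excerpt attributes to resistance and rheobase can be illustrated with a standard leaky integrate-and-fire neuron (a generic model, not necessarily the one used in the thesis); all parameter values here are assumptions for the example:

```python
def lif_spike_count(current, R=10.0, tau=20.0, v_rest=-65.0, v_thresh=-50.0,
                    dt=0.1, t_max=200.0):
    """Count spikes of a leaky integrate-and-fire neuron driven by a constant
    input current, using forward-Euler integration. Larger membrane resistance
    R lowers the rheobase (v_thresh - v_rest) / R, so the same current
    produces spikes sooner and more often."""
    v = v_rest
    spikes = 0
    for _ in range(int(t_max / dt)):
        v += dt * (-(v - v_rest) + R * current) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_rest  # reset after a spike
    return spikes
```

For example, `lif_spike_count(2.0, R=20.0)` fires more than `lif_spike_count(2.0, R=10.0)`, while a subrheobase current (`lif_spike_count(1.0, R=10.0)`) never fires: this is the sense in which large resistances and small rheobases "speed up spiking".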

Book Dynamic Brain - from Neural Spikes to Behaviors

Download or read book Dynamic Brain from Neural Spikes to Behaviors written by Maria Marinaro and published by Springer Science & Business Media. This book was released on 2008-10-23 with total page 149 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book is devoted to graduate students and researchers with different scientific backgrounds (including physics, mathematics, biology, neuroscience, etc.) who wish to learn brain science beyond the boundaries of their own fields. The volume presents 12 thoroughly revised tutorial papers based on lectures given by leading researchers at the 12th International Summer School on Neural Networks in Erice, Italy, in December 2007. The 12 invited and contributed papers provide primarily high-level tutorial coverage of the fields related to neural dynamics, reporting recent experimental and theoretical results on the role of collective dynamics in hippocampal and parahippocampal regions and in the mammalian olfactory system. The book is divided into topical sections on hippocampus and neural oscillations, dynamics in the olfactory system and behaviour, correlation structure of spike trains, and neural network theories of associative memory.

Book Artificial Neural Networks - ICANN 2008

Download or read book Artificial Neural Networks ICANN 2008 written by Věra Kůrková and published by Springer Science & Business Media. This book was released in 2008 with total page 1012 pages. Available in PDF, EPUB and Kindle. Book excerpt: This two-volume set, LNCS 5163 and LNCS 5164, constitutes the refereed proceedings of the 18th International Conference on Artificial Neural Networks, ICANN 2008, held in Prague, Czech Republic, in September 2008. The 200 revised full papers presented were carefully reviewed and selected from more than 300 submissions. The second volume is devoted to pattern recognition and data analysis, hardware and embedded systems, computational neuroscience, connectionistic cognitive science, neuroinformatics and neural dynamics. It also contains papers from two special sessions ("Coupling, Synchronies, and Firing Patterns: from Cognition to Disease" and "Constructive Neural Networks") and two workshops ("New Trends in Self-Organization and Optimization of Artificial Neural Networks" and "Adaptive Mechanisms of the Perception-Action Cycle").

Book Implementation of Associative Memory with Online Learning Into a Spiking Neural Network on Neuromorphic Hardware

Download or read book Implementation of Associative Memory with Online Learning Into a Spiking Neural Network on Neuromorphic Hardware written by Michael James Hampo and published by . This book was released in 2020 with total page 46 pages. Available in PDF, EPUB and Kindle. Book excerpt: Implementing cognitive algorithms on robots is one potential direction to realize autonomous artificial agents. There is an effort to push robotics and artificial intelligence into many aspects of daily life. An important step in this process is leveraging concepts known to work from human cognition to improve the performance of robotic systems. Spiking Neural Networks (SNNs) allow these computational models to be instantiated in a low size, weight, and power (SWaP) form factor due to the biological efficiencies they approximate. This paper presents an associative memory in the form of an SNN, an application of the associative memory, and some performance benchmarking. The model is created using a neural network simulator and run both on a low-SWaP CPU and on Intel's neuromorphic processor Loihi, an artificial intelligence accelerator highly optimized for spiking neural algorithms. In addition, the model is deployed on a mobile robotic platform that explores the real world and uses online learning to make associations. When the model was run on Loihi, both the overall power usage and the run time of the simulation decreased compared to the low-SWaP CPU, demonstrating the benefit of the neuromorphic hardware.

Book Hippocampal Microcircuits

    Book Details:
  • Author : Vassilis Cutsuridis
  • Publisher : Springer Science & Business Media
  • Release : 2010-02-01
  • ISBN : 1441909966
  • Pages : 619 pages

Download or read book Hippocampal Microcircuits written by Vassilis Cutsuridis and published by Springer Science & Business Media. This book was released on 2010-02-01 with total page 619 pages. Available in PDF, EPUB and Kindle. Book excerpt: Rich in detail, Hippocampal Microcircuits: A Computational Modeler’s Resource Book provides succinct and focused reviews of experimental results. It is an unparalleled resource of data and methodology that will be invaluable to anyone wishing to develop computational models of the microcircuits of the hippocampus. The editors have divided the material into two thematic areas. Covering the subject’s experimental background, leading neuroscientists discuss the morphological, physiological and molecular characteristics as well as the connectivity and synaptic properties of the various cell types found in the hippocampus. Here, the behavior-related ensemble activity of morphologically identified neurons in anesthetized and freely moving animals leads to insights into the functions of hippocampal areas. In the second section, on computational analysis, computational neuroscientists present models of hippocampal microcircuits at various levels of detail, including the single-cell and network levels. A full chapter is devoted to the single-neuron and network simulation environments currently used by computational neuroscientists in developing their models. In addition, the chapters identify outstanding questions and areas in need of further clarification that will guide future research by computational neuroscientists.

Book Associative Memory Cells: Basic Units of Memory Trace

Download or read book Associative Memory Cells Basic Units of Memory Trace written by Jin-Hui Wang and published by Springer Nature. This book was released on 2019-09-10 with total page 275 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book focuses on associative memory cells and their working principles, which can be applied to associative memories and memory-relevant cognitions. Providing comprehensive diagrams, it presents the author's personal perspectives on pathology and therapeutic strategies for memory deficits in patients suffering from neurological diseases and psychiatric disorders. Associative learning is a common approach to acquire multiple associated signals, including knowledge, experiences and skills from natural environments or social interaction. The identification of the cellular and molecular mechanisms underlying associative memory is important in furthering our understanding of the principles of memory formation and memory-relevant behaviors as well as in developing therapeutic strategies that enhance memory capacity in healthy individuals and improve memory deficit in patients suffering from neurological disease and psychiatric disorders. Although a series of hypotheses about neural substrates for associative memory has been proposed, numerous questions still need to be addressed, especially the basic units and their working principle in engrams and circuits specific for various memory patterns. This book summarizes the developments concerning associative memory cells reported in current and past literature, providing a valuable overview of the field for neuroscientists, psychologists and students.

Book Parallel Models of Associative Memory

Download or read book Parallel Models of Associative Memory written by Geoffrey E. Hinton and published by Psychology Press. This book was released on 2014-02-25 with total page 350 pages. Available in PDF, EPUB and Kindle. Book excerpt: This update of the 1981 classic on neural networks includes new commentaries by the authors that show how the original ideas are related to subsequent developments. As researchers continue to uncover ways of applying the complex information processing abilities of neural networks, they give these models an exciting future which may well involve revolutionary developments in understanding the brain and the mind -- developments that may allow researchers to build adaptive intelligent machines. The original chapters show where the ideas came from and the new commentaries show where they are going.

Book Artificial Neural Networks - ICANN 2008

Download or read book Artificial Neural Networks ICANN 2008 written by Vera Kurkova-Pohlova and published by Springer. This book was released on 2008-09-08 with total page 1053 pages. Available in PDF, EPUB and Kindle. Book excerpt: This two-volume set, LNCS 5163 and LNCS 5164, constitutes the refereed proceedings of the 18th International Conference on Artificial Neural Networks, ICANN 2008, held in Prague, Czech Republic, in September 2008. The 200 revised full papers presented were carefully reviewed and selected from more than 300 submissions. The first volume contains papers on the mathematical theory of neurocomputing, learning algorithms, kernel methods, statistical learning and ensemble techniques, support vector machines, reinforcement learning, evolutionary computing, hybrid systems, self-organization, control and robotics, signal and time series processing, and image processing.

Book Spike timing dependent plasticity

Download or read book Spike timing dependent plasticity written by Henry Markram and published by Frontiers E-books. This book was released with total page 575 pages. Available in PDF, EPUB and Kindle. Book excerpt: Hebb's postulate provided a crucial framework to understand synaptic alterations underlying learning and memory. Hebb's theory proposed that neurons that fire together also wire together, which provided the logical framework for the strengthening of synapses. Weakening of synapses was, however, addressed only by "not being strengthened", and it was only later that the active decrease of synaptic strength was introduced, through the discovery of long-term depression caused by low-frequency stimulation of the presynaptic neuron. In 1994, it was found that the precise relative timing of pre- and postsynaptic spikes determined not only the magnitude, but also the direction of synaptic alterations when two neurons are active together. Neurons that fire together may therefore not necessarily wire together if the precise timing of the spikes involved is not tightly correlated. In the subsequent 15 years, Spike Timing Dependent Plasticity (STDP) has been found in multiple brain regions and in many different species. The size and shape of the time windows in which positive and negative changes can be made vary across brain regions, but the core principle of spike-timing-dependent change remains. A large number of theoretical studies have explored the computational function of this driving principle, and STDP algorithms have become the main learning algorithm when modeling neural networks. This Research Topic brings together key experimental and theoretical research on STDP.
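The timing dependence described above is commonly modeled with a pair-based exponential STDP window. The following sketch uses a generic textbook form with assumed parameter values, not a rule from any particular chapter of this volume:

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window: delta_t = t_post - t_pre in ms.
    Pre-before-post (delta_t > 0) potentiates the synapse;
    post-before-pre (delta_t < 0) depresses it; the magnitude
    decays exponentially as the spikes move apart in time."""
    dt = np.asarray(delta_t, dtype=float)
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))
```

The slight asymmetry (`a_minus > a_plus`) is a common modeling choice that keeps weights from growing without bound when spike pairings are uncorrelated.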

Book Computational Models of Brain and Behavior

Download or read book Computational Models of Brain and Behavior written by Ahmed A. Moustafa and published by John Wiley & Sons. This book was released on 2017-09-11 with total page 588 pages. Available in PDF, EPUB and Kindle. Book excerpt: A comprehensive Introduction to the world of brain and behavior computational models This book provides a broad collection of articles covering different aspects of computational modeling efforts in psychology and neuroscience. Specifically, it discusses models that span different brain regions (hippocampus, amygdala, basal ganglia, visual cortex), different species (humans, rats, fruit flies), and different modeling methods (neural network, Bayesian, reinforcement learning, data fitting, and Hodgkin-Huxley models, among others). Computational Models of Brain and Behavior is divided into four sections: (a) Models of brain disorders; (b) Neural models of behavioral processes; (c) Models of neural processes, brain regions and neurotransmitters, and (d) Neural modeling approaches. It provides in-depth coverage of models of psychiatric disorders, including depression, posttraumatic stress disorder (PTSD), schizophrenia, and dyslexia; models of neurological disorders, including Alzheimer’s disease, Parkinson’s disease, and epilepsy; early sensory and perceptual processes; models of olfaction; higher/systems level models and low-level models; Pavlovian and instrumental conditioning; linking information theory to neurobiology; and more. 
The book also:
  • Covers computational approximations to intellectual disability in Down syndrome
  • Discusses computational models of pharmacological and immunological treatment in Alzheimer's disease
  • Examines neural circuit models of the serotonergic system (from microcircuits to cognition)
  • Educates on information theory, memory, prediction, and timing in associative learning
Computational Models of Brain and Behavior is written for advanced undergraduate, Master's and PhD-level students, as well as researchers involved in computational neuroscience modeling research.

Book Neural Engineering

Download or read book Neural Engineering written by Chris Eliasmith and published by MIT Press. This book was released on 2003 with total page 384 pages. Available in PDF, EPUB and Kindle. Book excerpt: A synthesis of current approaches to adapting engineering tools to the study of neurobiological systems.

Book The Role of Synaptic Tagging and Capture for Memory Dynamics in Spiking Neural Networks

Download or read book The Role of Synaptic Tagging and Capture for Memory Dynamics in Spiking Neural Networks written by Jannik Luboeinski and published by . This book was released on 2021-09-02 with total page 201 pages. Available in PDF, EPUB and Kindle. Book excerpt: Memory serves to process and store information about experiences such that this information can be used in future situations. The transfer from transient storage into long-term memory, which retains information for hours, days, and even years, is called consolidation. In brains, information is primarily stored via alteration of synapses, so-called synaptic plasticity. While these changes are at first in a transient early phase, they can be transferred to a late phase, meaning that they become stabilized over the course of several hours. This stabilization has been explained by so-called synaptic tagging and capture (STC) mechanisms. To store and recall memory representations, emergent dynamics arise from the synaptic structure of recurrent networks of neurons. This happens through so-called cell assemblies, which feature particularly strong synapses. It has been proposed that the stabilization of such cell assemblies by STC corresponds to so-called synaptic consolidation, which is observed in humans and other animals in the first hours after acquiring a new memory. The exact connection between the physiological mechanisms of STC and memory consolidation remains, however, unclear. It is equally unknown which influence STC mechanisms exert on further cognitive functions that guide behavior. On timescales of minutes to hours (that means, the timescales of STC) such functions include memory improvement, modification of memories, interference and enhancement of similar memories, and transient priming of certain memories. Thus, diverse memory dynamics may be linked to STC, which can be investigated by employing theoretical methods based on experimental data from the neuronal and the behavioral level. 
In this thesis, we present a theoretical model of STC-based memory consolidation in recurrent networks of spiking neurons, which are particularly suited to reproducing biologically realistic dynamics. Furthermore, we combine the STC mechanisms with calcium dynamics, which have been found to guide the major processes of early-phase synaptic plasticity in vivo. In three included research articles as well as additional sections, we develop this model and investigate how it can account for a variety of behavioral effects. We find that the model enables the robust implementation of the cognitive memory functions mentioned above. The main steps to this are: 1. demonstrating the formation, consolidation, and improvement of memories represented by cell assemblies; 2. showing that neuromodulator-dependent STC can retroactively control whether information is stored in a temporal or rate-based neural code; and 3. examining the interaction of multiple cell assemblies with transient and attractor dynamics in different organizational paradigms. In summary, we demonstrate several ways by which STC controls the late-phase synaptic structure of cell assemblies. Linking these structures to functional dynamics, we show that our STC-based model implements functionality that can be related to long-term memory. Thereby, we provide a basis for the mechanistic explanation of various neuropsychological effects. Keywords: synaptic plasticity; synaptic tagging and capture; spiking recurrent neural networks; memory consolidation; long-term memory
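The early-to-late-phase transfer that STC describes can be caricatured in a few lines. This is a toy illustration under strong simplifying assumptions, not the thesis model (which uses spiking networks and calcium dynamics): an early-phase weight change decays unless the synapse carries a tag and plasticity-related proteins are available, in which case it is captured into a stable late-phase weight.

```python
import numpy as np

def consolidate(early_w, tagged, proteins_available, decay=0.5):
    """Toy sketch of synaptic tagging and capture (STC): early-phase weight
    changes at tagged synapses are captured into a late-phase (stable)
    component when plasticity-related proteins are available; uncaptured
    early-phase changes simply decay."""
    early_w = np.asarray(early_w, dtype=float)
    tagged = np.asarray(tagged, dtype=bool)
    captured = tagged & proteins_available
    late_w = np.where(captured, early_w, 0.0)           # consolidated component
    early_w = np.where(captured, 0.0, early_w * decay)  # the rest decays
    return early_w, late_w
```

Here only the coincidence of a tag and available proteins produces a lasting weight change, which is the core capture idea the excerpt links to memory consolidation.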

Book Principles of Computational Modelling in Neuroscience

Download or read book Principles of Computational Modelling in Neuroscience written by David Sterratt and published by Cambridge University Press. This book was released on 2023-10-05 with total page 553 pages. Available in PDF, EPUB and Kindle. Book excerpt: Learn to use computational modelling techniques to understand the nervous system at all levels, from ion channels to networks.

Book Neuromorphic Cognitive Systems

Download or read book Neuromorphic Cognitive Systems written by Qiang Yu and published by Springer. This book was released on 2017-05-03 with total page 180 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book presents neuromorphic cognitive systems from a learning and memory-centered perspective. It illustrates how to build a system network of neurons to perform spike-based information processing, computing, and high-level cognitive tasks. It is beneficial to a wide spectrum of readers, including undergraduate and postgraduate students and researchers who are interested in neuromorphic computing and neuromorphic engineering, as well as engineers and professionals in industry who are involved in the design and applications of neuromorphic cognitive systems, neuromorphic sensors and processors, and cognitive robotics. The book formulates a systematic framework, from the basic mathematical and computational methods in spike-based neural encoding, through learning in both single and multi-layered networks, to a near-cognitive level composed of memory and cognition. Since the mechanisms by which spiking neurons integrate to form cognitive functions in the brain are little understood, studies of neuromorphic cognitive systems are urgently needed. The topics covered in this book range from the neuronal level to the system level. At the neuronal level, synaptic adaptation plays an important role in learning patterns. In order to perform higher-level cognitive functions such as recognition and memory, spiking neurons with learning abilities are consistently integrated, building a system with encoding, learning and memory functionalities. The book describes these aspects in detail.

Book Adaptive and Natural Computing Algorithms

Download or read book Adaptive and Natural Computing Algorithms written by Ville Kolehmainen and published by Springer. This book was released on 2009-09-30 with total page 645 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book constitutes the thoroughly refereed post-proceedings of the 9th International Conference on Adaptive and Natural Computing Algorithms, ICANNGA 2009, held in Kuopio, Finland, in April 2009. The 63 revised full papers presented were carefully reviewed and selected from a total of 112 submissions. The papers are organized in topical sections on neural networks, evolutionary computation, learning, soft computing, bioinformatics, as well as applications.

Book Special Topics in Information Technology

Download or read book Special Topics in Information Technology written by Barbara Pernici and published by Springer Nature. This book was released on 2019-10-01 with total page 135 pages. Available in PDF, EPUB and Kindle. Book excerpt: This open access book presents nine outstanding doctoral dissertations in Information Technology from the Department of Electronics, Information and Bioengineering, Politecnico di Milano, Italy. Information Technology has always been highly interdisciplinary, as many aspects have to be considered in IT systems. The doctoral studies program in IT at Politecnico di Milano emphasizes this interdisciplinary nature, which is becoming more and more important in recent technological advances, in collaborative projects, and in the education of young researchers. Accordingly, the focus of advanced research is on pursuing a rigorous approach to specific research topics starting from a broad background in various areas of Information Technology, especially Computer Science and Engineering, Electronics, Systems and Controls, and Telecommunications. Each year, more than 50 PhDs graduate from the program. This book gathers the outcomes of the nine best theses defended in 2018-19 and selected for the IT PhD Award. Each of the nine authors provides a chapter summarizing his/her findings, including an introduction, description of methods, main achievements and future work on the topic. Hence, the book provides a cutting-edge overview of the latest research trends in Information Technology at Politecnico di Milano, presented in an easy-to-read format that will also appeal to non-specialists.