Visualization Techniques in SNN Simulators Hammouda Elbez, Kamel Benhaoua, Philippe Devienne, Pierre Boulet
To cite this version: Hammouda Elbez, Kamel Benhaoua, Philippe Devienne, Pierre Boulet. Visualization Techniques in SNN Simulators. 3rd International Conference on Multimedia Information Processing, CITIM'2018, Oct 2018, Mascara, Algeria. hal-02887481.

HAL Id: hal-02887481
https://hal.archives-ouvertes.fr/hal-02887481
Submitted on 2 Jul 2020

Hammouda Elbez, Kamel Benhaoua
University of Oran1 Ahmed Ben Bella -LAPECI-, Mustapha Stambouli University
Mascara, Algeria
{Elbez.Hammouda, [email protected]}

Philippe Devienne, Pierre Boulet
Univ. Lille, CNRS, Centrale Lille, UMR 9189 - CRIStAL - Centre de Recherche en Informatique Signal et Automatique de Lille
F-59000 Lille, France
{Philippe.Devienne, [email protected]}

Abstract—Neural networks are one of the best-known artificial intelligence techniques. These networks have evolved considerably since their first proposal, through the three known generations. On the other hand, neuromorphic architectures are a very promising way to overcome the limitations of the Von Neumann architecture and the end of Moore's law. Neuromorphic architectures can lead to a huge decrease in energy consumption, mostly due to the colocation of computation and storage. These architectures implement spiking neural networks (SNNs), the third generation of artificial neural networks, which model biological neural networks very finely. One of the main problems that prevents us from optimizing SNNs and getting the best performance out of them, and as a result from developing and producing neuromorphic architectures, is the lack of a clear and complete understanding of their behavior, especially of what makes learning efficient. One approach to answer that is to analyze simulation traces of such networks and architectures through visualization. In this paper, we propose a comparison of the visualization techniques offered by SNN simulators for analysis purposes.

Index Terms—Neural Network, Data Science, Visualization, Simulation, Analysis, Neuromorphic Architectures, SNN.

I. INTRODUCTION

In recent years, neural networks have had a big impact in many fields, such as image processing and decision-making systems. Being able to understand and model such networks was key to achieving the remarkable performance observed in the second generation of neural networks (Convolutional Neural Networks, ConvNets). This performance comes at the price of a huge energy consumption, especially during learning. The brain, on the contrary, is very power efficient, so researchers have tried to design artificial neural networks whose behavior is more precisely modeled after biological neural networks. These Spiking Neural Networks (SNNs) are indeed much more power efficient than ConvNets and are suitable for electronic implementation, in the so-called neuromorphic architectures.

Before any implementation, simulation is a very important step, and for this type of network many simulators have been created and used over the years to find and test a suitable network configuration (or to simulate biological SNNs). These simulators also offer a variety of visualization techniques that can lead to very interesting observations and improvements in network performance; the combination of these techniques, and the choice of the best one to use, can have an important influence on the user experience, on the information that can be extracted, and on the ability to improve the performance of neuromorphic architectures.

The aim of this work is to present the visualization techniques offered by the known SNN simulators and to discuss their ability to produce visualizations suitable for analyzing the behavior of the simulated networks in order to improve neuromorphic architectures. This paper is structured as follows. After a brief introduction to neural networks and SNNs, we present the state of the art of SNN simulators. We then compare these simulators from the technical and visualization points of view and discuss our findings.

II. NEURAL NETWORKS

Neural networks are an architecture inspired by our brain: they contain a large number of simple interconnected units that simulate neurons. Each unit (neuron) has multiple input and output neurons. The first generation of neural networks was only able to solve linear problems; it is represented by formal mathematical models such as McCulloch and Pitts neurons [1], the Perceptron, the Hopfield network and the Boltzmann machine. The second generation, in which multi-layer neural networks were introduced, was able to solve non-linear problems through the use of continuous activation functions, for example in the MultiLayer Perceptron (MLP).

Being inspired by our brain, these networks also exhibit the phenomenon of learning. In the life cycle of every neural network there are mainly two phases: the learning (tuning) phase and the use (production) phase. The learning phase consists of tuning the network parameters on a training dataset in order to get the best configuration with the lowest error rate; the use phase is the actual use of the network on real data. Learning in neural networks can be classified into more than one type: supervised learning, unsupervised learning, semi-supervised learning and reinforcement learning.

Supervised learning is based on a labeled dataset: every entry is composed of two pieces of information, the actual data and the label that describes it. Unsupervised learning does not use a labeled dataset; the network learns by itself to classify the inputs into categories. This type is very interesting in real-life scenarios, because producing a labeled dataset is very costly, and a network working with this type of learning adapts more easily to many use cases, but it may need more training time than supervised learning to achieve good performance. Semi-supervised learning exists because labeling a dataset is often very consuming in time and resources; a hybrid dataset (partly labeled, partly unlabeled) can reduce the drawbacks known in the supervised and unsupervised types of learning (such as time and resource consumption). Reinforcement learning is a method used while the network is in the use phase: the network keeps being tuned to increase its performance, using the information it is actively processing.

Fig. 1. Neuron.

A. Neuron

A neuron is the principal component of a neural network; it has many input and output neurons. The functionality of a single neuron is simple and useless on its own, but the global activity of millions of connected neurons is what makes a network powerful. Scientists in biology, electronics and computer science have tried to understand this component because of its advantages and abilities. From the biological point of view, as shown in Fig. 1, a neuron is composed of: the membrane, considered the core of the neuron, where the ionic activity happens; the synapse, the interaction space between two neurons through which information is transferred, using liquid channels and ions for transportation; the dendrite, the element responsible for transmitting the information received at the synapse into the membrane; and the axon, the output support used by the neuron to send out information, which can vary in length from a few millimeters to several meters.

In the electronics field, scientists have tried to model such a component for use in hardware because of the advantages it offers. These architectures, called neuromorphic, are able to process a big amount of data with a small number of spikes; they emulate the data processing, plasticity and learning of our brain. A spiking neural network is composed of a big number of neurons connected by synapses (millions of neurons and even more synapses). Communication between those neurons is made by spikes, which are considered the language of communication; information is coded inside a single spike, and many neural information encodings have been proposed based on what is known in biology [5]. This type of network introduces the notion of time, which is considered a very important factor for performance and for the information transfer process; some scientists have even proposed that time can actually hold information in this type of network and be considered an information coding technique [6]. One of the reasons why this type of network has attracted attention lately is the discovery of the memristor: this element is capable of emulating the functionality of the synapse, and it helps implement such networks in hardware, which was not easy before [7].
III. SNN SIMULATORS

Simulation