ARTIFICIAL NEURAL NETWORKS: A REVIEW OF TRAINING TOOLS

Darío Baptista, Fernando Morgado-Dias
Madeira Interactive Technologies Institute and Centro de Competências de Ciências Exactas e da Engenharia, Universidade da Madeira
Campus da Penteada, 9000-039 Funchal, Madeira, Portugal. Tel: +351 291-705150/1, Fax: +351 291-705199

Abstract: Artificial Neural Networks have become a common solution for a wide variety of problems in many fields. The most frequent solution for their implementation consists of building and training the Artificial Neural Network within a computer. To implement a network efficiently, the user can choose from a large set of software solutions, either commercial or prototypes. Choosing the most convenient solution for the application, according to the network architecture, training algorithm, operating system and price, can be a complex task. This paper helps the Artificial Neural Network user by providing a large list of available solutions and explaining their characteristics and terms of use. The paper is confined to reporting the software products that have been developed for Artificial Neural Networks. The features considered important for this kind of software to have, in order to accommodate its users effectively, are specified. The development of software that implements Artificial Neural Networks is a rapidly growing field, driven by strong research interests as well as urgent practical, economical and social needs. Copyright CONTROLO2012

Keywords: Artificial Neural Networks, Training Tools, Training Algorithms, Software.

1. INTRODUCTION

Nowadays, in different areas, it is important to analyse nonlinear data to do prediction, classification or to build models. For this purpose, a nonlinear model such as the Artificial Neural Network (ANN) can be used. An ANN is an interconnection of neurons that builds a network, whose inspiration comes from the natural neural networks that compose the brain. The processing capacity of the ANN is stored in the weights of the connections, which are obtained, in the most common case of supervised training, by a learning process that uses examples or training patterns [1].

Each year more ANN tools are being used for data collecting, not only for academic studies, but also to manage decisions in some large and mid-range companies, because this data holds valuable information, e.g., trends and patterns, which can be used to improve business decisions and optimize success. This need for automated extraction of useful knowledge from enormous quantities of data is widely recognized, and it leads to the creation and commercialization of new ANN tools. With the purpose of reporting which tools are available at present and easing the choice of which tool to use, this paper contains the description of software that has been developed specifically for ANNs, independently of the language used. It is important to note that some tools were created and used for academic studies, while others are sophisticated commercial tools that help mainstream business users take decisions about their companies.

The paper does not cover models such as spiking neurons and other ANNs that are closely biologically inspired.

2. ARTIFICIAL NEURAL NETWORKS TOOLS

Most of the ANN tools are documented through scientific papers [2-12] or websites [13-58], with enough information for the user to make an informed decision on whether to test and use the proposed tool or to search for another one that fits their needs.
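The supervised learning process described in the introduction, in which the network's knowledge ends up stored in the connection weights adjusted from training patterns, can be illustrated with a minimal sketch in plain Python. This is a generic single-neuron example under assumed data (the logical AND), not code taken from any of the tools surveyed below:

```python
# Minimal supervised training of a single neuron (delta rule).
# The "processing capacity" ends up stored in the connection
# weights, as described for ANNs in general.
import math

def train_neuron(patterns, epochs=2000, lr=0.5):
    """patterns: list of (inputs, target) pairs with targets in [0, 1]."""
    n = len(patterns[0][0])
    w = [0.0] * n          # connection weights
    b = 0.0                # bias weight
    for _ in range(epochs):
        for x, t in patterns:
            s = sum(wi * xi for wi, xi in zip(w, x)) + b
            y = 1.0 / (1.0 + math.exp(-s))     # sigmoid activation
            grad = (t - y) * y * (1.0 - y)     # error times activation slope
            w = [wi + lr * grad * xi for wi, xi in zip(w, x)]
            b += lr * grad
    return w, b

# Hypothetical training patterns: the logical AND function
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_neuron(data)
```

After training, thresholding the neuron's output at 0.5 reproduces the training patterns; the learned behaviour lives entirely in `w` and `b`, which is the property all the tools below exploit.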
2.1 System Requirements

The most common requirement defined by any tool is the operating system: to be used efficiently, each tool needs an appropriate operating system. Table 1 shows the operating system recommended for proper functioning. It also shows that the majority of the tools use Windows as their operating system. Outside the Microsoft universe, the operating system most often used for running an ANN tool is Mac OS. The operating system requirements of some tools are accompanied by minimum hardware and software characteristics. These requirements correspond to the elements that must be installed on the computer for the tool to function properly. It is important to note that these software prerequisites are not included in the tool's installation package and need to be installed separately, before installing the tool itself.

Table 1. Operating system for each tool.

AiNet: Windows (3.1 to NT)
Annie: Windows (XP or later)
Aspirin/Migraines: -
Basis-of-AI-NN: Windows (3.1)
EasyNN: Windows (all versions)
Encog: all versions
FANN: all versions
FastICA: any OS (needs the Matlab platform; tool consists of m-files)
FuNeGen: all versions
Fuzzy ART/Fuzzy ARTMAP: any OS (needs the Matlab platform; tool consists of m-files)
GENESIS 2.3: -
JATTON: all versions
Java NNS: Windows (NT or later); Mac OS (7)
Joone: all versions
Lens: all versions
LM-MLP: Windows (95 or later)
Multiple Backpropagation: -
NNFit: AIX; IRIX (OS 4.1.4)
NNSYSID: any OS (needs the Matlab platform; tool consists of m-files)
NNT: any OS (needs the Matlab platform; tool consists of m-files)
NetLab: any OS (needs the Matlab platform; tool consists of m-files)
NeuroModeler: Windows (95 to XP)
Neural Network Toolbox: any OS (needs the Matlab platform; tool consists of m-files)
NeuroIntelligence: Windows (98 or later)
Neuroph: -
Nenet: -
NeuroSolutions: Windows (XP or later)
Nest: AIX; Mac OS (10.3); Solaris; SGI
NevProp: -
Nico Toolkit: -
Nuclass: Windows (NT or later)
Numap: Windows (NT or later)
PDP++ (also known as Emergent): Windows (95 or later); SGI
Pythia: Windows (95 or later)
SNNS: Ultrix; AIX; IRIX (OS 4; 5)
SOM: any OS (needs the Matlab platform; tool consists of m-files)
SOM_PAK - LVQ_PAK: -
Statistica: Windows (NT or later)
Torch: FreeBSD (*)
Trajan: -
Uts: IRIX (OS 5.3)
WEKA: Mac OS X (10.6)
XNBC: Windows (95 to XP); HP-UX
(*) Torch is not yet Windows compatible; a Windows version is coming soon.

Table 2 shows the various aspects of the software and hardware requirements necessary for each tool. The user should keep in mind that exceeding these requirements by far does not guarantee that everything will run with absolute smoothness and look its best.

Table 2. List of software and hardware requirements (RAM; processor; JRE; free HD space; video resolution; colour depth).

AiNet: 4 MB RAM; 5 MB free HD space
Encog: JRE 1.5 or later
GENESIS 2.3: Intel and AMD 64-bit processor
JATTON: JRE 1.6 or later
Java NNS: JRE 1.3 or later
Joone: 256 MB RAM
LM-MLP: 32 MB RAM; Pentium I or II
Nenet: 16-bit colour
NetLab: 456 KB free HD space
NeuroModeler: 128 MB RAM; processor speed 1 GHz
NeuroIntelligence: 128 MB RAM; Pentium II; 15 MB free HD space; 800x600 resolution; 8-bit colour
Neuroph: JRE 1.6 or later
NeuroSolutions: 100 MB; Pentium IV; 1024x768 resolution
Nico Toolkit: JRE 1.4 or later
Nuclass: 32 MB RAM; 25 MB free HD space
Numap: 32 MB RAM; 25 MB free HD space
Pythia: 32 MB RAM; 1.39 MB free HD space; Pentium
SOM: 64 MB RAM; 1 MB free HD space
Statistica: 1 GB RAM; 172.84 MB free HD space; processor speed 2 GHz
Weka: JRE 1.5 or later
For the remaining tools there is not any specification.

2.2 Network architectures for each tool

Table 3 and Table 4 show the network architectures and the training algorithms for each tool. It can be seen that the most usual networks are the Radial Basis Function, the Multilayer Perceptron and the Self-Organizing Map, and that the most usual training algorithm is Backpropagation.
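Since the Multilayer Perceptron trained with Backpropagation is the combination most of the surveyed tools share, a minimal, self-contained sketch of it may be useful. This is a generic textbook formulation in plain Python with hypothetical XOR data, not code extracted from any listed tool:

```python
# Minimal multilayer perceptron (one hidden layer) trained
# with online backpropagation on the XOR problem.
import math, random

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def train_mlp(patterns, hidden=3, epochs=5000, lr=0.8, seed=1):
    rng = random.Random(seed)
    n_in = len(patterns[0][0])
    # W1: hidden-layer weights (one row per hidden neuron, last entry = bias)
    # W2: output-neuron weights (last entry = bias)
    W1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(hidden)]
    W2 = [rng.uniform(-1, 1) for _ in range(hidden + 1)]
    for _ in range(epochs):
        for x, t in patterns:
            # forward pass
            xa = list(x) + [1.0]
            h = [sigmoid(sum(w * xi for w, xi in zip(row, xa))) for row in W1]
            ha = h + [1.0]
            y = sigmoid(sum(w * hi for w, hi in zip(W2, ha)))
            # backward pass: propagate the error from output to hidden layer
            d_out = (t - y) * y * (1 - y)
            d_hid = [d_out * W2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            # weight updates
            W2 = [w + lr * d_out * hi for w, hi in zip(W2, ha)]
            for j in range(hidden):
                W1[j] = [w + lr * d_hid[j] * xi for w, xi in zip(W1[j], xa)]
    return W1, W2

def run_mlp(W1, W2, x):
    xa = list(x) + [1.0]
    h = [sigmoid(sum(w * xi for w, xi in zip(row, xa))) for row in W1] + [1.0]
    return sigmoid(sum(w * hi for w, hi in zip(W2, h)))

# Hypothetical patterns: XOR, the classic non-linearly-separable case
xor = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
W1, W2 = train_mlp(xor)
```

The hidden layer is what lets this network solve XOR, which no single neuron can; the tools in Table 3 wrap variants of this same procedure behind graphical or scripting interfaces.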
Table 3. Network architecture for each tool (see legend below).

AiNet: N38
Annie: N19; N20; N22; N27; N32; N38; N47
Aspirin/Migraines: N13; N43
Basis-of-AI-NN: N4; N6; N9; N20; N22; N27; N29
EasyNN: N38
Encog: N2; N4; N6; N9; N16; N19; N27; N31; N32; N38; N42; N47; N50
FANN: N20
FastICA: N30
FuNeGen: N12
Fuzzy ART/Fuzzy ARTMAP: N4; N5
GENESIS 2.3: N20; N26
JATTON: N11; N16; N32
Java NNS: N20
Joone: N20; N32; N39; N43; N52
Lens: N9; N20; N22; N29; N32
LM-MLP: N38
Multiple Backpropagation: N20; N38
NNFit: N38
NNSYSID: N20; N38
NNT: N1; N8; N10; N38
NetLab: N20; N24; N35; N38; N47; N50
NeuroModeler: N2; N4; N6; N9; N16; N38; N42
Neural Network Toolbox: N2; N18; N19; N22; N27; N33; N38; N47; N50; N52
NeuroIntelligence: N11; N38
Neuroph: N2; N6; N15; N24; N27; N32; N34; N38; N47
NeuroSolutions: N12; N19; N20; N22; N23; N38; N39; N46; N47; N50; N51
Nest: N4; N28
NevProp: N20; N38
Nico Toolkit: N18; N38; N47
Nuclass: N38; N45; N50
Numap: N38; N45; N50
PDP++ (also known as Emergent): N9; N17; N20; N24; N27; N36; N40; N43; N49; N50
Pythia: N11; N38
SNNS: N3; N4; N6; N16; N19; N22; N27; N33; N44; N50; N52
SOM: N43; N50
SOM_PAK - LVQ_PAK: N33; N50
Statistica: N23; N38; N41; N46; N47; N50
Torch: N14; N38; N40; N47; N52
Trajan: N23; N38; N46; N47; N50
Uts: N37; N50
WEKA: N7; N38; N47
XNBC: N26; N28

Legend:
N1 – Arbitrarily connected neuron;
N2 – Adaline Linear Neuron;
N3 – Autoassociative Memory;
N4 – Adaptive Resonance Theory;
N5 – Adaptive Resonance Theory Mapping;
N6 – Bidirectional Associative Memory;
N7 – BayesNet/Bayesian Network;
N8 – Bipolar network;
N9 – Boltzmann Machine;
N10 – Bridged Multilayer Perceptron;
N11 – Back-Propagation network;
N12 – Coactive Neuro-Fuzzy Inference System;
N13 – Canonical Discriminant Analysis;
N14 – Convolutional Network;
N15 – Competitive neural network;
N16 – Counter-Propagation neural network;
N17 – Deep Boltzmann Machines/Mean-field;
N18 – Dynamic network;
N19 – Elman Network;
N20 – Feed-Forward neural network;
N21 – Functional Link Network;
N22 – Recurrent network;
N23 – Generalized Regression neural network;
N24 – Hebbian Network;
N25 – Hybrid Models;
N26 – Hodgkin & Huxley Neural Network;
N27 – Hopfield Network;
N28 – Integrate and Fire;
N29 – Interactive Activation Network;
N30 – Independent Component Analysis;
N31 – Jordan Recurrent;
N32 – Kohonen networks;
N33 – Learning Vector Quantization nets;
N34 – Maxnet;
N35 – Mixture Density Network;
N36 – Multidimensional Scaling;
N37 – Mixture of Gaussians Networks;
N38 – Multilayer Perceptron;
N39 – Modular neural Network;
N40 – Mixture of Experts;
N41 – Neocognitron Neural Network;
N42 – Neuroevolution of Augmenting Topologies;
N43 – Principal Component Analysis;
N44 – Pruned Cascade-Correlation;
N45 – Piecewise Linear Network;
N46 – Probabilistic Neural Network;
N47 – Radial Basis Function;
N48 – Space Mapping;
N49 – Stochastic Neural Network;
N50 – Self-Organizing Map;
N51 – Support Vector Machine;
N52 – Time-Delay neural Network;

Table 4.