Jun 26, 2020
by Barret Zoph, Quoc V. Le

Neural networks are powerful and flexible models that work well for many difficult learning tasks in image, speech and natural language understanding. Despite their success, neural networks are still hard to design. In this paper, we use a recurrent network to generate the model descriptions of neural networks and train this RNN with reinforcement learning to maximize the expected accuracy of the generated architectures on a validation set. On the CIFAR-10 dataset, our method, starting from...

Topics: Artificial Intelligence, Neural and Evolutionary Computing

Jun 28, 2018
by Jaderick P. Pabico; Elizer A. Albacea

The complex effect of a genetic algorithm's (GA) operators and parameters on its performance has been studied extensively in the past, but no study has examined their interactive effects under different problem sizes. In this paper, we present the use of an experimental model (1) to investigate whether the genetic operators and their parameters interact to affect the offline performance of the GA, and (2) to find what combination of genetic operators and parameter settings will provide...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1508.00097

Jun 30, 2018
by Alexander Hagg

Evolutionary illumination is a recent technique that allows producing many diverse, optimal solutions in a map of manually defined features. To support the large number of objective function evaluations, surrogate model assistance was recently introduced. Illumination models need to represent many more diverse optimal regions than classical surrogate models. In this PhD thesis, we propose to decompose the sample set, decreasing model complexity, by hierarchically segmenting the training set...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1703.09926

Jun 29, 2018
by Andrew Pulver; Siwei Lyu

Previous RNN architectures have largely been superseded by LSTM, or "Long Short-Term Memory". Since its introduction, there have been many variations on this simple design. However, it is still widely used and we are not aware of a gated-RNN architecture that outperforms LSTM in a broad sense while still being as simple and efficient. In this paper we propose a modified LSTM-like architecture. Our architecture is still simple and achieves better performance on the tasks that we tested...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1605.01988
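For reference, the standard LSTM cell that the paper proposes to modify can be sketched in a few lines of NumPy. The gate layout, sizes, and weight scaling below are illustrative choices, not the authors' architecture.

```python
import numpy as np

def lstm_step(x, h, c, W, b):
    """One step of a standard LSTM cell: input (i), forget (f), and
    output (o) gates plus a candidate update (g) for the cell state."""
    n = h.size
    z = W @ np.concatenate([x, h]) + b          # all four gates in one matmul
    i, f, o, g = z[:n], z[n:2*n], z[2*n:3*n], z[3*n:]
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    c_new = sig(f) * c + sig(i) * np.tanh(g)    # gated cell-state update
    h_new = sig(o) * np.tanh(c_new)             # gated output
    return h_new, c_new

rng = np.random.default_rng(0)
nx, nh = 4, 8                                   # toy sizes, chosen arbitrarily
W = rng.standard_normal((4 * nh, nx + nh)) * 0.1
b = np.zeros(4 * nh)
h, c = np.zeros(nh), np.zeros(nh)
for _ in range(5):                              # run a few steps on random input
    h, c = lstm_step(rng.standard_normal(nx), h, c, W, b)
```

Because the hidden state is a sigmoid times a tanh, every component of `h` stays strictly inside (-1, 1), which is part of what makes the design stable.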

Jun 29, 2018
by Leonard Johard; Lukas Breitwieser; Alberto Di Meglio; Marco Manca; Manuel Mazzara; Max Talanov

This paper is a brief update on developments in the BioDynaMo project, a new platform for computer simulations in biological research. We discuss the new capabilities of the simulator, important new concepts in simulation methodology, as well as its numerous applications to the computational biology and nanoscience communities.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1608.01818

Jun 30, 2018
by Paul Szerlip; Kenneth O. Stanley

To address the difficulty of creating online collaborative evolutionary systems, this paper presents a new prototype library called Worldwide Infrastructure for Neuroevolution (WIN) and its accompanying site WIN Online (http://winark.org/). The WIN library is a collection of software packages built on top of Node.js that reduce the complexity of creating fully persistent, online, and interactive (or automated) evolutionary platforms around any domain. WIN Online is the public interface for WIN,...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1407.3000

Jun 30, 2018
by Norbert Michael Mayer

Recurrent networks with transfer functions that fulfill Lipschitz continuity with K=1 may be echo state networks if certain limitations on the recurrent connectivity are applied. It has been shown that it is sufficient if the largest singular value of the recurrent connectivity is smaller than 1. The main achievement of this paper is a proof of the conditions under which the network is an echo state network even if the largest singular value is one. It turns out that in this critical case the...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1411.6757
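The sufficient condition quoted above (largest singular value below one, with a Lipschitz-1 transfer function such as tanh) is easy to verify numerically: rescaling a random recurrent matrix that way makes the state map a contraction, so two different initial states driven by the same input converge. This is a sketch of the standard sub-critical case, not the paper's critical case with singular value exactly one.

```python
import numpy as np

def scale_to_echo_state(W, target=0.9):
    """Rescale recurrent weights so the largest singular value is `target` < 1."""
    sigma_max = np.linalg.svd(W, compute_uv=False)[0]
    return W * (target / sigma_max)

rng = np.random.default_rng(0)
W = scale_to_echo_state(rng.standard_normal((50, 50)))

# tanh has Lipschitz constant K = 1, so x -> tanh(W @ x) contracts by at
# least a factor of 0.9 per step: two different initial states driven
# identically must converge (the echo state property).
x, y = rng.standard_normal(50), rng.standard_normal(50)
gap0 = np.linalg.norm(x - y)
for _ in range(100):
    x, y = np.tanh(W @ x), np.tanh(W @ y)
gap = np.linalg.norm(x - y)
```

After 100 steps the distance between the two state trajectories has shrunk by at least 0.9^100, i.e. the network has forgotten its initial condition.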

Jun 28, 2018
by Subhrajit Roy; Phyo Phyo San; Shaista Hussain; Lee Wang Wei; Arindam Basu

In this paper, a neuron with nonlinear dendrites (NNLD) and binary synapses that is able to learn temporal features of spike input patterns is considered. Since binary synapses are used, learning happens through the formation and elimination of connections between the inputs and the dendritic branches, thereby modifying the structure or "morphology" of the NNLD. A morphological learning algorithm inspired by the Tempotron, a recently proposed temporal learning algorithm, is presented...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1506.05212

Jun 28, 2018
by Nicholas Léonard; Sagar Waghmare; Yang Wang; Jin-Hwa Kim

The rnn package provides components for implementing a wide range of Recurrent Neural Networks. It is built within the framework of the Torch distribution for use with the nn package. The components have evolved through 3 iterations, each adding to the flexibility and capability of the package. All component modules inherit from either the AbstractRecurrent or AbstractSequencer class. Strong unit testing, continued backwards compatibility and access to supporting material are the principles followed...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1511.07889

Jun 28, 2018
by Subhrajit Roy; Arindam Basu

In this article, we propose a novel Winner-Take-All (WTA) architecture employing neurons with nonlinear dendrites and an online unsupervised structural plasticity rule for training it. Further, to aid hardware implementations, our network employs only binary synapses. The proposed learning rule is inspired by spike time dependent plasticity (STDP) but differs for each dendrite based on its activation level. It trains the WTA network through formation and elimination of connections between...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1512.01314

Jun 29, 2018
by Hojjat Salehinejad

The advantage of recurrent neural networks (RNNs) in learning dependencies between time-series data has distinguished RNNs from other deep learning models. Recently, many advances have been proposed in this emerging field. However, the literature lacks a comprehensive review of memory models in RNNs. This paper provides a fundamental review of RNNs and the long short-term memory (LSTM) model, then surveys recent advances in different memory enhancements and learning...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1602.04335

Jun 30, 2018
by Ronald Hochreiter; Christoph Waldhauser

The optimization of dynamic problems is both widespread and difficult. When conducting dynamic optimization, a balance between reinitialization and computational expense has to be found. There are multiple approaches to this. In parallel genetic algorithms, multiple sub-populations concurrently try to optimize a potentially dynamic problem. But as the number of sub-populations increases, their efficiency decreases. Cultural algorithms provide a framework that has the potential to make...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1401.4714

Jun 30, 2018
by Hendrik Richter

Coevolutionary minimal substrates are simple and abstract models that allow studying the relationships and codynamics between objective and subjective fitness. Using these models, an approach is presented for defining and analyzing fitness landscapes of coevolutionary problems. We devise similarity measures of codynamic fitness landscapes and experimentally study minimal substrates of test-based and compositional problems for both cooperative and competitive interaction.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1404.5767

Jun 30, 2018
by Anna V. Kononova; David W. Corne; Philippe De Wilde; Vsevolod Shneer; Fabio Caraffini

Challenging optimisation problems are abundant in all areas of science. Since the 1950s, scientists have developed ever-diversifying families of black box optimisation algorithms designed to address any optimisation problem, requiring only that the quality of a candidate solution can be calculated via a fitness function specific to the problem. For such algorithms to be successful, at least three properties are required: an effective informed sampling strategy that guides the generation of new candidates...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1408.5350

Jun 30, 2018
by Hernan Aguirre; Arnaud Liefooghe; Sébastien Verel; Kiyoshi Tanaka

This work studies the behavior of three elitist multi- and many-objective evolutionary algorithms generating a high-resolution approximation of the Pareto optimal set. Several search-assessment indicators are defined to trace the dynamics of survival selection and measure the ability to simultaneously keep optimal solutions and discover new ones under different population sizes, set as a fraction of the size of the Pareto optimal set.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1409.7478

Jun 28, 2018
by Shayan Poursoltan; Frank Neumann

Different types of evolutionary algorithms have been developed for constrained continuous optimization. We carry out a feature-based analysis of evolved constrained continuous optimization instances to understand the characteristics of constraints that make problems hard for evolutionary algorithms. In our study, we examine how various sets of constraints can influence the behaviour of ε-constrained Differential Evolution. Investigating the evolved instances, we obtain knowledge of what type of...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1506.06848

Jun 28, 2018
by Hojjat Salehinejad; Shahryar Rahnamayan; Hamid R. Tizhoosh

The differential evolution (DE) algorithm suffers from high computational time due to the slow nature of its evaluation. In contrast, micro-DE (MDE) algorithms employ a very small population size, which can converge faster to a reasonable solution. However, these algorithms are vulnerable to premature convergence as well as a high risk of stagnation. In this paper, an MDE algorithm with a vectorized random mutation factor (MDEVM) is proposed, which utilizes the small-population benefit while...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1512.07980

Jun 29, 2018
by Paweł B. Myszkowski; Marek E. Skowroński; Łukasz P. Olech; Krzysztof Oślizło

In this paper, a Hybrid Ant Colony Optimization (HAntCO) approach to the Multi-Skill Resource-Constrained Project Scheduling Problem (MS-RCPSP) is presented. We propose a hybrid approach that links classical heuristic priority rules for project scheduling with Ant Colony Optimization (ACO). Furthermore, a novel approach for updating the pheromone value is proposed, based on both the best and worst solutions stored by the ants. The objective of this paper is to research the usability...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1603.08538

Jun 27, 2018
by Jascha A. Schewtschenko

Kohonen's Self-Organizing Maps (SOMs) have proven to be a successful data-reduction method to identify the intrinsic lower-dimensional sub-manifold of a data set that is scattered in the higher-dimensional feature space. Motivated by the possibly non-Euclidean nature of the feature space and of the intrinsic geometry of the data set, we extend the definition of classic SOMs to obtain the General Riemannian SOM (GRiSOM). We additionally provide an implementation as a proof-of-concept for...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1505.03917
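The classic (Euclidean) Kohonen update that the paper generalizes to Riemannian spaces looks as follows; all hyperparameters (unit count, learning-rate and neighbourhood schedules) are illustrative assumptions.

```python
import numpy as np

def train_som(data, n_units=10, iters=2000, seed=0):
    """Classic SOM with a 1-D chain of units: find the best-matching unit
    (BMU) for a random sample, then pull the BMU and its chain neighbours
    toward the sample with a shrinking neighbourhood."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(0, 1, (n_units, data.shape[1]))
    idx = np.arange(n_units)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(np.linalg.norm(w - x, axis=1))   # best-matching unit
        lr = 0.5 * (1 - t / iters) + 0.01                # decaying learning rate
        radius = 3.0 * (1 - t / iters) + 0.5             # decaying neighbourhood
        h = np.exp(-((idx - bmu) ** 2) / (2 * radius ** 2))
        w += lr * h[:, None] * (x - w)                   # pull units toward x
    return w

data = np.random.default_rng(1).uniform(0, 1, (500, 2))
w = train_som(data)
# mean distance from each sample to its nearest unit (quantization error)
qe = float(np.mean([np.min(np.linalg.norm(w - p, axis=1)) for p in data]))
```

The GRiSOM idea replaces the Euclidean distance and the straight-line update `x - w` with geodesic distance and movement along geodesics of the underlying manifold.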

Jun 28, 2018
by Jack Kelly; William Knottenbelt

Energy disaggregation estimates appliance-by-appliance electricity consumption from a single meter that measures the whole home's electricity demand. Recently, deep neural networks have driven remarkable improvements in classification performance in neighbouring machine learning fields such as image classification and automatic speech recognition. In this paper, we adapt three deep neural network architectures to energy disaggregation: 1) a form of recurrent neural network called `long...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1507.06594

Jun 28, 2018
by David Howard; Larry Bull; Pier-Luca Lanzi

Learning Classifier Systems (LCS) are population-based reinforcement learners that were originally designed to model various cognitive phenomena. This paper presents an explicitly cognitive LCS by using spiking neural networks as classifiers, providing each classifier with a measure of temporal dynamism. We employ a constructivist model of growth of both neurons and synaptic connections, which permits a Genetic Algorithm (GA) to automatically evolve sufficiently-complex neural structures. The...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1508.07700

Jun 28, 2018
by K. Eswaran; Vishwajeet Singh

In this paper we introduce a new method that employs the concept of "Orientation Vectors" to train a feed-forward neural network, suitable for problems where large dimensions are involved and the clusters are characteristically sparse. The new method is not NP-hard as the problem size increases. We `derive' the method by starting from Kolmogorov's method and then relaxing some of the stringent conditions. We show that for most classification problems three layers are sufficient and the...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1509.05177

Jun 30, 2018
by Alex Graves; Greg Wayne; Ivo Danihelka

We extend the capabilities of neural networks by coupling them to external memory resources, which they can interact with by attentional processes. The combined system is analogous to a Turing Machine or Von Neumann architecture but is differentiable end-to-end, allowing it to be efficiently trained with gradient descent. Preliminary results demonstrate that Neural Turing Machines can infer simple algorithms such as copying, sorting, and associative recall from input and output examples.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1410.5401
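The "attentional processes" by which the controller interacts with external memory include content-based addressing: cosine similarity between a key and each memory row, sharpened and normalized into a differentiable weighting. A minimal sketch of that read mechanism (memory size and sharpness `beta` are illustrative assumptions):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def content_read(memory, key, beta=10.0):
    """Content-based read: cosine similarity of `key` against every memory
    row, sharpened by `beta`, softmax-normalized into attention weights,
    then a weighted sum of the rows. Fully differentiable."""
    sims = memory @ key / (np.linalg.norm(memory, axis=1)
                           * np.linalg.norm(key) + 1e-9)
    w = softmax(beta * sims)          # attention weights over memory slots
    return w @ memory, w

rng = np.random.default_rng(0)
M = rng.standard_normal((8, 16))                  # 8 memory slots of width 16
key = M[3] + 0.05 * rng.standard_normal(16)       # noisy query for slot 3
r, w = content_read(M, key)
```

Because every step is smooth, gradients flow through the read weights, which is what lets the whole memory-augmented system be trained end-to-end with gradient descent.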

Jun 26, 2018
by Rama Garimella; Berkay Kicanaoglu; Moncef Gabbouj

In this paper, a novel real/complex-valued recurrent Hopfield Neural Network (RHNN) is proposed. The method of synthesizing the energy landscape of such a network and an experimental investigation of the dynamics of the recurrent Hopfield network are discussed. Parallel modes of operation (other than the fully parallel mode) in layered RHNNs are proposed, along with certain potential applications.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1502.02444

Jun 26, 2018
by Tobias Friedrich; Timo Kötzing; Martin Krejca; Andrew M. Sutton

The benefit of sexual recombination is one of the most fundamental questions both in population genetics and evolutionary computation. It is widely believed that recombination helps solving difficult optimization problems. We present the first result, which rigorously proves that it is beneficial to use sexual recombination in an uncertain environment with a noisy fitness function. For this, we model sexual recombination with a simple estimation of distribution algorithm called the Compact...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1502.02793

Jun 26, 2018
by Jun He; Yong Wang; Yuren Zhou

Multi-objective optimisation is regarded as one of the most promising ways for dealing with constrained optimisation problems in evolutionary optimisation. This paper presents a theoretical investigation of a multi-objective optimisation evolutionary algorithm for solving the 0-1 knapsack problem. Two initialisation methods are considered in the algorithm: local search initialisation and greedy search initialisation. Then the solution quality of the algorithm is analysed in terms of the...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1502.03699

Jun 26, 2018
by Alireza Goudarzi; Alireza Shabani; Darko Stefanovic

Supralinear and sublinear pre-synaptic and dendritic integration is considered to be responsible for nonlinear computation power of biological neurons, emphasizing the role of nonlinear integration as opposed to nonlinear output thresholding. How, why, and to what degree the transfer function nonlinearity helps biologically inspired neural network models is not fully understood. Here, we study these questions in the context of echo state networks (ESN). ESN is a simple neural network...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1502.04423

Jun 29, 2018
by Bin Liu; Shi Cheng; Yuhui Shi

In this paper, we are concerned with a branch of evolutionary algorithms termed estimation of distribution algorithms (EDA), which have been successfully used to tackle derivative-free global optimization problems. For existing EDA algorithms, it is common practice to use a Gaussian distribution or a mixture of Gaussian components to represent the statistical properties of the promising solutions found so far. Observing that the Student's t distribution has heavier and longer tails than the Gaussian,...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1608.03757
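A univariate Gaussian EDA, the baseline this paper starts from, can be sketched as below; the paper's proposal would swap the Gaussian sampler for a heavier-tailed Student's t (e.g. via `rng.standard_t`). The test function, population sizes, and the std floor are illustrative assumptions.

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def gaussian_eda(fitness, dim=4, pop=80, elite=20, iters=100, seed=0):
    """Univariate Gaussian EDA sketch: sample a population, select the
    elite fraction, refit mean/std to the elites, repeat. A floor on the
    std is a common guard against premature collapse of the model."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.full(dim, 3.0), np.full(dim, 2.0)
    for _ in range(iters):
        X = rng.normal(mu, sigma, (pop, dim))
        f = np.array([fitness(x) for x in X])
        elites = X[np.argsort(f)[:elite]]          # promising solutions
        mu = elites.mean(0)
        sigma = np.maximum(elites.std(0), 0.02)    # std floor
    return mu

m = gaussian_eda(sphere)
```

The heavier tails of a Student's t sampler would make large exploratory jumps more likely at each iteration, which is the motivation the abstract gives for the replacement.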

Jun 29, 2018
by Soheil Hashemi; Nicholas Anthony; Hokchhay Tann; R. Iris Bahar; Sherief Reda

Deep neural networks are gaining in popularity as they are used to generate state-of-the-art results for a variety of computer vision and machine learning applications. At the same time, these networks have grown in depth and complexity in order to solve harder problems. Given the limitations in power budgets dedicated to these networks, the importance of low-power, low-memory solutions has been stressed in recent years. While a large number of dedicated hardware using different precisions has...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1612.03940

Jun 30, 2018
by Ye Tian; Ran Cheng; Xingyi Zhang; Yaochu Jin

Over the last three decades, a large number of evolutionary algorithms have been developed for solving multiobjective optimization problems. However, there lacks an up-to-date and comprehensive software platform for researchers to properly benchmark existing algorithms and for practitioners to apply selected algorithms to solve their real-world problems. The demand of such a common tool becomes even more urgent, when the source code of many proposed algorithms has not been made publicly...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1701.00879

Jun 30, 2018
by Sadique Sheik; Somnath Paul; Charles Augustine; Gert Cauwenberghs

Several learning rules for synaptic plasticity that depend on either spike timing or internal state variables have been proposed in the past, imparting varying computational capabilities to Spiking Neural Networks. Due to design complications, these learning rules are typically not implemented on neuromorphic devices, leaving the devices capable only of inference. In this work we propose a unidirectional post-synaptic potential dependent learning rule that is only triggered by pre-synaptic...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1701.01495

Jun 30, 2018
by Elliot Meyerson; Risto Miikkulainen

Behavior domination is proposed as a tool for understanding and harnessing the power of evolutionary systems to discover and exploit useful stepping stones. Novelty search has shown promise in overcoming deception by collecting diverse stepping stones, and several algorithms have been proposed that combine novelty with a more traditional fitness measure to refocus search and help novelty search scale to more complex domains. However, combinations of novelty and fitness do not necessarily...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1704.05554

Jun 29, 2018
by Joachim Ott; Zhouhan Lin; Ying Zhang; Shih-Chii Liu; Yoshua Bengio

Recurrent Neural Networks (RNNs) produce state-of-the-art performance on many machine learning tasks, but their demand on resources in terms of memory and computational power is often high. Therefore, there is great interest in optimizing the computations performed with these models, especially when considering the development of specialized low-power hardware for deep networks. One way of reducing the computational needs is to limit the numerical precision of the network weights and biases. This has...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1608.06902
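Limiting numerical precision usually means mapping real-valued weights onto a small discrete set. A minimal fixed-point sketch (the bit widths and layout are illustrative assumptions, not the paper's scheme, which targets even more aggressive binarization/ternarization):

```python
import numpy as np

def quantize_fixed_point(w, bits=8, frac=5):
    """Round weights to the nearest multiple of 2**-frac and clip to the
    representable range of a signed `bits`-bit fixed-point word."""
    step = 2.0 ** -frac
    lo = -(2 ** (bits - 1)) * step           # most negative code
    hi = (2 ** (bits - 1) - 1) * step        # most positive code
    return np.clip(np.round(w / step) * step, lo, hi)

rng = np.random.default_rng(0)
w = rng.standard_normal(1000) * 0.5          # toy weight vector
wq = quantize_fixed_point(w)
max_err = float(np.max(np.abs(wq - w)))      # bounded by step/2 if nothing clips
```

Each stored weight then needs only 8 bits instead of 32 or 64, and the rounding error per weight is bounded by half the quantization step.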

Jun 29, 2018
by Anton Eremeev

The paper is devoted to upper bounds on the expected first hitting times of the sets of local or global optima for non-elitist genetic algorithms with very high selection pressure. The results of this paper extend the range of situations where the upper bounds on the expected runtime are known for genetic algorithms and apply, in particular, to the Canonical Genetic Algorithm. The obtained bounds do not require the probability of fitness-decreasing mutation to be bounded by a constant less than...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1606.05784

Jun 29, 2018
by Farkhondeh Kiaee; Christian Gagné; Mahdieh Abbasi

The storage and computation requirements of Convolutional Neural Networks (CNNs) can be prohibitive for exploiting these models over low-power or embedded devices. This paper reduces the computational complexity of the CNNs by minimizing an objective function, including the recognition loss that is augmented with a sparsity-promoting penalty term. The sparsity structure of the network is identified using the Alternating Direction Method of Multipliers (ADMM), which is widely used in large...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1611.01590

Jun 26, 2018
by James J. Q. Yu; Albert Y. S. Lam; Victor O. K. Li

The set covering problem (SCP) is one of the representative combinatorial optimization problems, having many practical applications. This paper investigates the development of an algorithm to solve SCP by employing chemical reaction optimization (CRO), a general-purpose metaheuristic. It is tested on a wide range of benchmark instances of SCP. The simulation results indicate that this algorithm gives outstanding performance compared with other heuristics and metaheuristics in solving SCP.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1502.00199

Jun 28, 2018
by Jayanta Basak

Stochastic optimization is an important task in many optimization problems where the tasks are not expressible as convex optimization problems. For non-convex optimization problems, various stochastic algorithms such as simulated annealing, evolutionary algorithms, and tabu search are available. Most of these algorithms require user-defined parameters specific to the problem in order to find the optimal solution. Moreover, in many situations, iterative fine-tunings are...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1506.08004

Jun 26, 2018
by Alireza Goudarzi; Alireza Shabani; Darko Stefanovic

Echo state networks (ESN), a type of reservoir computing (RC) architecture, are efficient and accurate artificial neural systems for time series processing and learning. An ESN consists of a core of recurrent neural networks, called a reservoir, with a small number of tunable parameters to generate a high-dimensional representation of an input, and a readout layer which is easily trained using regression to produce a desired output from the reservoir states. Certain computational tasks involve...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1502.00718
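The ESN pipeline described above, a fixed random reservoir plus a regression-trained readout, fits in a short script. Here the task, reservoir size, and scaling constants are illustrative assumptions; the readout is asked to recall the input from a few steps back, a standard short-term-memory task.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, T, delay, washout = 100, 2000, 3, 50

# Fixed random reservoir, rescaled to spectral radius 0.9 (echo state regime)
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, n_res)             # input weights, also fixed

u = rng.uniform(-1, 1, T)                        # random input signal
X = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])             # reservoir state update
    X[t] = x

# Only the readout is trained: ridge regression from states to the target,
# here the input `delay` steps in the past (after discarding a washout).
Xs, y = X[washout:], u[washout - delay:T - delay]
w_out = np.linalg.solve(Xs.T @ Xs + 1e-6 * np.eye(n_res), Xs.T @ y)
mse = float(np.mean((Xs @ w_out - y) ** 2))
```

Training touches only `w_out`, a single linear solve, which is why ESNs are described as efficient: the recurrent weights are never adapted.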

Jun 27, 2018
by Julien Chevallier; Maria J. Caceres; Marie Doumic; Patricia Reynaud-Bouret

Spike trains are the main components of information processing in the brain. To model spike trains, several point processes have been investigated in the literature, and more macroscopic approaches using partial differential equation models have also been studied. The main aim of the present article is to build a bridge between several point-process models (Poisson, Wold, Hawkes) that have been proved to statistically fit real spike-train data and age-structured partial differential...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1506.02361

Jun 27, 2018
by Piotr Szwed; Wojciech Chmiel

This paper presents a multi-swarm PSO algorithm for the Quadratic Assignment Problem (QAP) implemented on the OpenCL platform. Our work was motivated by results of time-efficiency tests performed for a single-swarm algorithm implementation, which showed clearly that the benefits of a parallel execution platform can be fully exploited if the processed population is large. The described algorithm can be executed in two modes: with independent swarms or with migration. We discuss the algorithm...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1504.05158
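For readers unfamiliar with the underlying method, a minimal single-swarm PSO on a continuous toy function illustrates the velocity/position update; the paper's algorithm is a multi-swarm, permutation-encoded variant for QAP running on OpenCL, so everything below (test function, constriction-style coefficients) is an illustrative assumption.

```python
import numpy as np

def sphere(x):
    return np.sum(x ** 2, axis=1)

def pso(dim=5, particles=30, iters=200, w=0.729, c1=1.49445, c2=1.49445, seed=0):
    """Canonical PSO: each particle is pulled toward its personal best and
    the swarm's global best, with inertia weight w damping the velocity."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), sphere(x)
    g = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random((2, particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        f = sphere(x)
        better = f < pbest_f                      # update personal bests
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)]             # update global best
    return g, float(pbest_f.min())

gbest, gbest_f = pso()
```

In the multi-swarm setting of the paper, several such swarms run concurrently (one work-group per swarm fits GPU execution naturally), optionally exchanging their best particles via migration.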

Jun 28, 2018
by James J. Q. Yu; Victor O. K. Li

Social Spider Algorithm (SSA) is a recently proposed general-purpose real-parameter metaheuristic designed to solve global numerical optimization problems. This work systematically benchmarks SSA on a suite of 11 functions with different control parameters. We conduct parameter sensitivity analysis of SSA using advanced non-parametric statistical tests to generate statistically significant conclusions on the best-performing parameter settings. The conclusions can be adopted in future work to...

Topics: Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1507.02491

Jun 29, 2018
by Arkady Rost; Irina Petrova; Arina Buzdalova

Online parameter controllers for evolutionary algorithms adjust values of parameters during the run of an evolutionary algorithm. Recently a new efficient parameter controller based on reinforcement learning was proposed by Karafotias et al. In this method, ranges of parameters are discretized into several intervals before the run. However, performing adaptive discretization during the run may increase the efficiency of an evolutionary algorithm. Aleti et al. proposed another efficient controller...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1603.06788

Jun 29, 2018
by Dimo Brockhoff; Tea Tušar; Dejan Tušar; Tobias Wagner; Nikolaus Hansen; Anne Auger

This document details the rationale behind assessing the performance of numerical black-box optimizers on multi-objective problems within the COCO platform, in particular on the biobjective test suite bbob-biobj. The evaluation is based on the hypervolume of all non-dominated solutions in the archive of candidate solutions and measures the runtime until the hypervolume value exceeds prescribed target values.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1605.01746
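For intuition, the hypervolume of a biobjective archive under minimization is the area it dominates relative to a reference point. A minimal sketch (not the COCO implementation; it assumes all points dominate the reference point):

```python
def hypervolume_2d(points, ref):
    """Area dominated by `points` w.r.t. reference point `ref`, under
    minimization of both objectives."""
    # Extract the non-dominated front, sorted by the first objective
    front, best_f2 = [], float("inf")
    for f1, f2 in sorted(points):
        if f2 < best_f2:
            front.append((f1, f2))
            best_f2 = f2
    # Sum the rectangular slabs between consecutive front points
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

# Three non-dominated points plus one dominated point (2.5, 2.5)
hv = hypervolume_2d([(1, 3), (2, 2), (3, 1), (2.5, 2.5)], ref=(4, 4))
```

Dominated points contribute nothing, so growing the archive can only keep the indicator constant or increase it, which is what makes runtime-to-target-hypervolume a well-defined performance measure.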

Jun 29, 2018
by Antonio Jimeno Yepes; Jianbin Tang

Deep Neural Networks (DNN) have achieved human-level performance in many image analytics tasks, but DNNs are mostly deployed to GPU platforms that consume a considerable amount of power. Brain-inspired spiking neuromorphic chips consume low power and can be highly parallelized. However, for deploying DNNs to energy-efficient neuromorphic chips, the incompatibility between the continuous neurons and synaptic weights of traditional DNNs and the discrete spiking neurons and synapses of neuromorphic chips has...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1605.07740
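
The continuous-vs-discrete incompatibility mentioned above is commonly bridged by rate coding: a non-leaky integrate-and-fire neuron with reset-by-subtraction fires at a rate that approximates a ReLU of its input drive. A minimal illustrative sketch of that argument (not the paper's conversion pipeline):

```python
def if_neuron_rate(drive, n_steps=1000, threshold=1.0):
    """Simulate a non-leaky integrate-and-fire neuron with constant
    input `drive` and return its firing rate (spikes per step).

    A minimal sketch of the rate-coding argument behind DNN-to-spiking
    conversion: with reset-by-subtraction, the firing rate approximates
    ReLU(drive) / threshold.
    """
    v, spikes = 0.0, 0
    for _ in range(n_steps):
        v += drive                  # integrate the constant input
        if v >= threshold:
            v -= threshold          # reset by subtraction, not to zero
            spikes += 1
    return spikes / n_steps
```

Because the rate saturates at one spike per step, conversion schemes typically also rescale (normalize) weights so that drives stay below the threshold.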

Jun 29, 2018

by
Amirhossein Tavanaei; Anthony S Maida

It is of some interest to understand how statistically based mechanisms for signal processing might be integrated with biologically motivated mechanisms such as neural networks. This paper explores a novel hybrid approach for classifying segments of sequential data, such as individual spoken words. The approach combines a hidden Markov model (HMM) with a spiking neural network (SNN). The HMM, consisting of states and transitions, forms a fixed backbone with nonadaptive transition probabilities....

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1606.00825
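
The fixed HMM backbone can be made concrete with the standard forward algorithm, which scores an observation sequence under nonadaptive transition probabilities; in the hybrid model described above, the emission side would be driven by SNN outputs rather than a fixed table. A minimal sketch with discrete observations (all names are illustrative):

```python
def forward(obs, init, trans, emit):
    """Forward-algorithm likelihood of a discrete observation sequence
    under an HMM with fixed (nonadaptive) transition probabilities.

    init[i]: initial probability of state i
    trans[j][i]: probability of moving from state j to state i
    emit[i][o]: probability of state i emitting symbol o
    """
    # alpha[i]: probability of the observed prefix, ending in state i
    alpha = [init[i] * emit[i][obs[0]] for i in range(len(init))]
    for o in obs[1:]:
        alpha = [sum(alpha[j] * trans[j][i] for j in range(len(alpha))) * emit[i][o]
                 for i in range(len(alpha))]
    return sum(alpha)
```

For word classification, one such model per word class would be scored and the highest-likelihood class chosen.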

Jun 28, 2018

by
Tao Xu; Jun He

Solving constrained optimization problems with multi-objective evolutionary algorithms has seen tremendous achievements in the last decade. Standard multi-objective schemes usually aim to minimize the objective function and the degree of constraint violation simultaneously. This paper proposes a new multi-objective method for solving constrained optimization problems. The new method keeps two standard objectives: the original objective function and the sum of degrees of constraint...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1509.09060
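
The two standard objectives that the abstract starts from can be written down directly: the original objective and the summed constraint violation, compared by Pareto dominance. A minimal sketch assuming inequality constraints of the form g_i(x) <= 0 (function names are illustrative, not from the paper):

```python
def bi_objectives(x, f, constraints):
    """Map a constrained problem to the two standard objectives:
    the original objective f(x) and the total constraint violation.

    Constraints are given as callables g with feasibility g(x) <= 0,
    so a feasible point has violation 0.
    """
    violation = sum(max(0.0, g(x)) for g in constraints)
    return f(x), violation

def dominates(a, b):
    """Pareto dominance for minimization of both objectives."""
    return all(u <= v for u, v in zip(a, b)) and any(u < v for u, v in zip(a, b))
```

A multi-objective EA then evolves a population ranked by this dominance relation instead of a penalized scalar fitness.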

Jun 28, 2018

by
Duc-Cuong Dang; Anton V. Eremeev; Per Kristian Lehre

The paper is devoted to upper bounds on the run-time of Non-Elitist Genetic Algorithms until some target subset of solutions is visited for the first time. In particular, we consider the sets of optimal solutions and the sets of local optima as the target subsets. Previously known upper bounds are improved by means of drift analysis. Finally, we propose conditions ensuring that a Non-Elitist Genetic Algorithm efficiently finds approximate solutions with a constant approximation ratio on the class of...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1512.02047
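
The drift-analysis bounds mentioned above follow a common pattern: bound the expected one-generation decrease of a distance to the target set, then convert that into an expected first-hitting time. A minimal numeric sketch of the classic additive and multiplicative drift theorems (the paper's refined bounds are more involved):

```python
import math

def additive_drift_bound(x0, delta):
    """Additive drift theorem: if a nonnegative distance X_t to the
    target decreases in expectation by at least delta per generation,
    then E[T] <= x0 / delta, where x0 is the initial distance.
    """
    if delta <= 0:
        raise ValueError("drift must be positive")
    return x0 / delta

def multiplicative_drift_bound(x0, xmin, delta):
    """Multiplicative drift theorem: if E[X_t - X_{t+1} | X_t] >=
    delta * X_t, then E[T] <= (1 + ln(x0 / xmin)) / delta, where xmin
    is the smallest positive distance value.
    """
    if delta <= 0:
        raise ValueError("drift must be positive")
    return (1 + math.log(x0 / xmin)) / delta
```

For non-elitist algorithms the technical work lies in establishing such a drift at all, since the best solution can be lost between generations.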

Jun 28, 2018

by
Tim Taylor; Alan Dorin; Kevin Korb

The application of evolution in the digital realm, with the goal of creating artificial intelligence and artificial life, has a history as long as that of the digital computer itself. We illustrate the intertwined history of these ideas, starting with the early theoretical work of John von Neumann and the pioneering experimental work of Nils Aall Barricelli. We argue that evolutionary thinking and artificial life will continue to play an integral role in the future development of the digital...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1512.02100

Jun 29, 2018

by
Michael Bukatin; Steve Matthews; Andrey Radul

Dataflow matrix machines are a powerful generalization of recurrent neural networks. They work with multiple types of arbitrary linear streams and multiple types of powerful neurons, and allow higher-order constructions to be incorporated. We expect them to be useful in machine learning and probabilistic programming, and in the synthesis of dynamic systems and of deterministic and probabilistic programs.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1603.09002
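
The generalization over RNNs shows up already in a single two-phase update: a "down" phase applies the weight matrix to the current output streams, and an "up" phase lets each neuron apply its own, possibly different, function. A toy sketch with scalar streams (higher-order features, where streams can carry the weight matrix itself, are omitted; all names are illustrative):

```python
def dmm_step(weights, neurons, outputs):
    """One two-phase update of a toy dataflow matrix machine.

    down phase: each neuron input is a linear combination of the
                current output streams (weights[i][j] scales stream j);
    up phase:   each neuron applies its own function, which need not
                be the same activation across the network.
    """
    n = len(neurons)
    inputs = [sum(weights[i][j] * outputs[j] for j in range(n)) for i in range(n)]
    return [neurons[i](inputs[i]) for i in range(n)]
```

With all neurons set to the same sigmoid, this reduces to an ordinary recurrent network step, which is the sense in which dataflow matrix machines generalize RNNs.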

Jun 28, 2018

by
Jonathan Binas; Giacomo Indiveri; Michael Pfeiffer

Solving constraint satisfaction problems (CSPs) is a notoriously expensive computational task. Recently, it has been proposed that efficient stochastic solvers can be obtained through appropriately configured spiking neural networks performing Markov Chain Monte Carlo (MCMC) sampling. The possibility of running such models on massively parallel, low-power neuromorphic hardware holds great promise; however, previously proposed networks are based on probabilistically spiking neurons, and thus rely on...

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1511.00540
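
The MCMC-style search idea can be illustrated in plain software with a Gibbs-like sampler for graph coloring, a simple CSP: one variable at a time is resampled, preferring values that create fewer conflicts. This sketch does not model the deterministic spiking neurons that are the paper's contribution; all names and the temperature constant are illustrative.

```python
import math
import random

def mcmc_color(edges, n_vars, n_colors, steps, seed=0):
    """Stochastic local search for graph coloring by resampling one
    variable at a time in proportion to how few conflicts each value
    would create (a softmax over negative conflict counts).
    """
    rng = random.Random(seed)
    colors = [rng.randrange(n_colors) for _ in range(n_vars)]

    def conflicts(var, c):
        # number of neighbors of `var` that already use color c
        return sum(1 for a, b in edges
                   if (a == var and colors[b] == c) or (b == var and colors[a] == c))

    for _ in range(steps):
        var = rng.randrange(n_vars)
        weights = [math.exp(-2.0 * conflicts(var, c)) for c in range(n_colors)]
        colors[var] = rng.choices(range(n_colors), weights=weights)[0]
    return colors
```

The hardware argument in the abstract is that each such resampling step can be carried out by local, parallel spiking dynamics instead of a sequential loop.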