Jun 28, 2018
by Tim Dettmers
The creation of practical deep learning data-products often requires parallelization across processors and computers to make deep learning feasible on large data sets, but bottlenecks in communication bandwidth make it difficult to attain good speedups through parallelism. Here we develop and test 8-bit approximation algorithms which make better use of the available bandwidth by compressing 32-bit gradients and nonlinear activations to 8-bit approximations. We show that these approximations do...
Topics: Neural and Evolutionary Computing, Learning, Computing Research Repository
Source: http://arxiv.org/abs/1511.04561
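The excerpt above does not give the paper's exact 8-bit approximation scheme, so the following is only a minimal sketch, under assumed max-abs scaling, of generic linear 8-bit quantization and dequantization of a float32 gradient tensor. It illustrates the kind of bandwidth saving the abstract refers to, not the paper's method.

    import numpy as np

    def quantize_8bit(grad):
        # Map a float32 tensor linearly onto int8 codes in [-127, 127].
        scale = float(np.max(np.abs(grad)))
        if scale == 0.0:
            scale = 1.0
        codes = np.round(grad / scale * 127.0).astype(np.int8)
        return codes, scale

    def dequantize_8bit(codes, scale):
        # Recover an approximate float32 tensor from the 8-bit codes.
        return codes.astype(np.float32) * (scale / 127.0)

    grad = np.random.randn(1024).astype(np.float32)
    codes, scale = quantize_8bit(grad)       # 4x less data to communicate
    approx = dequantize_8bit(codes, scale)
    print(np.max(np.abs(grad - approx)))     # small approximation error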
Jun 29, 2018
by Dan Hendrycks; Kevin Gimpel
We consider the two related problems of detecting if an example is misclassified or out-of-distribution. We present a simple baseline that utilizes probabilities from softmax distributions. Correctly classified examples tend to have greater maximum softmax probabilities than erroneously classified and out-of-distribution examples, allowing for their detection. We assess performance by defining several tasks in computer vision, natural language processing, and automatic speech recognition,...
Topics: Computer Vision and Pattern Recognition, Neural and Evolutionary Computing, Computing Research...
Source: http://arxiv.org/abs/1610.02136
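The baseline described in the entry above is simple enough to sketch directly: compute the maximum softmax probability per example and flag low-confidence examples as likely misclassified or out-of-distribution. The threshold below is illustrative, not a value from the paper.

    import numpy as np

    def max_softmax_prob(logits):
        # Maximum softmax probability for each row of `logits`.
        z = logits - logits.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
        return p.max(axis=1)

    logits = np.array([[4.0, 0.1, 0.2],    # confident prediction
                       [0.9, 1.0, 1.1]])   # near-uniform, hence suspicious
    scores = max_softmax_prob(logits)
    flagged = scores < 0.5                 # illustrative threshold
    print(scores, flagged)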
Jun 29, 2018
by Fathi M. Salem
We present a model of a basic recurrent neural network (or bRNN) that includes a separate linear term with a slightly "stable" fixed matrix to guarantee bounded solutions and fast dynamic response. We formulate a state space viewpoint and adapt the constrained optimization Lagrange Multiplier (CLM) technique and the vector Calculus of Variations (CoV) to derive the (stochastic) gradient descent. In this process, one avoids the commonly used re-application of the circular chain-rule...
Topics: Machine Learning, Neural and Evolutionary Computing, Computing Research Repository, Statistics
Source: http://arxiv.org/abs/1612.09022
Jun 30, 2018
by Shin-ichi Maeda
Dropout is one of the key techniques for preventing learning from overfitting. It has been explained as a kind of modified L2 regularization. Here, we shed light on dropout from a Bayesian standpoint. The Bayesian interpretation enables us to optimize the dropout rate, which benefits both the learning of the weight parameters and prediction after learning. The experimental results also encourage optimizing the dropout rate.
Topics: Neural and Evolutionary Computing, Machine Learning, Computing Research Repository, Statistics,...
Source: http://arxiv.org/abs/1412.7003
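As a reference point for the entry above, here is a minimal inverted-dropout forward pass; the paper's Bayesian optimization of the dropout rate is not reproduced, and the rate used is just an example.

    import numpy as np

    def dropout_forward(x, rate=0.5, train=True):
        # Inverted dropout: zero units with probability `rate` and rescale
        # the survivors so the expected activation is unchanged.
        if not train or rate == 0.0:
            return x
        mask = (np.random.rand(*x.shape) >= rate).astype(x.dtype)
        return x * mask / (1.0 - rate)

    h = np.ones((2, 8), dtype=np.float32)
    print(dropout_forward(h, rate=0.5))   # about half the units zeroed, the rest scaled to 2.0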
Jun 30, 2018
by Yu Chen; Weicheng Xie; Xiufen Zou
Although real-coded differential evolution (DE) algorithms can perform well on continuous optimization problems (CoOPs), it is still a challenging task to design an efficient binary-coded DE algorithm. Inspired by the learning mechanism of particle swarm optimization (PSO) algorithms, we propose a binary learning differential evolution (BLDE) algorithm that can efficiently locate the global optimal solutions by learning from the last population. Then, we theoretically prove the global...
Topics: Neural and Evolutionary Computing, Computing Research Repository
Source: http://arxiv.org/abs/1401.1124
Jun 28, 2018
by Jean Liénard; Benoît Girard
The basal ganglia nuclei form a complex network of nuclei often assumed to perform selection, yet their individual roles and how they influence each other are still largely unclear. In particular, the ties between the external and internal parts of the globus pallidus are paradoxical, as anatomical data suggest a potent inhibitory projection between them while electrophysiological recordings indicate that they have similar activities. Here we introduce a theoretical study that reconciles both...
Topics: Quantitative Biology, Neurons and Cognition, Neural and Evolutionary Computing, Computing Research...
Source: http://arxiv.org/abs/1512.00035
Jun 30, 2018
by Robert A. Murphy
As the title suggests, we will describe (and justify through the presentation of some of the relevant mathematics) prediction methodologies for sensor measurements. This exposition will mainly be concerned with the mathematics related to modeling the sensor measurements.
Topics: Neural and Evolutionary Computing, Computing Research Repository
Source: http://arxiv.org/abs/1704.00207
Jun 30, 2018
by Francesco Bonanno; Giacomo Capizzi; Grazia Lo Sciuto; Christian Napoli; Giuseppe Pappalardo; Emiliano Tramontana
Surface plasmon polaritons (SPPs) confined along a metal-dielectric interface have attracted considerable interest in the area of ultracompact photonic circuits, photovoltaic devices and other applications due to their strong field confinement and enhancement. This paper investigates a novel cascade neural network (NN) architecture to find the dependence of metal thickness on the SPP propagation. Additionally, a novel training procedure for the proposed cascade NN has been developed using an...
Topics: Distributed, Parallel, and Cluster Computing, Neural and Evolutionary Computing, Computing Research...
Source: http://arxiv.org/abs/1406.3149
Jun 29, 2018
by Alexandre de Brébisson; Pascal Vincent
The softmax content-based attention mechanism has proven to be very beneficial in many applications of recurrent neural networks. Nevertheless it suffers from two major computational limitations. First, its computations for an attention lookup scale linearly in the size of the attended sequence. Second, it does not encode the sequence into a fixed-size representation but instead requires memorizing all the hidden states. These two limitations restrict the use of the softmax attention mechanism...
Topics: Information Retrieval, Machine Learning, Statistics, Learning, Neural and Evolutionary Computing,...
Source: http://arxiv.org/abs/1609.05866
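To make the two limitations named above concrete, here is a plain softmax content-based attention lookup: every hidden state must be kept, and each lookup costs O(T·d) for a sequence of length T. This is a generic sketch, not the fixed-size alternative the paper proposes.

    import numpy as np

    def softmax_attention(query, states):
        # `states` has shape (T, d): all hidden states are memorized, and the
        # dot products plus the weighted sum cost O(T * d) per lookup.
        scores = states @ query
        scores -= scores.max()                        # numerical stability
        weights = np.exp(scores) / np.exp(scores).sum()
        return weights @ states                       # (d,) context vector

    T, d = 50, 8
    states = np.random.randn(T, d)
    query = np.random.randn(d)
    print(softmax_attention(query, states).shape)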
Jun 30, 2018
by Albert H. R. Ko; Robert Sabourin; Alceu S. Britto; Luiz E. S. Oliveira
The Ensemble of Classifiers (EoC) has been shown to be effective in improving the performance of single classifiers by combining their outputs, and one of the most important properties involved in the selection of the best EoC from a pool of classifiers is considered to be classifier diversity. In general, classifier diversity does not occur randomly, but is generated systematically by various ensemble creation methods. By using diverse data subsets to train classifiers, these methods can...
Topics: Neural and Evolutionary Computing, Computing Research Repository, Learning
Source: http://arxiv.org/abs/1408.2889
Jun 30, 2018
by Jan Koutník; Klaus Greff; Faustino Gomez; Jürgen Schmidhuber
Sequence prediction and classification are ubiquitous and challenging problems in machine learning that can require identifying complex dependencies between temporally distant inputs. Recurrent Neural Networks (RNNs) have the ability, in theory, to cope with these temporal dependencies by virtue of the short-term memory implemented by their recurrent (feedback) connections. However, in practice they are difficult to train successfully when long-term memory is required. This paper introduces...
Topics: Neural and Evolutionary Computing, Computing Research Repository, Learning
Source: http://arxiv.org/abs/1402.3511
Jun 28, 2018
by David Howard; Larry Bull; Pier-Luca Lanzi
Learning Classifier Systems (LCS) are population-based reinforcement learners that were originally designed to model various cognitive phenomena. This paper presents an explicitly cognitive LCS by using spiking neural networks as classifiers, providing each classifier with a measure of temporal dynamism. We employ a constructivist model of growth of both neurons and synaptic connections, which permits a Genetic Algorithm (GA) to automatically evolve sufficiently-complex neural structures. The...
Topics: Computing Research Repository, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1508.07700
Jun 27, 2018
by Dhagash Mehta; Crina Grosan
Function optimization and finding simultaneous solutions of a system of nonlinear equations (SNE) are two closely related and important optimization problems. However, unlike in the case of function optimization in which one is required to find the global minimum and sometimes local minima, a database of challenging SNEs where one is required to find stationary points (extrema and saddle points) is not readily available. In this article, we initiate building such a database of important SNE...
Topics: Mathematical Software, Neural and Evolutionary Computing, Numerical Analysis, Optimization and...
Source: http://arxiv.org/abs/1504.02366
Jun 30, 2018
by Chunpeng Wu; Wei Wen; Tariq Afzal; Yongmei Zhang; Yiran Chen; Hai Li
Recently, DNN model compression based on network architecture design, e.g., SqueezeNet, has attracted a lot of attention. No accuracy drop on image classification is observed for these extremely compact networks compared to well-known models. An emerging question, however, is whether these model compression techniques hurt a DNN's learning ability beyond classifying images on a single dataset. Our preliminary experiment shows that these compression methods could degrade domain adaptation (DA)...
Topics: Neural and Evolutionary Computing, Artificial Intelligence, Computing Research Repository, Computer...
Source: http://arxiv.org/abs/1703.04071
Jun 30, 2018
by Alireza Goudarzi; Peter Banda; Matthew R. Lakin; Christof Teuscher; Darko Stefanovic
Reservoir computing (RC) is a novel approach to time series prediction using recurrent neural networks. In RC, an input signal perturbs the intrinsic dynamics of a medium called a reservoir. A readout layer is then trained to reconstruct a target output from the reservoir's state. The multitude of RC architectures and evaluation metrics poses a challenge to both practitioners and theorists who study the task-solving performance and computational power of RC. In addition, in contrast to...
Topics: Neural and Evolutionary Computing, Computing Research Repository, Learning
Source: http://arxiv.org/abs/1401.2224
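A minimal echo-state-style sketch of the setup described in the entry above: an input signal perturbs a fixed random reservoir, and only a linear readout is trained (here by ridge regression). The reservoir size, spectral radius, and toy task are assumptions for illustration only, not from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 100, 500                                   # reservoir size, sequence length
    W_in = rng.uniform(-0.5, 0.5, (N, 1))             # fixed input weights
    W = rng.uniform(-0.5, 0.5, (N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1

    u = np.sin(np.linspace(0.0, 20.0, T))[:, None]    # input signal
    target = np.roll(u, -1, axis=0)                   # toy task: predict the next value

    x = np.zeros(N)
    states = np.zeros((T, N))
    for t in range(T):                                # drive the reservoir's intrinsic dynamics
        x = np.tanh(W @ x + W_in @ u[t])
        states[t] = x

    ridge = 1e-6                                      # train only the linear readout
    W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ target)
    print(np.mean((states @ W_out - target) ** 2))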
Jun 29, 2018
by Albert Zeyer; Patrick Doetsch; Paul Voigtlaender; Ralf Schlüter; Hermann Ney
We present a comprehensive study of deep bidirectional long short-term memory (LSTM) recurrent neural network (RNN) based acoustic models for automatic speech recognition (ASR). We study the effect of size and depth and train models of up to 8 layers. We investigate the training aspect and study different variants of optimization methods, batching, truncated backpropagation, different regularization techniques such as dropout and $L_2$ regularization, and different gradient clipping variants....
Topics: Learning, Sound, Neural and Evolutionary Computing, Computing Research Repository, Computation and...
Source: http://arxiv.org/abs/1606.06871
Jun 30, 2018
by Ilya Loshchilov
We propose a computationally efficient limited memory Covariance Matrix Adaptation Evolution Strategy for large scale optimization, which we call the LM-CMA-ES. The LM-CMA-ES is a stochastic, derivative-free algorithm for numerical optimization of non-linear, non-convex optimization problems in continuous domain. Inspired by the limited memory BFGS method of Liu and Nocedal (1989), the LM-CMA-ES samples candidate solutions according to a covariance matrix reproduced from $m$ direction vectors...
Topics: Neural and Evolutionary Computing, Computing Research Repository
Source: http://arxiv.org/abs/1404.5520
Jun 30, 2018
by Yaoyuan Zhang; Zhenxu Ye; Yansong Feng; Dongyan Zhao; Rui Yan
Sentence simplification reduces semantic complexity to benefit people with language impairments. Previous simplification studies on the sentence level and word level have achieved promising results but also meet great challenges. For sentence-level studies, sentences after simplification are fluent but sometimes are not really simplified. For word-level studies, words are simplified but also have potential grammar errors due to different usages of words before and after simplification. In this...
Topics: Neural and Evolutionary Computing, Artificial Intelligence, Computing Research Repository,...
Source: http://arxiv.org/abs/1704.02312
Jun 29, 2018
by Luke B. Godfrey; Michael S. Gashler
We present the soft exponential activation function for artificial neural networks that continuously interpolates between logarithmic, linear, and exponential functions. This activation function is simple, differentiable, and parameterized so that it can be trained as the rest of the network is trained. We hypothesize that soft exponential has the potential to improve neural network learning, as it can exactly calculate many natural operations that typical neural networks can only approximate,...
Topics: Neural and Evolutionary Computing, Computing Research Repository
Source: http://arxiv.org/abs/1602.01321
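A small sketch of the soft exponential activation as it is usually stated: logarithmic for negative values of the parameter, the identity at zero, and exponential for positive values. The exact piecewise form below should be checked against the paper and is given here as an assumption.

    import numpy as np

    def soft_exponential(x, alpha):
        # Piecewise in alpha: logarithmic (alpha < 0), identity (alpha = 0),
        # exponential (alpha > 0); alpha can be trained like any other weight.
        if alpha < 0.0:
            return -np.log(1.0 - alpha * (x + alpha)) / alpha
        if alpha == 0.0:
            return x
        return (np.exp(alpha * x) - 1.0) / alpha + alpha

    x = np.linspace(-1.0, 1.0, 5)
    for a in (-0.5, 0.0, 0.5):
        print(a, soft_exponential(x, a))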
Jun 27, 2018
by Shaoqiu Zheng; Junzhi Li; Andreas Janecek; Ying Tan
This paper presents a cooperative framework for the fireworks algorithm (CoFFWA). A detailed analysis of the existing fireworks algorithm (FWA) and its recently developed variants has revealed that (i) the selection strategy leads to a situation in which the contribution of the firework with the best fitness (the core firework) to the optimization overwhelms the contributions of the remaining fireworks (non-core fireworks) in the explosion operator, and (ii) the Gaussian mutation operator is not as effective as it was designed to be....
Topics: Computing Research Repository, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1505.00075
Jun 30, 2018
by H. Sebastian Seung; Jonathan Zung
Much has been learned about plasticity of biological synapses from empirical studies. Hebbian plasticity is driven by correlated activity of presynaptic and postsynaptic neurons. Synapses that converge onto the same neuron often behave as if they compete for a fixed resource; some survive the competition while others are eliminated. To provide computational interpretations of these aspects of synaptic plasticity, we formulate unsupervised learning as a zero-sum game between Hebbian excitation...
Topics: Neural and Evolutionary Computing, Neurons and Cognition, Computing Research Repository,...
Source: http://arxiv.org/abs/1704.00646
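A toy illustration of the two ingredients named in the entry above: Hebbian updates driven by correlated presynaptic and postsynaptic activity, and competition among synapses converging on one neuron for a fixed resource (imposed here by renormalizing the weights). This is a generic sketch, not the paper's zero-sum game formulation, and the input statistics are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    w = rng.random(5)
    w /= w.sum()                          # fixed total synaptic "resource"
    eta = 0.1

    for _ in range(200):
        pre = 0.1 * rng.random(5)         # presynaptic activity
        pre[:2] += rng.random()           # synapses 0 and 1 are driven together
        post = float(w @ pre)             # postsynaptic activity
        w += eta * post * pre             # Hebbian: correlated pre * post
        w /= w.sum()                      # competition: renormalize to the fixed resource

    print(np.round(w, 3))                 # the co-active synapses capture most of the resource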
Jun 27, 2018
by Zachary C. Lipton; John Berkowitz; Charles Elkan
Countless learning tasks require dealing with sequential data. Image captioning, speech synthesis, and music generation all require that a model produce outputs that are sequences. In other domains, such as time series prediction, video analysis, and musical information retrieval, a model must learn from inputs that are sequences. Interactive tasks, such as translating natural language, engaging in dialogue, and controlling a robot, often demand both capabilities. Recurrent neural networks...
Topics: Computing Research Repository, Learning, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1506.00019
Jun 30, 2018
by Ke Ding; Ying Tan
Benchmarking is key for developing and comparing optimization algorithms. In this paper, a CUDA-based real parameter optimization benchmark (cuROB) is introduced. Test functions of diverse properties are included within cuROB and implemented efficiently with CUDA. A speedup of one order of magnitude can be achieved in comparison with the CPU-based CEC'14 benchmark.
Topics: Neural and Evolutionary Computing, Computing Research Repository
Source: http://arxiv.org/abs/1407.7737
Jun 30, 2018
by Yin Zheng; Yu-Jin Zhang; Hugo Larochelle
Topic modeling based on latent Dirichlet allocation (LDA) has been a framework of choice to deal with multimodal data, such as in image annotation tasks. Another popular approach to model the multimodal data is through deep neural networks, such as the deep Boltzmann machine (DBM). Recently, a new type of topic model called the Document Neural Autoregressive Distribution Estimator (DocNADE) was proposed and demonstrated state-of-the-art performance for text document modeling. In this work, we...
Topics: Neural and Evolutionary Computing, Computing Research Repository, Computer Vision and Pattern...
Source: http://arxiv.org/abs/1409.3970
Jun 28, 2018
by Shengxian Wan; Yanyan Lan; Jiafeng Guo; Jun Xu; Liang Pang; Xueqi Cheng
Matching natural language sentences is central for many applications such as information retrieval and question answering. Existing deep models rely on a single sentence representation or multiple granularity representations for matching. However, such methods cannot well capture the contextualized local information in the matching process. To tackle this problem, we present a new deep architecture to match two sentences with multiple positional sentence representations. Specifically, each...
Topics: Neural and Evolutionary Computing, Computation and Language, Artificial Intelligence, Computing...
Source: http://arxiv.org/abs/1511.08277
Jun 30, 2018
by Prasanna Kumar Muthukumar; Alan W. Black
Nearly all Statistical Parametric Speech Synthesizers today use Mel Cepstral coefficients as the vocal tract parameterization of the speech signal. Mel Cepstral coefficients were never intended to work in a parametric speech synthesis framework, but as yet, there has been little success in creating a better parameterization that is more suited to synthesis. In this paper, we use deep learning algorithms to investigate a data-driven parameterization technique that is designed for the specific...
Topics: Neural and Evolutionary Computing, Computing Research Repository, Computation and Language, Learning
Source: http://arxiv.org/abs/1409.8558
Jun 27, 2018
by Hongyu Guo; Xiaodan Zhu; Martin Renqiang Min
Many real-world applications are associated with structured data, where not only the input but also the output exhibits interplay. However, typical classification and regression models often lack the ability to simultaneously explore high-order interactions within the input and within the output. In this paper, we present a deep learning model aiming to generate a powerful nonlinear functional mapping from structured input to structured output. More specifically, we propose to integrate high-order hidden...
Topics: Learning, Computing Research Repository, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1504.08022
Jun 28, 2018
by Fandong Meng; Zhengdong Lu; Zhaopeng Tu; Hang Li; Qun Liu
We propose DEEPMEMORY, a novel deep architecture for sequence-to-sequence learning, which performs the task through a series of nonlinear transformations from the representation of the input sequence (e.g., a Chinese sentence) to the final output sequence (e.g., translation to English). Inspired by the recently proposed Neural Turing Machine (Graves et al., 2014), we store the intermediate representations in stacked layers of memories, and use read-write operations on the memories to realize...
Topics: Computation and Language, Computing Research Repository, Learning, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1506.06442
Jun 29, 2018
by Martin Simonovsky; Benjamín Gutiérrez-Becker; Diana Mateus; Nassir Navab; Nikos Komodakis
Multimodal registration is a challenging problem in medical imaging due to the high variability of tissue appearance under different imaging modalities. The crucial component here is the choice of the right similarity measure. We make a step towards a general learning-based solution that can be adapted to specific situations and present a metric based on a convolutional neural network. Our network can be trained from scratch even from a few aligned image pairs. The metric is validated on...
Topics: Computer Vision and Pattern Recognition, Neural and Evolutionary Computing, Computing Research...
Source: http://arxiv.org/abs/1609.05396
Jun 30, 2018
by Alexander W. Churchill; Siddharth Sigtia; Chrisantha Fernando
An algorithm is described that adaptively learns a non-linear mutation distribution. It works by training a denoising autoencoder (DA) online at each generation of a genetic algorithm to reconstruct a slowly decaying memory of the best genotypes so far. A compressed hidden layer forces the autoencoder to learn hidden features in the training set that can be used to accelerate search on novel problems with similar structure. Its output neurons define a probability distribution that we sample...
Topics: Neural and Evolutionary Computing, Computing Research Repository, Learning
Source: http://arxiv.org/abs/1404.1614
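A compressed sketch of the mechanism described in the entry above: a tiny denoising autoencoder is trained online on the best binary genotypes, and its output probabilities are then sampled to produce offspring. The network size, corruption rate, and training details are assumptions; the paper's slowly decaying memory of past elites is not reproduced.

    import numpy as np

    rng = np.random.default_rng(2)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    n_bits, n_hidden, lr = 16, 4, 0.5
    W1 = rng.normal(0.0, 0.1, (n_bits, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.1, (n_hidden, n_bits)); b2 = np.zeros(n_bits)

    def train_step(x_clean):
        # One SGD step of the denoising autoencoder on a single elite genotype.
        global W1, b1, W2, b2
        corrupt = rng.random(n_bits) < 0.1            # bit-flip corruption
        x = np.where(corrupt, 1 - x_clean, x_clean).astype(float)
        h = sigmoid(x @ W1 + b1)
        y = sigmoid(h @ W2 + b2)                      # reconstruction probabilities
        d_out = y - x_clean                           # cross-entropy gradient at the output
        d_hid = (d_out @ W2.T) * h * (1.0 - h)
        W2 -= lr * np.outer(h, d_out); b2 -= lr * d_out
        W1 -= lr * np.outer(x, d_hid); b1 -= lr * d_hid

    def mutate(parent):
        # Sample an offspring from the autoencoder's reconstruction distribution.
        h = sigmoid(parent @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        return (rng.random(n_bits) < p).astype(int)

    elite = (rng.random((10, n_bits)) < 0.5).astype(int)   # stand-in for the best genotypes
    for genotype in elite:
        train_step(genotype)
    offspring = [mutate(genotype) for genotype in elite]
    print(offspring[0])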
Jun 28, 2018
by Yang Liu; Furu Wei; Sujian Li; Heng Ji; Ming Zhou; Houfeng Wang
Previous research on relation classification has verified the effectiveness of using dependency shortest paths or subtrees. In this paper, we further explore how to make full use of the combination of this dependency information. We first propose a new structure, termed augmented dependency path (ADP), which is composed of the shortest dependency path between two entities and the subtrees attached to the shortest path. To exploit the semantic representation behind the ADP structure, we develop...
Topics: Computation and Language, Computing Research Repository, Learning, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1507.04646
Jun 30, 2018
by Eric W. Tramel; Marylou Gabrié; Andre Manoel; Francesco Caltagirone; Florent Krzakala
Restricted Boltzmann machines (RBMs) are energy-based neural networks which are commonly used as the building blocks for deep neural architectures. In this work, we derive a deterministic framework for the training, evaluation, and use of RBMs based upon the Thouless-Anderson-Palmer (TAP) mean-field approximation of widely-connected systems with weak interactions, which comes from spin-glass theory. While the TAP approach has been extensively studied for fully-visible binary spin...
Topics: Condensed Matter, Learning, Disordered Systems and Neural Networks, Computing Research Repository,...
Source: http://arxiv.org/abs/1702.03260
Jun 29, 2018
by Jonas Degrave; Michiel Hermans; Joni Dambre; Francis wyffels
One of the most important fields in robotics is the optimization of controllers. Currently, robots are treated as a black box in this optimization process, which is the reason why derivative-free optimization methods such as evolutionary algorithms or reinforcement learning are omnipresent. We propose an implementation of a modern physics engine, which has the ability to differentiate control parameters. This has been implemented on both CPU and GPU. We show how this speeds up the optimization...
Topics: Artificial Intelligence, Neural and Evolutionary Computing, Computing Research Repository, Robotics
Source: http://arxiv.org/abs/1611.01652
Jun 30, 2018
by Michael R. Smith; Aaron J. Hill; Kristofor D. Carlson; Craig M. Vineyard; Jonathon Donaldson; David R. Follett; Pamela L. Follett; John H. Naegle; Conrad D. James; James B. Aimone
Information in neural networks is represented as weighted connections, or synapses, between neurons. This poses a problem as the primary computational bottleneck for neural networks is the vector-matrix multiply when inputs are multiplied by the neural network weights. Conventional processing architectures are not well suited for simulating neural networks, often requiring large amounts of energy and time. Additionally, synapses in biological neural networks are not binary connections, but...
Topics: Neurons and Cognition, Computing Research Repository, Machine Learning, Quantitative Biology,...
Source: http://arxiv.org/abs/1704.08306
Jun 29, 2018
by Martijn Arts; Marius Cordts; Monika Gorin; Marc Spehr; Rudolf Mathar
This paper investigates a discontinuous neural network which is used as a model of the mammalian olfactory system and can more generally be applied to solve non-negative sparse approximation problems. By inherently limiting the system's integrators to having non-negative outputs, the system function becomes discontinuous since the integrators switch between being inactive and being active. It is shown that the presented network converges to equilibrium points which are solutions to general...
Topics: Mathematics, Optimization and Control, Neurons and Cognition, Quantitative Biology, Neural and...
Source: http://arxiv.org/abs/1603.06353
Jun 29, 2018
by E. Osaba; Xin-She Yang; F. Diaz; E. Onieva; A. D. Masegosa; A. Perallos
A real-world newspaper distribution problem with recycling policy is tackled in this work. In order to meet all the complex restrictions contained in such a problem, it has been modeled as a rich vehicle routing problem, which can be more specifically considered as an asymmetric and clustered vehicle routing problem with simultaneous pickup and deliveries, variable costs and forbidden paths (AC-VRP-SPDVCFP). This is the first study of such a problem in the literature. For this reason, a...
Topics: Optimization and Control, Artificial Intelligence, Neural and Evolutionary Computing, Computing...
Source: http://arxiv.org/abs/1604.04146
Jun 29, 2018
by Johannes Welbl; Guillaume Bouchard; Sebastian Riedel
Embedding-based Knowledge Base Completion models have so far mostly combined distributed representations of individual entities or relations to compute truth scores of missing links. Facts can however also be represented using pairwise embeddings, i.e. embeddings for pairs of entities and relations. In this paper we explore such bigram embeddings with a flexible Factorization Machine model and several ablations from it. We investigate the relevance of various bigram types on the fb15k237...
Topics: Computation and Language, Machine Learning, Artificial Intelligence, Statistics, Neural and...
Source: http://arxiv.org/abs/1604.05878
Jun 30, 2018
by Fabio D'Andreagiovanni; Antonella Nardin; Enrico Natalizio
We consider the problem of optimally designing a body wireless sensor network, while taking into account the uncertainty of data generation of biosensors. Since the related min-max robustness Integer Linear Programming (ILP) problem can be difficult to solve even for state-of-the-art commercial optimization solvers, we propose an original heuristic for its solution. The heuristic combines deterministic and probabilistic variable fixing strategies, guided by the information coming from...
Topics: Optimization and Control, Neural and Evolutionary Computing, Networking and Internet Architecture,...
Source: http://arxiv.org/abs/1704.04640
Jun 28, 2018
by Shayan Poursoltan; Frank Neumann
Different types of evolutionary algorithms have been developed for constrained continuous optimization. We carry out a feature-based analysis of evolved constrained continuous optimization instances to understand the characteristics of constraints that make problems hard for evolutionary algorithms. In our study, we examine how various sets of constraints can influence the behaviour of ε-constrained Differential Evolution. Investigating the evolved instances, we obtain knowledge of what type of...
Topics: Computing Research Repository, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1506.06848
Jun 28, 2018
by Shayan Poursoltan; Frank Neumann
Evolutionary algorithms have been frequently applied to constrained continuous optimisation problems. We carry out feature based comparisons of different types of evolutionary algorithms such as evolution strategies, differential evolution and particle swarm optimisation for constrained continuous optimisation. In our study, we examine how sets of constraints influence the difficulty of obtaining close to optimal solutions. Using a multi-objective approach, we evolve constrained continuous...
Topics: Artificial Intelligence, Computing Research Repository, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1509.06842
Jun 29, 2018
by Shayan Poursoltan; Frank Neumann
With this paper, we contribute to the growing research area of feature-based analysis of bio-inspired computing. In this research area, problem instances are classified according to different features of the underlying problem in terms of their difficulty of being solved by a particular algorithm. We investigate the impact of different sets of evolved instances for building prediction models in the area of algorithm selection. Building on the work of Poursoltan and Neumann [11,10], we consider...
Topics: Neural and Evolutionary Computing, Computing Research Repository
Source: http://arxiv.org/abs/1602.02862
Jun 29, 2018
by Marcin Wozniak; Dawid Polap; Grzegorz Borowik; Christian Napoli
In this paper, the idea of client verification in distributed systems is presented. The proposed solution is illustrated with a sample system in which client verification through cloud resources using an input signature is discussed. The proposed method has been examined for different signatures. Research results are presented and discussed to show the potential advantages.
Topics: Cryptography and Security, Artificial Intelligence, Neural and Evolutionary Computing, Computing...
Source: http://arxiv.org/abs/1601.07446
Jun 27, 2018
by Shiliang Zhang; Hui Jiang; Mingbin Xu; Junfeng Hou; Lirong Dai
In this paper, we propose the new fixed-size ordinally-forgetting encoding (FOFE) method, which can almost uniquely encode any variable-length sequence of words into a fixed-size representation. FOFE can model the word order in a sequence using a simple ordinally-forgetting mechanism according to the positions of words. In this work, we have applied FOFE to feedforward neural network language models (FNN-LMs). Experimental results have shown that without using any recurrent feedbacks, FOFE...
Topics: Computation and Language, Computing Research Repository, Learning, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1505.01504
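The fixed-size ordinally-forgetting encoding described in the entry above is, as commonly stated, a simple recursion: with a forgetting factor alpha, the code of a sequence is z_t = alpha * z_{t-1} + e_t, where e_t is the one-hot vector of the t-th word. A minimal sketch follows; the vocabulary size and alpha are illustrative.

    import numpy as np

    def fofe_encode(word_ids, vocab_size, alpha=0.7):
        # z_t = alpha * z_{t-1} + e_t, with e_t the one-hot vector of word t.
        z = np.zeros(vocab_size)
        for w in word_ids:
            e = np.zeros(vocab_size)
            e[w] = 1.0
            z = alpha * z + e
        return z

    # Word order matters: the same bag of words in a different order gets a different code.
    print(fofe_encode([2, 0, 1], vocab_size=4))
    print(fofe_encode([1, 0, 2], vocab_size=4))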
Jun 30, 2018
by Fabricio Olivetti de Franca; Guilherme Palermo Coelho
Most community detection algorithms from the literature work as optimization tools that minimize a given fitness function, while assuming that each node belongs to a single community. Since there is no hard concept of what a community is, most proposed fitness functions focus on a particular definition. As such, these functions do not always lead to partitions that correspond to those observed in practice. This paper proposes a new flexible fitness function that allows the...
Topics: Physics, Neural and Evolutionary Computing, Computing Research Repository, Physics and Society,...
Source: http://arxiv.org/abs/1406.2545
Jun 29, 2018
by Ilija Ilievski; Shuicheng Yan; Jiashi Feng
Visual Question and Answering (VQA) problems are attracting increasing interest from multiple research disciplines. Solving VQA problems requires techniques from both computer vision for understanding the visual contents of a presented image or video, as well as the ones from natural language processing for understanding semantics of the question and generating the answers. Regarding visual content modeling, most of existing VQA methods adopt the strategy of extracting global features from the...
Topics: Computer Vision and Pattern Recognition, Computation and Language, Computing Research Repository,...
Source: http://arxiv.org/abs/1604.01485
Jun 30, 2018
by Alexandre Chotard; Martin Holena
Several recent publications investigated Markov-chain modelling of linear optimization by a $(1,\lambda)$-ES, considering both unconstrained and linearly constrained optimization, and both constant and varying step size. All of them assume normality of the involved random steps, and while this is consistent with a black-box scenario, information on the function to be optimized (e.g. separability) may be exploited by the use of another distribution. The objective of our contribution is to...
Topics: Neural and Evolutionary Computing, Numerical Analysis, Computing Research Repository, Learning
Source: http://arxiv.org/abs/1406.4619
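For reference, a minimal (1,λ)-ES iteration with isotropic normal steps on a linear objective, which is the setting the entry above refers to. The Markov-chain analysis and the non-normal step distributions studied in the paper are not reproduced, and the step size and population size below are arbitrary.

    import numpy as np

    rng = np.random.default_rng(3)

    def one_comma_lambda_step(parent, sigma, lam, f):
        # Sample lam offspring around the parent with isotropic normal steps
        # and keep only the best one (comma selection: the parent is discarded).
        offspring = parent + sigma * rng.standard_normal((lam, parent.size))
        values = np.apply_along_axis(f, 1, offspring)
        return offspring[np.argmin(values)]

    f = lambda x: x[0]                     # linear objective to be minimized
    x = np.zeros(3)
    for _ in range(50):
        x = one_comma_lambda_step(x, sigma=1.0, lam=10, f=f)
    print(x[0])                            # decreases steadily along the gradient direction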
Jun 27, 2018
by Yunchen Pu; Xin Yuan; Lawrence Carin
A generative model is developed for deep (multi-layered) convolutional dictionary learning. A novel probabilistic pooling operation is integrated into the deep model, yielding efficient bottom-up (pretraining) and top-down (refinement) probabilistic learning. Experimental results demonstrate powerful capabilities of the model to learn multi-layer features from images, and excellent classification results are obtained on the MNIST and Caltech 101 datasets.
Topics: Machine Learning, Learning, Computing Research Repository, Statistics, Neural and Evolutionary...
Source: http://arxiv.org/abs/1504.04054
Jun 28, 2018
by Maxim Borisyak; Andrey Ustyuzhanin
Autonomous navigation is one of the basic problems in robotics, and in general it can be challenging when an autonomous vehicle is placed into a partially observable domain. In this paper we consider a simplistic environment model and introduce a navigation algorithm based on a Learning Classifier System.
Topics: Artificial Intelligence, Computing Research Repository, Learning, Neural and Evolutionary Computing
Source: http://arxiv.org/abs/1507.07374
Jun 30, 2018
by Md. Selim; Saeed Siddik; Alim Ul Gias; M. Abdullah-Al-Wadud; Shah Mostafa Khaled
The potential benefits of migrating a software design from the Structured to the Object-Oriented paradigm are manifold, including modularity, manageability and extendability. This design migration should be automated, as this will reduce the time required by a manual process. Our previous work has addressed this issue in terms of an optimal graph clustering problem formulated as a quadratic Integer Program (IP). However, it has been realized that solving the IP is computationally hard, and thus heuristic-based...
Topics: Neural and Evolutionary Computing, Computing Research Repository, Software Engineering
Source: http://arxiv.org/abs/1407.6116
Jun 28, 2018
by Joan Serrà; Aleksandar Matic; Josep Luis Arcos; Alexandros Karatzoglou
Finding repeated patterns or motifs in a time series is an important unsupervised task that still has a number of open issues, starting with the very definition of a motif. In this paper, we revise the notion of motif support, characterizing it as the number of patterns or repetitions that define a motif. We then propose GENMOTIF, a genetic algorithm to discover motifs with support which, at the same time, is flexible enough to accommodate other motif specifications and task characteristics. GENMOTIF is...
Topics: Learning, Neural and Evolutionary Computing, Computing Research Repository
Source: http://arxiv.org/abs/1511.04986