Jun 27, 2018
by Yasser Roudi; Graham Taylor

Learning and inferring features that generate sensory input is a task continuously performed by cortex. In recent years, novel algorithms and learning rules have been proposed that allow neural network models to learn such features from natural images, written text, audio signals, etc. These networks usually involve deep architectures with many layers of hidden neurons. Here we review recent advancements in this area emphasizing, amongst other things, the processing of dynamical inputs by...

Topics: Quantitative Biology, Neurons and Cognition, Condensed Matter, Statistics, Disordered Systems and...

Source: http://arxiv.org/abs/1506.00354

Sep 22, 2013
by Yasser Roudi; John Hertz

We derive and study dynamical TAP equations for Ising spin glasses obeying both synchronous and asynchronous dynamics using a generating functional approach. The system can have an asymmetric coupling matrix, and the external fields can be time-dependent. In the synchronously updated model, the TAP equations take the form of self-consistent equations for magnetizations at time $t+1$, given the magnetizations at time $t$. In the asynchronously updated model, the TAP equations determine the time...

Source: http://arxiv.org/abs/1103.1044v1
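
The synchronous TAP update described in this abstract can be sketched numerically. The specific form below (a naive mean-field term corrected by an Onsager reaction term) is the standard dynamical TAP equation for kinetic Ising models and is an assumption on my part, not taken verbatim from the paper; the function name and toy parameters are illustrative.

```python
import numpy as np

def tap_step(m, J, h):
    """One synchronous dynamical-TAP update: given magnetizations m(t),
    return m(t+1) for a kinetic Ising model with couplings J and fields h.
    The Onsager term subtracts each spin's self-feedback from the naive
    mean-field input."""
    onsager = m * ((J ** 2) * (1.0 - m ** 2)).sum(axis=1)
    return np.tanh(h + J @ m - onsager)

# toy example: small asymmetric network with weak couplings and constant fields
rng = np.random.default_rng(0)
N = 5
J = rng.normal(0.0, 0.3 / np.sqrt(N), size=(N, N))  # asymmetric coupling matrix
h = rng.normal(0.0, 0.1, size=N)
m = np.zeros(N)
for _ in range(50):
    m = tap_step(m, J, h)
print(m)  # approximate magnetizations after iterating the TAP map
```

With weak couplings the map is a contraction, so iterating it converges to the self-consistent magnetizations the abstract refers to; for time-dependent fields one would simply pass a different h at each step.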

Sep 19, 2013
by Yasser Roudi; Alessandro Treves

We show that symmetric n-mixture states, when they exist, are almost never stable in autoassociative networks with threshold-linear units. Only with a binary coding scheme could we find a limited region of the parameter space in which either 2-mixtures or 3-mixtures are stable attractors of the dynamics.

Source: http://arxiv.org/abs/cond-mat/0302029v1

Sep 18, 2013
by Yasser Roudi; Alessandro Treves

We study analytically the effect of metrically structured connectivity on the behavior of autoassociative networks. We focus on three simple rate-based model neurons: threshold-linear, binary or smoothly saturating units. For a connectivity which is short range enough the threshold-linear network shows localized retrieval states. The saturating and binary models also exhibit spatially modulated retrieval states if the highest activity level that they can achieve is above the maximum activity of...

Source: http://arxiv.org/abs/cond-mat/0505349v1

Sep 21, 2013
by Benjamin Dunn; Yasser Roudi

We study inference and reconstruction of couplings in a partially observed kinetic Ising model. With hidden spins, calculating the likelihood of a sequence of observed spin configurations requires performing a trace over the configurations of the hidden ones. This, as we show, can be represented as a path integral. Using this representation, we demonstrate that systematic approximate inference and learning rules can be derived using dynamical mean-field theory. Although naive mean-field theory...

Source: http://arxiv.org/abs/1301.7275v1

Sep 21, 2013
by Yasser Roudi; Alessandro Treves

We investigate the properties of an autoassociative network of threshold-linear units whose synaptic connectivity is spatially structured and asymmetric. Since the methods of equilibrium statistical mechanics cannot be applied to such a network due to the lack of a Hamiltonian, we approach the problem through a signal-to-noise analysis, which we adapt to spatially organized networks. The conditions are analyzed for the appearance of stable, spatially non-uniform profiles of activity with large...

Source: http://arxiv.org/abs/cond-mat/0407128v1

Sep 19, 2013
by Yasser Roudi; John A. Hertz

There has been recent progress on the problem of inferring the structure of interactions in complex networks when they are in stationary states satisfying detailed balance, but little has been done for non-equilibrium systems. Here we introduce an approach to this problem, considering, as an example, the question of recovering the interactions in an asymmetrically-coupled, synchronously-updated Sherrington-Kirkpatrick model. We derive an exact iterative inversion algorithm and develop efficient...

Source: http://arxiv.org/abs/1009.5946v2

Sep 18, 2013
by Yasser Roudi; Peter E. Latham

A fundamental problem in neuroscience is understanding how working memory -- the ability to store information at intermediate timescales, like 10s of seconds -- is implemented in realistic neuronal networks. The most likely candidate mechanism is the attractor network, and a great deal of effort has gone toward investigating it theoretically. Yet, despite almost a quarter century of intense work, attractor networks are not fully understood. In particular, there are still two unanswered...

Source: http://arxiv.org/abs/0704.3005v1

Sep 23, 2013
by Peter E. Latham; Yasser Roudi

Correlations among spikes, both on the same neuron and across neurons, are ubiquitous in the brain. For example cross-correlograms can have large peaks, at least in the periphery, and smaller -- but still non-negligible -- ones in cortex, and auto-correlograms almost always exhibit non-trivial temporal structure at a range of timescales. Although this has been known for over forty years, it's still not clear what role these correlations play in the brain -- and, indeed, whether they play any...

Source: http://arxiv.org/abs/1109.6524v1

Oct 4, 2013
by Alessandro Treves; Yasser Roudi

Contents:
1. Introduction and summary
2. The phase transition that made us mammals
3. Maps and patterns of threshold-linear units
4. Validation of the lamination hypothesis
5. What do we need DG and CA1 for?
6. Infinite recursion and the origin of cognition
7. Reducing local networks to Potts units

Lecture Notes Collection FreeScience.info ID2245
Obtained from http://people.sissa.it/~ale/TrevesRoudi.pdf
http://www.freescience.info/go.php?pagename=books&id=2245

Topics: Neural Networks

Jul 19, 2013
by Erik Aurell; Charles Ollion; Yasser Roudi

We study the performance and convergence properties of the Susceptibility Propagation (SusP) algorithm for solving the Inverse Ising problem. We first study how the temperature parameter (T) in a Sherrington-Kirkpatrick model generating the data influences the performance and convergence of the algorithm. We find that in the high temperature regime (T>4), the algorithm performs well and its quality is only limited by the quality of the supplied data. In the low temperature regime (T

Source: http://arxiv.org/abs/1005.3694v1

Jun 30, 2018
by Benjamin Dunn; Maria Mørreaunet; Yasser Roudi

We study the statistics of spike trains of simultaneously recorded grid cells in freely behaving rats. We evaluate pairwise correlations between these cells and, using a generalized linear model (kinetic Ising model), study their functional connectivity. Even when we account for the covariations in firing rates due to overlapping fields, both the pairwise correlations and functional connections decay as a function of the shortest distance between the vertices of the spatial firing pattern of...

Topics: Quantitative Methods, Quantitative Biology, Neurons and Cognition

Source: http://arxiv.org/abs/1405.0044

Sep 21, 2013
by Yasser Roudi; Erik Aurell; John Hertz

Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this...

Source: http://arxiv.org/abs/0905.1410v1

Jun 29, 2018
by Nicola Bulso; Matteo Marsili; Yasser Roudi

We propose a method for recovering the structure of a sparse undirected graphical model when very few samples are available. The method decides about the presence or absence of bonds between pairs of variables by considering one pair at a time and using a closed-form formula, analytically derived by calculating the posterior probability for every possible model explaining a two-body system using the Jeffreys prior. The approach does not rely on the optimisation of any cost functions and consequently...

Topics: Machine Learning, Disordered Systems and Neural Networks, Condensed Matter, Statistics

Source: http://arxiv.org/abs/1603.00952

Sep 21, 2013
by John Hertz; Yasser Roudi; Joanna Tyrcha

Now that spike trains from many neurons can be recorded simultaneously, there is a need for methods to decode these data to learn about the networks that these neurons are part of. One approach to this problem is to adjust the parameters of a simple model network to make its spike trains resemble the data as much as possible. The connections in the model network can then give us an idea of how the real neurons that generated the data are connected and how they influence each other. In this...

Source: http://arxiv.org/abs/1106.1752v1

Sep 22, 2013
by Yasser Roudi; Sheila Nirenberg; Peter Latham

One of the most critical problems we face in the study of biological systems is building accurate statistical descriptions of them. This problem has been particularly challenging because biological systems typically contain large numbers of interacting elements, which precludes the use of standard brute force approaches. Recently, though, several groups have reported that there may be an alternate strategy. The reports show that reliable statistical models can be built without knowledge of all...

Source: http://arxiv.org/abs/0811.0903v1

Sep 21, 2013
by Yasser Roudi; Joanna Tyrcha; John Hertz

We study pairwise Ising models for describing the statistics of multi-neuron spike trains, using data from a simulated cortical network. We explore efficient ways of finding the optimal couplings in these models and examine their statistical properties. To do this, we extract the optimal couplings for subsets of size up to 200 neurons, essentially exactly, using Boltzmann learning. We then study the quality of several approximate methods for finding the couplings by comparing their results with...

Source: http://arxiv.org/abs/0902.2885v1
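
Boltzmann learning, the exact method this abstract uses as its reference point, can be sketched in a few lines: gradient ascent on the log-likelihood, matching the model's means and pairwise correlations to those of the data. The function name is illustrative, and model averages are computed here by exact enumeration, which is only feasible for far fewer neurons than the 200 mentioned in the abstract.

```python
import itertools
import numpy as np

def boltzmann_learn(S, n_steps=200, lr=0.5):
    """Fit fields h and couplings J of a pairwise Ising model
    P(s) ∝ exp(h·s + ½ sᵀJs) to ±1 data S (samples × neurons) by
    gradient ascent on the log-likelihood (Boltzmann learning).
    Model moments are computed by exact enumeration over all 2^n states."""
    n = S.shape[1]
    states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)
    m_data = S.mean(axis=0)              # empirical means
    C_data = S.T @ S / S.shape[0]        # empirical pairwise correlations
    h = np.zeros(n)
    J = np.zeros((n, n))
    for _ in range(n_steps):
        E = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
        p = np.exp(E - E.max())
        p /= p.sum()
        m_model = p @ states                       # model means
        C_model = states.T @ (states * p[:, None]) # model correlations
        h += lr * (m_data - m_model)
        dJ = lr * (C_data - C_model)
        np.fill_diagonal(dJ, 0.0)  # no self-couplings
        J += dJ
    return h, J

# toy usage on hypothetical data: fit to 400 random ±1 patterns of 3 neurons
rng = np.random.default_rng(1)
S = rng.choice([-1.0, 1.0], size=(400, 3))
h, J = boltzmann_learn(S)
```

At convergence the fitted model reproduces the data's first and second moments exactly, which is why the abstract can treat Boltzmann learning as an "essentially exact" baseline against which approximate inversion methods are compared.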

Sep 21, 2013
by Matteo Marsili; Iacopo Mastromatteo; Yasser Roudi

The study of complex systems is limited by the fact that only a few variables are accessible for modeling and sampling, which are not necessarily the most relevant ones to explain the system's behavior. In addition, empirical data typically undersample the space of possible states. We study a generic framework where a complex system is seen as a system of many interacting degrees of freedom, which are known only in part, that optimize a given function. We show that the underlying distribution...

Source: http://arxiv.org/abs/1301.3622v4

Jun 29, 2018
by John A. Hertz; Yasser Roudi; Peter Sollich

We review some of the techniques used to study the dynamics of disordered systems subject to both quenched and fast (thermal) noise. Starting from the Martin-Siggia-Rose path integral formalism for a single variable stochastic dynamics, we provide a pedagogical survey of the perturbative, i.e. diagrammatic, approach to dynamics and how this formalism can be used for studying soft spin models. We review the supersymmetric formulation of the Langevin dynamics of these models and discuss the...

Topics: Disordered Systems and Neural Networks, Statistical Mechanics, Condensed Matter

Source: http://arxiv.org/abs/1604.05775

Sep 18, 2013
by Hong-Li Zeng; John Hertz; Yasser Roudi

The couplings in a sparse asymmetric, asynchronous Ising network are reconstructed using an exact learning algorithm. L$_1$ regularization is used to remove the spurious weak connections that would otherwise be found by simply minimizing the minus likelihood of a finite data set. In order to see how L$_1$ regularization works in detail, we perform the calculation in several ways including (1) by iterative minimization of a cost function equal to minus the log likelihood of the data plus an...

Source: http://arxiv.org/abs/1211.3671v1
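
Item (1) in this abstract, minimizing minus the log-likelihood plus an L1 penalty, can be illustrated with a proximal-gradient (ISTA) sketch. For simplicity the sketch below uses a synchronously updated kinetic Ising model rather than the paper's asynchronous one, and the function names and parameters are mine, not the paper's; the point is only how soft-thresholding removes spurious weak couplings.

```python
import numpy as np

def soft_threshold(x, thr):
    """Proximal operator of the L1 norm: shrink toward zero by thr."""
    return np.sign(x) * np.maximum(np.abs(x) - thr, 0.0)

def infer_l1(S, lam=0.01, lr=0.1, n_steps=500):
    """Infer couplings J of a synchronous kinetic Ising model
    P(s_i(t+1) | s(t)) ∝ exp(s_i(t+1) Σ_j J_ij s_j(t)) from a ±1 spin
    history S (time × spins), by minimizing minus the log-likelihood
    plus lam·Σ|J_ij| with proximal gradient descent (ISTA)."""
    prev, nxt = S[:-1], S[1:]
    T, n = prev.shape
    J = np.zeros((n, n))
    for _ in range(n_steps):
        G = prev @ J.T                            # G[t, i] = Σ_j J_ij s_j(t)
        grad = (np.tanh(G) - nxt).T @ prev / T    # gradient of the mean NLL
        J = soft_threshold(J - lr * grad, lr * lam)
    return J
```

Without the soft-thresholding step this reduces to plain maximum-likelihood inference, which assigns small nonzero values to every absent connection in a finite data set; the L1 penalty sets those below a scale controlled by lam exactly to zero.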

Jun 30, 2018
by Benjamin Dunn; Daniel Wennberg; Ziwei Huang; Yasser Roudi

Research on network mechanisms and coding properties of grid cells assumes that the firing rate of a grid cell in each of its fields is the same. Furthermore, proposed network models predict spatial regularities in the firing of inhibitory interneurons that are inconsistent with experimental data. In this paper, by analyzing the response of grid cells recorded from rats during free navigation, we first show that there are strong variations in the mean firing rate of the fields of individual grid...

Topics: Quantitative Biology, Neurons and Cognition

Source: http://arxiv.org/abs/1701.04893

Jun 27, 2018
by Stanislav S. Borysov; Yasser Roudi; Alexander V. Balatsky

We study the historical dynamics of the joint equilibrium distribution of stock returns in the U.S. stock market using a Boltzmann distribution model parametrized by external fields and pairwise couplings. Within the Boltzmann learning framework for statistical inference, we analyze the historical behavior of the parameters inferred using exact and approximate learning algorithms. Since the model and inference methods require the use of binary variables, the effect of this mapping of continuous returns to the...

Topics: Quantitative Finance, Adaptation and Self-Organizing Systems, Nonlinear Sciences, Statistical...

Source: http://arxiv.org/abs/1504.02280

Jul 20, 2013
by Joanna Tyrcha; Yasser Roudi; Matteo Marsili; John Hertz

Neurons subject to a common non-stationary input may exhibit a correlated firing behavior. Correlations in the statistics of neural spike trains also arise as the effect of interaction between neurons. Here we show that these two situations can be distinguished, with machine learning techniques, provided the data are rich enough. In order to do this, we study the problem of inferring a kinetic Ising model, stationary or nonstationary, from the available data. We apply the inference procedure to...

Source: http://arxiv.org/abs/1203.5673v2

Sep 21, 2013
by Jason Sakellariou; Yasser Roudi; Marc Mezard; John Hertz

We study how the degree of symmetry in the couplings influences the performance of three mean field methods used for solving the direct and inverse problems for generalized Sherrington-Kirkpatrick models. In this context, the direct problem is predicting the potentially time-varying magnetizations. The three theories include the first and second order Plefka expansions, referred to as naive mean field (nMF) and TAP, respectively, and a mean field theory which is exact for fully asymmetric...

Source: http://arxiv.org/abs/1106.0452v1

Jun 30, 2018
by Claudia Battistin; John Hertz; Joanna Tyrcha; Yasser Roudi

We propose a new algorithm for inferring the state of hidden spins and reconstructing the connections in a synchronous kinetic Ising model, given the observed history. Focusing on the case in which the hidden spins are conditionally independent of each other given the state of observable spins, we show that calculating the likelihood of the data can be simplified by introducing a set of replicated auxiliary spins. Belief Propagation (BP) and Susceptibility Propagation (SusP) can then be used to...

Topics: Statistical Mechanics, Data Analysis, Statistics and Probability, Physics, Disordered Systems and...

Source: http://arxiv.org/abs/1412.1727

Jun 29, 2018
by Ludovica Bachschmid-Romano; Claudia Battistin; Manfred Opper; Yasser Roudi

We describe and analyze some novel approaches for studying the dynamics of Ising spin glass models. We first briefly consider the variational approach based on minimizing the Kullback-Leibler divergence between independent trajectories and the real ones and note that this approach only coincides with the mean field equations from the saddle point approximation to the generating functional when the dynamics is defined through a logistic link function, which is the case for the kinetic Ising...

Topics: Data Analysis, Statistics and Probability, Machine Learning, Condensed Matter, Physics, Disordered...

Source: http://arxiv.org/abs/1607.08379

Sep 18, 2013
by Hong-Li Zeng; Mikko Alava; Erik Aurell; John Hertz; Yasser Roudi

We describe how the couplings in an asynchronous kinetic Ising model can be inferred. We consider two cases, one in which we know both the spin history and the update times and one in which we only know the spin history. For the first case, we show that one can average over all possible choices of update times to obtain a learning rule that depends only on spin correlations and can also be derived from the equations of motion for the correlations. For the second case, the same rule can be...

Source: http://arxiv.org/abs/1209.2401v3