
Sep 23, 2013

by Victor Kac; Andrey Radul


In our paper~\cite{KR} we began a systematic study of representations of the universal central extension $\widehat{\mathcal D}$ of the Lie algebra of differential operators on the circle. This study was continued in the paper~\cite{FKRW} in the framework of vertex algebra theory. It was shown that the simple vertex algebra $W_{1+\infty, N}$ associated to $\widehat{\mathcal D}$, with positive integral central charge $N$, is isomorphic to the classical vertex algebra $W(gl_N)$, which led to a...

Source: http://arxiv.org/abs/hep-th/9512150v1


Jun 29, 2018

by Michael Bukatin; Steve Matthews; Andrey Radul


Dataflow matrix machines are a powerful generalization of recurrent neural networks. They work with multiple types of arbitrary linear streams and multiple types of powerful neurons, and make it possible to incorporate higher-order constructions. We expect them to be useful in machine learning and probabilistic programming, and in the synthesis of dynamic systems and of deterministic and probabilistic programs.

Topics: Neural and Evolutionary Computing, Computing Research Repository

Source: http://arxiv.org/abs/1603.09002
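The core mechanism behind this abstract can be illustrated with a small sketch (our own illustrative code, not from the paper; the "two-stroke" terminology and the specific neurons are assumptions): the network's matrix linearly recombines neuron outputs into neuron inputs, then each neuron applies its own transform.

```python
import numpy as np

def dmm_step(W, outputs, activations):
    """One cycle of a toy dataflow-matrix-machine-style network.

    W           -- weight matrix; W[i, j] weights neuron j's output
                   in neuron i's input
    outputs     -- current vector of neuron outputs (one scalar stream
                   per neuron here, for simplicity; DMMs allow richer
                   linear streams)
    activations -- one transform per neuron (DMMs allow many neuron types)
    """
    inputs = W @ outputs                  # linear recombination of streams
    return np.array([f(x) for f, x in zip(activations, inputs)])

# Two hypothetical neurons with feedback: an identity neuron and a tanh neuron.
W = np.array([[1.0, 0.5],
              [0.3, 0.0]])
acts = [lambda x: x, np.tanh]
y = np.array([0.0, 1.0])
for _ in range(3):
    y = dmm_step(W, y, acts)
```

A conventional RNN is the special case where every neuron applies the same fixed nonlinearity to a single scalar stream; the generalization is in letting the streams and the per-neuron transforms vary in kind.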


Jun 29, 2018

by Michael Bukatin; Steve Matthews; Andrey Radul


Dataflow matrix machines are self-referential generalized recurrent neural nets. The self-referential mechanism is provided via a stream of matrices defining the connectivity and weights of the network in question. A natural question is: what should play the role of untyped lambda-calculus for this programming architecture? The proposed answer is a discipline of programming with only one kind of stream, namely streams of appropriately shaped matrices. This yields Pure Dataflow Matrix...

Topics: Programming Languages, Computing Research Repository

Source: http://arxiv.org/abs/1610.00831
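The self-referential mechanism described above can be sketched in a few lines (an illustrative assumption on our part, not the paper's code; the Hebbian-style update rule is a hypothetical choice made only so the example does something): the network matrix is itself a stream, re-emitted each cycle with a correction, so the network rewires itself while running.

```python
import numpy as np

def self_referential_step(W, outputs, dW):
    """One cycle where the matrix W is itself one of the streams."""
    inputs = W @ outputs
    new_outputs = np.tanh(inputs)        # ordinary neurons
    W_next = W + dW(W, new_outputs)      # next element of the matrix stream
    return W_next, new_outputs

def hebbian_dW(W, y, rate=0.01):
    """Hypothetical update neuron: strengthen links between co-active
    neurons (a Hebbian-style rule, chosen here only for illustration)."""
    return rate * np.outer(y, y)

W = np.eye(3) * 0.5
y = np.array([1.0, 0.0, -1.0])
for _ in range(5):
    W, y = self_referential_step(W, y, hebbian_dW)
```

The key point is that connectivity and weights are data on a stream like any other, so a neuron can read and rewrite the program that is running.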


Jun 29, 2018

by Michael Bukatin; Steve Matthews; Andrey Radul


Dataflow matrix machines are a powerful generalization of recurrent neural networks. They work with multiple types of linear streams and multiple types of neurons, including higher-order neurons which dynamically update the matrix describing the weights and topology of the network in question while the network is running. It seems that the power of dataflow matrix machines is sufficient for them to be a convenient general-purpose programming platform. This paper explores a number of useful...

Topics: Neural and Evolutionary Computing, Computing Research Repository, Programming Languages

Source: http://arxiv.org/abs/1605.05296


Jun 29, 2018

by Michael Bukatin; Steve Matthews; Andrey Radul


Dataflow matrix machines arise naturally in the context of synchronous dataflow programming with linear streams. They can be viewed as a rather powerful generalization of recurrent neural networks. Similarly to recurrent neural networks, large classes of dataflow matrix machines are described by matrices of numbers, and therefore dataflow matrix machines can be synthesized by computing their matrices. At the same time, the evidence is fairly strong that dataflow matrix machines have sufficient...

Topics: Programming Languages, Computing Research Repository, Neural and Evolutionary Computing

Source: http://arxiv.org/abs/1606.09470
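The synthesis claim above (machines described by matrices can be synthesized by computing their matrices) can be made concrete with a toy sketch of our own, not from the paper: a crude random search over matrices, scoring each candidate machine by how close its dynamics come to a target output.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_dmm(W, y0, steps=10):
    """Run a toy matrix-defined network for a fixed number of cycles."""
    y = y0
    for _ in range(steps):
        y = np.tanh(W @ y)
    return y

def synthesize(target, n=4, trials=200):
    """Toy synthesis-as-search: sample random matrices and keep the one
    whose run ends closest to `target`. Real synthesis would use a far
    better search; this only shows that the search space is matrices."""
    y0 = np.ones(n)
    best_W, best_err = None, np.inf
    for _ in range(trials):
        W = rng.normal(scale=0.5, size=(n, n))
        err = np.linalg.norm(run_dmm(W, y0) - target)
        if err < best_err:
            best_W, best_err = W, err
    return best_W, best_err

W, err = synthesize(np.array([0.5, -0.5, 0.0, 0.25]))
```

Because the whole program is a matrix of numbers, any numerical search or learning procedure over matrices doubles as a program-synthesis procedure.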