ksiazka (texts)

Rating: 4 stars (1 review)

Topics: computation, theory of computation, computation theory, computation history

computation (texts)

Topics: distributed computation, parallel computation, matlab, computation

Vision 3D, by Japanise (Apr 19, 2017; software)

Topic: Computation

by Brian G. Mc Enery (Feb 2, 2011; audio)

Aspects of the flow of computational knowledge.

Topic: computation

Martin Davis, Engines of Logic (W. W. Norton & Company, 2001; texts)

Topic: computation

A theoretical treatment of what can be computed and how fast it can be done. Applications to compilers, string searching, and control circuit design will be discussed. The hierarchy of finite state machines, pushdown machines, context free grammars and Turing machines will be analyzed, along with their variations. The notions of decidability, complexity theory and a complete discussion of NP-Complete problems round out the course. Text: Introduction to the Theory of Computation, Michael Sipser....

Rating: 5 stars (8 reviews)

Topic: computation
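The course outline above starts at the bottom of the machine hierarchy with finite state machines. As a small illustration of that first rung, here is a hedged sketch of a deterministic finite automaton simulator; the example machine (accepting binary strings with an even number of 0s) is invented for illustration and is not from the course text.

```python
# Minimal DFA simulator: a transition table, a start state, and accepting states.
# Hypothetical example machine: accepts binary strings with an even number of 0s.

def run_dfa(transitions, start, accepting, string):
    """Return True if the DFA accepts the string."""
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]
    return state in accepting

# Two states: 'even' (even count of 0s so far) and 'odd'.
transitions = {
    ('even', '0'): 'odd',  ('even', '1'): 'even',
    ('odd', '0'): 'even',  ('odd', '1'): 'odd',
}

print(run_dfa(transitions, 'even', {'even'}, '1001'))  # two 0s -> True
print(run_dfa(transitions, 'even', {'even'}, '10'))    # one 0  -> False
```

Pushdown machines and Turing machines extend the same state-transition idea with a stack and an unbounded tape, respectively.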

by Joseph Lipka (Oct 26, 2006; texts)

Aonró, by Brian Mc Enery PhD (Oct 14, 2008; audio)

Topics: gaelic computation, vedic mathematics, modern computation

by Albert S. Jackson (Oct 18, 2020; texts)

Describes how to run and design analog computers. No copyright renewal was found in the Stanford or LOC online databases.

Topics: analog computers, analogue computation, analog computation

by Anthony Lee; Nick Whiteley (Jun 28, 2018; texts)

This paper concerns numerical assessment of Monte Carlo error in particle filters. We show that by keeping track of certain key features of the genealogical structure arising from resampling operations, it is possible to estimate variances of a number of standard Monte Carlo approximations which particle filters deliver. All our estimators can be computed from a single run of a particle filter with no further simulation. We establish that as the number of particles grows, our estimators are...

Topics: Statistics, Computation

Source: http://arxiv.org/abs/1509.00394

by Matthieu Marbac; Christophe Biernacki; Vincent Vandewalle (Jun 30, 2018; texts)

An extension of the latent class model is presented for clustering categorical data by relaxing the classical "class conditional independence assumption" of variables. This model consists of grouping the variables into inter-independent and intra-dependent blocks, in order to account for the main intra-class correlations. The dependency between variables grouped inside the same block of a class is taken into account by mixing two extreme distributions, which are respectively the...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1401.5684

by Jonatan Kallus; Jose Sanchez; Alexandra Jauhiainen; Sven Nelander; Rebecka Jörnsten (Jun 30, 2018; texts)

Network modeling has become increasingly popular for analyzing genomic data, to aid in the interpretation and discovery of possible mechanistic components and therapeutic targets. However, genomic-scale networks are high-dimensional models and are usually estimated from a relatively small number of samples. Therefore, their usefulness is hampered by estimation instability. In addition, the complexity of the models is controlled by one or more penalization (tuning) parameters where small changes...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1702.07685

by Ajay Jasra; Kengo Kamatani; Kody Law; Yan Zhou (Jun 30, 2018; texts)

In this article we consider computing expectations w.r.t.~probability laws associated to a certain class of stochastic systems. In order to achieve such a task, one must not only resort to numerical approximation of the expectation, but also to a biased discretization of the associated probability. We are concerned with the situation for which the discretization is required in multiple dimensions, for instance in space and time. In such contexts, it is known that the multi-index Monte Carlo...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1704.00117

by Wei Pan; Xinming An; Qing Yang (Jun 30, 2018; texts)

Any empirical data can be approximated by one of the Pearson distributions using the first four moments of the data (Elderton and Johnson, 1969; Pearson, 1895; Solomon and Stephens, 1978). Thus, Pearson distributions make statistical analysis possible for data with unknown distributions. There are both extant old-fashioned in-print tables (Pearson and Hartley, 1972) and contemporary computer programs (Amos and Daniel, 1971; Bouver and Bargmann, 1974; Bowman and Shenton, 1979; Davis and Stephens,...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1704.02706

by Matthew M. Graham; Amos J. Storkey (Jun 30, 2018; texts)

Hamiltonian Monte Carlo (HMC) is a powerful Markov chain Monte Carlo (MCMC) method for performing approximate inference in complex probabilistic models of continuous variables. In common with many MCMC methods, however, the standard HMC approach performs poorly in distributions with multiple isolated modes. We present a method for augmenting the Hamiltonian system with an extra continuous temperature control variable which allows the dynamic to bridge between sampling a complex target...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1704.03338
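The abstract above builds on standard HMC. As background, here is a hedged sketch of plain HMC with leapfrog integration for a one-dimensional standard normal target; it does not include the paper's continuous temperature control variable, and all tuning values (step size, trajectory length) are illustrative.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L):
    """Leapfrog integration of Hamiltonian dynamics for L steps of size eps."""
    p = p - 0.5 * eps * grad_U(q)
    for _ in range(L - 1):
        q = q + eps * p
        p = p - eps * grad_U(q)
    q = q + eps * p
    p = p - 0.5 * eps * grad_U(q)
    return q, -p   # negate momentum for reversibility

def hmc(U, grad_U, q0, n_samples, eps=0.2, L=10, seed=0):
    """Basic HMC sampler for a 1-D target with potential U(q) = -log pi(q)."""
    rng = np.random.default_rng(seed)
    q, samples = q0, []
    for _ in range(n_samples):
        p = rng.standard_normal()                       # resample momentum
        q_new, p_new = leapfrog(q, p, grad_U, eps, L)
        dH = (U(q_new) + 0.5 * p_new**2) - (U(q) + 0.5 * p**2)
        if rng.random() < np.exp(-dH):                  # Metropolis correction
            q = q_new
        samples.append(q)
    return np.array(samples)

# Target: standard normal, so U(q) = q^2 / 2 and grad U = q.
samples = hmc(lambda q: 0.5 * q**2, lambda q: q, 0.0, 5000)
print(samples.mean(), samples.var())
```

The paper's tempering idea augments this Hamiltonian with an extra continuous variable so that the dynamic can cross between isolated modes.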

by Asad Hasan; Wang Zhiyu; Alireza S. Mahani (Jun 30, 2018; texts)

We present R package mnlogit for training multinomial logistic regression models, particularly those involving a large number of classes and features. Compared to existing software, mnlogit offers speedups of 10x-50x for modestly sized problems and more than 100x for larger problems. Running mnlogit in parallel mode on a multicore machine gives an additional 2x-4x speedup on up to 8 processor cores. Computational efficiency is achieved by drastically speeding up calculation of the...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1404.3177

by Jesse Windle; Nicholas G. Polson; James G. Scott (Jun 30, 2018; texts)

Efficiently sampling from the P\'olya-Gamma distribution, ${PG}(b,z)$, is an essential element of P\'olya-Gamma data augmentation. Polson et al. (2013) show how to efficiently sample from the ${PG}(1,z)$ distribution. We build two new samplers that offer improved performance when sampling from the ${PG}(b,z)$ distribution and $b$ is not unity.

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1405.0506
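For orientation, the ${PG}(b,z)$ distribution has an infinite-sum-of-Gammas representation (Polson et al., 2013) that yields a simple approximate sampler by truncating the sum. This is a hedged sketch of that naive baseline, not the improved samplers the abstract describes; the truncation level K and sample size are illustrative choices.

```python
import numpy as np

def sample_pg_truncated(b, z, n, K=500, seed=0):
    """Approximate PG(b, z) draws via the truncated series
    PG(b, z) = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + z^2 / (4 pi^2)),
    with g_k ~ Gamma(b, 1) independent, truncated at K terms."""
    rng = np.random.default_rng(seed)
    k = np.arange(1, K + 1)
    denom = (k - 0.5) ** 2 + (z / (2 * np.pi)) ** 2
    g = rng.gamma(shape=b, scale=1.0, size=(n, K))   # g_k ~ Gamma(b, 1)
    return (g / denom).sum(axis=1) / (2 * np.pi ** 2)

draws = sample_pg_truncated(b=1.0, z=2.0, n=5000)
# Theoretical mean: (b / (2 z)) * tanh(z / 2) = 0.25 * tanh(1) ~ 0.190
print(draws.mean())
```

Truncating the series biases draws slightly downward; exact samplers (as in Devroye-style rejection for ${PG}(1,z)$) avoid this, which is part of the motivation of the paper.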

by Paul Kabaila (Jun 30, 2018; texts)

Cranley and Patterson put forward the following randomization as the basis for the estimation of the error of a lattice rule for an integral of a one-periodic function over the unit cube in s dimensions. The lattice rule is randomized using independent random shifts in each coordinate direction that are uniformly distributed in the interval [0,1]. This randomized lattice rule results in an unbiased estimator of the multiple integral. However, in practice, random variables that are independent...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1406.0225
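The Cranley-Patterson randomization described above can be sketched directly: shift a rank-1 lattice rule by an independent uniform shift in each coordinate, then average over several shifts for an unbiased estimate with an empirical error estimate. The generating vector (1, 233) with n = 610 points (a Fibonacci lattice) and the test integrand are illustrative choices, not from the paper.

```python
import numpy as np

def shifted_lattice_estimate(f, n, z, shift):
    """One randomly shifted rank-1 lattice rule estimate of int f over [0,1]^s."""
    i = np.arange(n)[:, None]
    points = (i * np.asarray(z) / n + shift) % 1.0   # shifted lattice points
    return f(points).mean()

def cranley_patterson(f, n, z, n_shifts=10, seed=0):
    """Average over independent uniform shifts; the sample standard error
    of the shift estimates gives an error estimate for the rule."""
    rng = np.random.default_rng(seed)
    ests = np.array([shifted_lattice_estimate(f, n, z, rng.random(len(z)))
                     for _ in range(n_shifts)])
    return ests.mean(), ests.std(ddof=1) / np.sqrt(n_shifts)

# Example: f(x, y) = x * y, whose exact integral over [0,1]^2 is 1/4.
est, se = cranley_patterson(lambda p: p[:, 0] * p[:, 1], n=610, z=(1, 233))
print(est, se)
```

The paper's point is about what happens when the shifts are not fully independent across coordinates; this sketch implements the classical fully independent case.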

by Henry Scharf; Ryan Elmore; Kenny Gruchalla (Jun 30, 2018; texts)

The volume of data and the velocity with which it is being generated by computational experiments on high performance computing (HPC) systems is quickly outpacing our ability to effectively store this information in its full fidelity. Therefore, it is critically important to identify and study compression methodologies that retain as much information as possible, particularly in the most salient regions of the simulation space. In this paper, we cast this in terms of a general...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1407.2954

by Rahim Alhamzawi (Jun 29, 2018; texts)

Since the pioneering work by Koenker and Bassett (1978), quantile regression models and their applications have become increasingly popular and important for research in many areas. In this paper, a random effects ordinal quantile regression model is proposed for analysis of longitudinal data with ordinal outcome of interest. An efficient Gibbs sampling algorithm was derived for fitting the model to the data based on a location scale mixture representation of the skewed double exponential...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1603.00297

by K. Konakli; B. Sudret (Jun 29, 2018; texts)

Engineering and applied sciences use models of increasing complexity to simulate the behaviour of manufactured and physical systems. Propagation of uncertainties from the input to a response quantity of interest through such models may become intractable in cases when a single simulation is time demanding. Particularly challenging is the reliability analysis of systems represented by computationally costly models, because of the large number of model evaluations that are typically required to...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1606.08577

by Shifeng Xiong (Jun 29, 2018; texts)

Optimization problems with both control variables and environmental variables arise in many fields. This paper introduces a framework of personalized optimization to handle such problems. Unlike traditional robust optimization, personalized optimization is devoted to finding a series of optimal control variables for different values of environmental variables. Therefore, the solution from personalized optimization consists of optimal surfaces defined on the domain of the environmental variables....

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1607.01664

by Diaa Al Mohamad; Michel Broniatowski (Jun 29, 2018; texts)

Estimators derived from an EM algorithm are not robust since they are based on the maximization of the likelihood function. We propose a proximal-point algorithm based on the EM algorithm which aims to minimize a divergence criterion. Resulting estimators are generally robust against outliers and misspecification. An EM-type proximal-point algorithm is also introduced in order to produce robust estimators for mixture models. Convergence properties of the two algorithms are treated. We relax an...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1607.02472

by Chris Sherlock; Andrew Golightly; Colin Gillespie (Jun 30, 2018; texts)

We consider the problem of efficiently performing simulation and inference for stochastic kinetic models. Whilst it is possible to work directly with the resulting Markov jump process, computational cost can be prohibitive for networks of realistic size and complexity. In this paper, we consider an inference scheme based on a novel hybrid simulator that classifies reactions as either "fast" or "slow" with fast reactions evolving as a continuous Markov process whilst the...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1402.6602

by Sangin Lee; Patrick Breheny (Jun 30, 2018; texts)

We consider approaches for improving the efficiency of algorithms for fitting nonconvex penalized regression models such as SCAD and MCP in high dimensions. In particular, we develop rules for discarding variables during cyclic coordinate descent. This dimension reduction leads to a substantial improvement in the speed of these algorithms for high-dimensional problems. The rules we propose here eliminate a substantial fraction of the variables from the coordinate descent algorithm. Violations...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1403.2963
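The discarding rules in the abstract above are built on top of cyclic coordinate descent. The paper targets the nonconvex SCAD and MCP penalties; as a hedged background sketch, here is the underlying coordinate descent loop for the plain lasso with soft-thresholding updates (the simulated data and penalty value are invented for illustration).

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def cd_lasso(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1,
    assuming columns of X are standardized so that (1/n) x_j' x_j = 1."""
    n, p = X.shape
    b = np.zeros(p)
    r = y.copy()                            # residual y - X b
    for _ in range(n_iter):
        for j in range(p):
            rho = b[j] + X[:, j] @ r / n    # partial-residual correlation
            b_new = soft_threshold(rho, lam)
            r += X[:, j] * (b[j] - b_new)   # keep residual in sync
            b[j] = b_new
    return b

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.standard_normal((n, p))
X = (X - X.mean(0)) / X.std(0)              # standardize columns
beta_true = np.array([3, -2] + [0] * 8, dtype=float)
y = X @ beta_true + 0.1 * rng.standard_normal(n)
b = cd_lasso(X, y, lam=0.1)
print(np.round(b, 2))
```

Screening rules like the paper's work by skipping coordinates whose updates are guaranteed (or very likely) to remain zero, so each sweep touches far fewer than p variables.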

by J. N. Corcoran; D. Jennings (Jun 30, 2018; texts)

"Particle methods" are sequential Monte Carlo algorithms, typically involving importance sampling, that are used to estimate and sample from joint and marginal densities of a, presumably increasing, collection of random variables. In particular, a particle filter aims to estimate the current state $X_{n}$ of a stochastic system that is not directly observable by estimating a posterior distribution $\pi(x_{n}|y_{1},y_{2}, \ldots, y_{n})$ where the $\{Y_{n}\}$ are...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1407.4414
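The particle filter described above can be sketched minimally as the classic bootstrap filter: propagate particles through the state dynamics, reweight by the observation likelihood, and resample. The linear-Gaussian toy model and all parameter values below are invented for illustration.

```python
import numpy as np

def bootstrap_pf(y, n_particles=5000, phi=0.9, q=0.5, r=0.2, seed=0):
    """Bootstrap particle filter for x_t = phi x_{t-1} + N(0, q^2),
    y_t = x_t + N(0, r^2). Returns estimated filtering means E[x_t | y_1..y_t]."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)          # prior particles x_0 ~ N(0, 1)
    means = []
    for yt in y:
        x = phi * x + q * rng.standard_normal(n_particles)   # propagate
        logw = -0.5 * ((yt - x) / r) ** 2                    # Gaussian likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(w @ x)                                  # weighted estimate
        x = rng.choice(x, size=n_particles, p=w)             # multinomial resampling
    return np.array(means)

# Simulate a short series from the same model, then filter it.
rng = np.random.default_rng(42)
x_true = np.zeros(20)
prev = 0.0
for t in range(20):
    prev = 0.9 * prev + 0.5 * rng.standard_normal()
    x_true[t] = prev
y = x_true + 0.2 * rng.standard_normal(20)
means = bootstrap_pf(y)
print(np.round(means, 2))
```

With precise observations the filtering means track the data closely; the repeated resampling step is exactly what induces the genealogical degeneracy that motivates much of the particle-filter literature listed here.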

by Anestis Touloumis (Jun 30, 2018; texts)

The R package multgee implements the local odds ratios generalized estimating equations (GEE) approach proposed by Touloumis et al. (2013), a GEE approach for correlated multinomial responses that circumvents theoretical and practical limitations of the GEE method. A main strength of multgee is that it provides GEE routines for both ordinal (ordLORgee) and nominal (nomLORgee) responses, while relevant software in R and SAS is restricted to ordinal responses under a marginal cumulative link...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1410.5232

by G. S. Rodrigues; David J. Nott; S. A. Sisson (Jun 30, 2018; texts)

We propose a novel Bayesian nonparametric method for hierarchical modelling on a set of related density functions, where grouped data in the form of samples from each density function are available. Borrowing strength across the groups is a major challenge in this context. To address this problem, we introduce a hierarchically structured prior, defined over a set of univariate density functions, using convenient transformations of Gaussian processes. Inference is performed through approximate...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1410.8276

by Nathaniel E. Helwig; Ping Ma (Jun 29, 2018; texts)

In the current era of big data, researchers routinely collect and analyze data of super-large sample sizes. Data-oriented statistical methods have been developed to extract information from super-large data. Smoothing spline ANOVA (SSANOVA) is a promising approach for extracting information from noisy data; however, the heavy computational cost of SSANOVA hinders its wide application. In this paper, we propose a new algorithm for fitting SSANOVA models to super-large sample data. In this...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1602.05208

by Roberto Fontana; Fabio Rapallo (Jun 29, 2018; texts)

In this work we present the results of several simulations on main-effect factorial designs. The goal of such simulations is to investigate the connections between the $D$-optimality of a design and its geometrical structure. By means of a combinatorial object, namely the circuit basis of the design matrix, we show that it is possible to define a simple index that exhibits strong connections with the $D$-optimality.

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1604.04582

by Aliaksandr Hubin; Geir Storvik (Jun 29, 2018; texts)

Generalized linear mixed models (GLMM) are used for inference and prediction in a wide range of different applications, providing a powerful scientific tool for researchers and analysts from many fields. In most of these fields, more and more sources of data are becoming available, introducing a variety of hypothetical explanatory variables for these models to be considered. Selection of an optimal combination of these variables is thus becoming crucial. In a Bayesian setting, the...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1604.06398

by Arthur White; Thomas Brendan Murphy (Jun 29, 2018; texts)

For several years, model-based clustering methods have successfully tackled many of the challenges faced by data analysts. However, as the scope of data analysis has evolved, some problems may be beyond the standard mixture model framework. One such problem is when observations in a dataset come from overlapping clusters, whereby different clusters will possess similar parameters for multiple variables. In this setting, mixed membership models, a soft clustering approach whereby...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1608.03302

by Virgilio Gómez-Rubio; Håvard Rue (Jun 30, 2018; texts)

The Integrated Nested Laplace Approximation (INLA) has established itself as a widely used method for approximate inference on Bayesian hierarchical models which can be represented as a latent Gaussian model (LGM). INLA is based on producing an accurate approximation to the posterior marginal distributions of the parameters in the model and some other quantities of interest by using repeated approximations to intermediate distributions and integrals that appear in the computation of the...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1701.07844

by Jason Xu; Vladimir N. Minin (Jun 27, 2018; texts)

Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring a large integration step to...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1503.02644

by Gavin A. Whitaker; Andrew Golightly; Richard J. Boys; Chris Sherlock (Jun 28, 2018; texts)

We consider the task of generating discrete-time realisations of a nonlinear multivariate diffusion process satisfying an It\^o stochastic differential equation conditional on an observation taken at a fixed future time-point. Such realisations are typically termed diffusion bridges. Since, in general, no closed form expression exists for the transition densities of the process of interest, a widely adopted solution works with the Euler-Maruyama approximation, by replacing the intractable...

Topics: Statistics, Computation

Source: http://arxiv.org/abs/1509.09120

by François Bachoc; Jean-Marc Martinez; Karim Ammar (Jun 28, 2018; texts)

It is now common practice in nuclear engineering to base extensive studies on numerical computer models. These studies require running computer codes in potentially thousands of numerical configurations, without expert individual control over the computational and physical aspects of each simulation. In this paper, we compare different statistical metamodeling techniques and show how metamodels can help to improve the global behaviour of codes in these extensive studies. We consider the...

Topics: Statistics, Computation

Source: http://arxiv.org/abs/1511.03046

by Ajay Jasra; Kengo Kamatani; Prince Prepah Osei; Yan Zhou (Jun 29, 2018; texts)

In this article we introduce two new estimates of the normalizing constant (or marginal likelihood) for partially observed diffusion (POD) processes, with discrete observations. One estimate is biased but non-negative and the other is unbiased but not almost surely non-negative. Our method uses the multilevel particle filter of Jasra et al (2015). We show that, under assumptions, for Euler discretized PODs and a given $\varepsilon>0$, in order to obtain a mean square error (MSE) of...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1605.04963

by Víctor Elvira; Luca Martino; David Luengo; Mónica F. Bugallo (Jun 29, 2018; texts)

Multiple Importance Sampling (MIS) methods approximate moments of complicated distributions by drawing samples from a set of proposal distributions. Several ways to compute the importance weights assigned to each sample have been recently proposed, with the so-called deterministic mixture (DM) weights providing the best performance in terms of variance, at the expense of an increase in the computational cost. A recent work has shown that it is possible to achieve a trade-off between variance...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1609.04740

by Bruna Gregory Palm; Fábio M. Bayer (Jun 30, 2018; texts)

We consider the issue of performing accurate small sample inference in the beta autoregressive moving average model, which is useful for modeling and forecasting continuous variables that assume values in the interval $(0,1)$. The inferences based on conditional maximum likelihood estimation have good asymptotic properties, but their performances in small samples may be poor. We therefore propose bootstrap bias corrections of the point estimators and different bootstrap strategies for confidence...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1702.04391
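The bootstrap bias correction mentioned above follows a generic recipe: estimate the estimator's bias from resamples, then subtract it, giving the corrected value 2*theta_hat minus the bootstrap mean. The paper applies this to beta-ARMA conditional MLEs; this hedged sketch illustrates the same recipe on the downward-biased variance MLE (divisor n), with invented data.

```python
import numpy as np

def bootstrap_bias_corrected(data, estimator, n_boot=2000, seed=0):
    """Bootstrap bias correction: theta_bc = 2 * theta_hat - mean(theta_hat*)."""
    rng = np.random.default_rng(seed)
    theta_hat = estimator(data)
    boots = np.array([estimator(rng.choice(data, size=len(data), replace=True))
                      for _ in range(n_boot)])
    return 2 * theta_hat - boots.mean()

# Example: np.var uses divisor n, so it underestimates sigma^2 by sigma^2 / n.
rng = np.random.default_rng(3)
data = rng.normal(0.0, 2.0, size=30)
plain = np.var(data)                       # biased MLE of the variance
corrected = bootstrap_bias_corrected(data, np.var)
print(plain, corrected)
```

Because the bootstrap bias of the variance MLE is negative, the corrected estimate is pushed upward, roughly matching the usual n/(n-1) correction.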

by D. Andrew Brown; Christopher S. McMahan (Jun 30, 2018; texts)

Gaussian Markov random fields (GMRFs) are popular for modeling temporal or spatial dependence in large areal datasets due to their ease of interpretation and computational convenience afforded by conditional independence and their sparse precision matrices needed for random variable generation. Using such models inside a Markov chain Monte Carlo algorithm requires repeatedly simulating random fields. This is a nontrivial issue, especially when the full conditional precision matrix depends on...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1702.05518
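The random-field simulation step the abstract refers to is typically done by factoring the precision matrix: with Q = L L^T and z standard normal, solving L^T x = z gives x ~ N(0, Q^{-1}). This hedged sketch uses a small dense matrix for illustration; practical GMRF code exploits sparse Cholesky factorizations, and the example precision matrix is invented.

```python
import numpy as np

def sample_gmrf(Q, n_samples, seed=0):
    """Draw x ~ N(0, Q^{-1}) from a precision matrix Q via x = L^{-T} z,
    where Q = L L^T (Cholesky) and z ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(Q)                     # Q = L L^T, L lower-triangular
    z = rng.standard_normal((Q.shape[0], n_samples))
    return np.linalg.solve(L.T, z)                # solves L^T x = z

# Tridiagonal precision resembling a first-order random walk, made proper
# by inflating the diagonal slightly.
n = 4
Q = 2.2 * np.eye(n) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
x = sample_gmrf(Q, 100000)
emp_cov = np.cov(x)
print(np.round(emp_cov - np.linalg.inv(Q), 2))   # should be near zero
```

The cost of this step inside an MCMC loop, when Q changes at every iteration, is exactly the bottleneck the paper addresses.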

by Zachary D. Weller (Jun 28, 2018; texts)

An important step of modeling spatially-referenced data is appropriately specifying the second order properties of the random field. A scientist developing a model for spatial data has a number of options regarding the nature of the dependence between observations. One of these options is deciding whether or not the dependence between observations depends on direction, or, in other words, whether or not the spatial covariance function is isotropic. Isotropy implies that spatial dependence is a...

Topics: Statistics, Computation

Source: http://arxiv.org/abs/1509.07185

by Alexander Gribov (Jun 29, 2018; texts)

One of the most efficient ways to produce unconditional simulations is with the kernel convolution using fast Fourier transform (FFT) [1]. However, when data is located on a surface, this approach is not efficient because data needs to be processed in a three-dimensional enclosing box. This paper describes a novel approach based on integer transformation to reduce the volume of the enclosing box.

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1601.04065

by Colin Fox; Albert Parker (Jun 27, 2018; texts)

Standard Gibbs sampling applied to a multivariate normal distribution with a specified precision matrix is equivalent in fundamental ways to the Gauss-Seidel iterative solution of linear equations in the precision matrix. Specifically, the iteration operators, the conditions under which convergence occurs, and geometric convergence factors (and rates) are identical. These results hold for arbitrary matrix splittings from classical iterative methods in numerical linear algebra giving easy access...

Topics: Statistics, Computation

Source: http://arxiv.org/abs/1505.03512
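The equivalence described above is easy to see in code: the deterministic part of each component-wise Gibbs update for N(Q^{-1} b, Q^{-1}) is exactly a Gauss-Seidel step on Q x = b, with added scaled noise. This is a hedged sketch on an invented 2x2 precision matrix.

```python
import numpy as np

def gibbs_sweep(x, Q, b, rng):
    """One sweep of component-wise Gibbs sampling for N(Q^{-1} b, Q^{-1}).
    Without the noise term, each update is a Gauss-Seidel step on Q x = b."""
    for i in range(len(x)):
        mu_i = (b[i] - Q[i] @ x + Q[i, i] * x[i]) / Q[i, i]   # conditional mean
        x[i] = mu_i + rng.standard_normal() / np.sqrt(Q[i, i])
    return x

rng = np.random.default_rng(0)
Q = np.array([[2.0, 1.0], [1.0, 2.0]])       # precision matrix
b = np.zeros(2)                               # target mean Q^{-1} b = 0
x = np.zeros(2)
samples = []
for _ in range(20000):
    x = gibbs_sweep(x, Q, b, rng)
    samples.append(x.copy())
emp = np.cov(np.array(samples[1000:]).T)      # discard burn-in
print(np.round(emp, 2))
print(np.round(np.linalg.inv(Q), 2))          # target covariance
```

The shared iteration operator is why Gibbs convergence rates for Gaussians can be read off from classical matrix-splitting theory, as the paper develops.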

by Dan Crisan; Joaquin Miguez; Gonzalo Rios (Jun 30, 2018; texts)

We investigate the use of possibly the simplest scheme for the parallelisation of the standard particle filter, which consists of splitting the computational budget into $M$ fully independent particle filters with $N$ particles each, and then obtaining the desired estimators by averaging over the $M$ independent outcomes of the filters. This approach minimises the parallelisation overhead yet displays highly desirable theoretical properties. Under very mild assumptions, we analyse the mean...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1407.8071

by Brian G. Mc Enery (Nov 10, 2010; audio)

Introducing a method for teaching an integrated approach to computation and language.

Topics: computation, language

by Brian G. Mc Enery (Jan 6, 2011; audio)

Managing the global economy using invincible computation.

Topics: invincible, computation

by Óli Páll Geirsson; Birgir Hrafnkelsson; Daniel Simpson; Helgi Sigurðarson (Jun 28, 2018; texts)

A novel computationally efficient Markov chain Monte Carlo (MCMC) scheme for latent Gaussian models (LGMs) is proposed in this paper. The sampling scheme is a two-block Gibbs sampling scheme designed to exploit the model structure of LGMs. We refer to the proposed sampling scheme as the MCMC split sampler. The principal idea behind the MCMC split sampler is to split the latent Gaussian parameters into two vectors. The former vector consists of latent parameters which appear in the data density...

Topics: Statistics, Computation

Source: http://arxiv.org/abs/1506.06285

by Lakshmi Roychowdhury (Jun 28, 2018; texts)

Quantization of a probability distribution refers to the idea of estimating a given probability by a discrete probability supported by a finite set. Let $P$ be a Borel probability measure on $\mathbb R$ such that $P=\frac 1 4 P\circ S_1^{-1} +\frac 3 4 P\circ S_2^{-1}$, where $S_1$ and $S_2$ are two similarity mappings on $\mathbb R$ such that $S_1(x)=\frac 1 4 x $ and $S_2(x)=\frac 1 2 x +\frac 12$ for all $x\in \mathbb R$. Such a probability measure $P$ has support the Cantor set generated by...

Topics: Statistics, Computation

Source: http://arxiv.org/abs/1512.00379

by Yulai Cong; Bo Chen; Mingyuan Zhou (Jun 29, 2018; texts)

We introduce a fast and easy-to-implement simulation algorithm for a multivariate normal distribution truncated on the intersection of a set of hyperplanes, and further generalize it to efficiently simulate random variables from a multivariate normal distribution whose covariance (precision) matrix can be decomposed as a positive-definite matrix minus (plus) a low-rank symmetric matrix. Example results illustrate the correctness and efficiency of the proposed simulation algorithms.

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1607.04751
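For the "positive-definite matrix plus a low-rank symmetric matrix" case mentioned above, simulation never needs the full covariance: if Cov(x) = A + U U^T, then x = chol(A) z1 + U z2 with independent standard normal z1, z2 has exactly that covariance. This hedged sketch covers only that plus-low-rank case (the minus case, and the hyperplane truncation, require the paper's additional machinery); all matrices are invented.

```python
import numpy as np

def sample_plus_lowrank(A, U, n_samples, seed=0):
    """Draw x ~ N(0, A + U U^T) without forming the full covariance:
    x = chol(A) z1 + U z2, with independent standard normal z1, z2."""
    rng = np.random.default_rng(seed)
    n, k = U.shape
    La = np.linalg.cholesky(A)
    z1 = rng.standard_normal((n, n_samples))
    z2 = rng.standard_normal((k, n_samples))
    return La @ z1 + U @ z2

n, k = 5, 2
rng = np.random.default_rng(7)
A = np.diag(rng.uniform(1.0, 2.0, n))        # cheap-to-factor positive-definite part
U = rng.standard_normal((n, k))              # low-rank factor
x = sample_plus_lowrank(A, U, 200000)
target = A + U @ U.T
print(np.round(np.cov(x) - target, 2))       # should be near zero
```

When A is diagonal or otherwise trivially factorable, the cost is dominated by the rank-k term, which is the efficiency gain such decompositions buy.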

by Tim Benham; Qibin Duan; Dirk P. Kroese; Benoit Liquet (Jun 27, 2018; texts)

The cross-entropy (CE) method is a simple and versatile technique for optimization, based on Kullback-Leibler (or cross-entropy) minimization. The method can be applied to a wide range of optimization tasks, including continuous, discrete, mixed and constrained optimization problems. The new package CEoptim provides the R implementation of the CE method for optimization. We describe the general CE methodology for optimization as well as some useful modifications. The usage and efficacy of...

Topics: Computation, Statistics

Source: http://arxiv.org/abs/1503.01842