Nowcasting is a growing subset of numerical weather prediction that aims to produce a highly accurate analysis of current conditions along with a short-term forecast. One of the greatest challenges for a nowcast system operating in data-sparse regions is accurately forecasting clouds. Clouds significantly impact a variety of operations, particularly intelligence, surveillance and reconnaissance. A prototype nowcast system is developed and tested on a case of summertime stratus clouds...
Topics: DTIC Archive, Zoufaly, Sean L, NAVAL POSTGRADUATE SCHOOL MONTEREY CA MONTEREY United States, WEATHER...
This report summarizes the research done under FA8750-16-2-0173. This research advanced understanding of bandit algorithms and exploration in Markov Decision Processes (MDPs). New algorithms and theory were proposed for bandits with periodic payoff multipliers and arms with costs. Exploration and transfer learning algorithms were evaluated for MDPs.
Topics: DTIC Archive, Parr,Ronald, Duke University Durham United States, Machine learning, ALGORITHMS,...
The asymptotic behavior of parameter estimates and the identification and modeling of dynamical systems are investigated. Measures of the relevant information in a given sequence of observations are defined and shown to possess useful properties, such as the metric property on the parameter set. The convergence of maximum likelihood and related Bayesian estimates for general observation sequences is investigated. The situation where the true parameter is not a member of a given parameter set is...
Topics: DTIC Archive, Baram, Yoram, MASSACHUSETTS INST OF TECH CAMBRIDGE ELECTRONIC SYSTEMS LAB,...
This final report describes the ASSERT project 'Detection and Classification of Synthetic Aperture Radar Targets' associated with the URI Automatic Target Recognition (ATR) project sponsored by DARPA. The main goal of this ASSERT project, together with the URI-ATR project, is to develop detection and classification algorithms for automatic target recognition. For the ASSERT project, we have focused on the use of a Bayesian probabilistic reasoning approach to fuse multiple target feature data for...
Topics: DTIC Archive, Chang, K. C., GEORGE MASON UNIV FAIRFAX VA CENTER OF EXCELLENCE IN COMMAND CONTROL...
Starting with a functional description of physical mechanisms we were able to derive the standard probabilistic properties of Bayesian networks and to show: (1) how the effects of unanticipated actions can be predicted from the network topology, (2) how qualitative causal judgments can be integrated with statistical data, (3) how actions interact with observations, (4) how counterfactual sentences can be interpreted and evaluated, (5) how explanations and single-event causation can be defined...
Topics: DTIC Archive, Pearl, Judea, CALIFORNIA UNIV LOS ANGELES DEPT OF COMPUTER SCIENCE, ALGORITHMS, REAL...
Designed and developed a cognitive architecture to model the adversary, which forms the basis for the Adversary Intent Inferencing (AII) Module. Designed, developed, and implemented the AII Module based on both Bayesian Networks and Bayesian Knowledge Bases for adversarial modeling, course of action prediction, explanation, and inference of adversary intent. All functioning in both Wintel and Unix environments. Integrated the AII module into a prototype system for modeling and predicting the adversary...
Topics: DTIC Archive, Santos, Jr, Eugene, CONNECTICUT UNIV STORRS OFFICE FOR SPONSORED PROGRAMS, *COMPUTER...
The goal in this effort is to automatically detect deception by an individual or expert who is contributing to an information knowledge-base consisting of multiple experts. Contemporary decision makers often must choose a course of action using knowledge from several sources. Knowledge may be provided from many diverse sources, including electronic sources such as knowledge-based diagnostic or decision support systems, through techniques like data mining. As a decision maker's sources become...
Topics: DTIC Archive, Santos, Jr, Eugene, CONNECTICUT UNIV STORRS LAB FOR COMPUTER SCIENCE RESEARCH,...
The subjectivist, Bayesian paradigm for a decision-maker is described. It is shown how the notion of utility, and the principle of maximizing expected utility both depend on the description of uncertainty through probability. The justification for the necessity of this description due to de Finetti is outlined. The twin, practical problems of the evaluation of the decision-maker's probabilities and utilities are discussed. Probability, as used in the paradigm, is a subjectivist notion which is...
Topics: DTIC Archive, Lindley, Dennis V, WISCONSIN UNIV-MADISON MATHEMATICS RESEARCH CENTER, *STOCHASTIC...
This text introduces a variety of standard statistical methods applicable to reliability data analysis. It is aimed at the non-specialist or manager and as such is primarily an applications guide. Methods addressed include analysis of variance, confidence interval estimation, goodness-of-fit tests, Weibull plotting, sampling inspection and regression and covariance analysis. Examples cover use of both parametric and nonparametric methods.
Topics: DTIC Archive, Dey, Kieron A, RELIABILITY ANALYSIS CENTER GRIFFISS AFB NY,...
Two new selection procedures, called nonrandomized and randomized Bayes-P* procedures are defined for selecting a small nonempty subset of k populations which contains the best population. It is shown that these procedures have some optimal properties. If we restrict attention to the class D(D*) of all nonrandomized (randomized) selection procedures, which satisfy the PP*-condition, that is the posterior probability of a correct selection, for any given observation X = x, is not less than P*, a...
Topics: DTIC Archive, Gupta, S S, PURDUE UNIV LAFAYETTE IN DEPT OF STATISTICS, *Population(Mathematics),...
An unknown number, N, of errors exist in a certain product, for example, defects in a production lot, errors in a manuscript, or bugs in a computer program. I inspectors with possibly different competencies are to be put to work to find the errors. How should the inspection be organized, and what is a good estimate of the undetected errors (or of N)? This problem is similar to the capture-recapture sampling problem of population biology, assuming a closed population and a parallel search...
Topics: DTIC Archive, Jewell, W S, CALIFORNIA UNIV BERKELEY OPERATIONS RESEARCH CENTER, *MATHEMATICAL...
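As a minimal sketch of the capture-recapture analogy invoked above, the two-inspector case can be handled with the classical Lincoln-Petersen estimator; the function below is illustrative only (the report's Bayesian treatment covers I inspectors with differing competencies), and the counts in the example are made up.

```python
# Hypothetical two-inspector illustration of the capture-recapture idea;
# the report's Bayesian model for I inspectors is not reproduced here.
def undetected_errors(n1, n2, both):
    """Estimate total and undetected errors from two independent inspections.

    n1, n2 -- number of errors found by inspector 1 and inspector 2
    both   -- number of errors found by both inspectors
    """
    if both == 0:
        raise ValueError("estimator undefined when the inspections do not overlap")
    n_hat = n1 * n2 / both            # Lincoln-Petersen estimate of N
    found = n1 + n2 - both            # distinct errors actually detected
    return n_hat, n_hat - found

if __name__ == "__main__":
    total, missed = undetected_errors(n1=30, n2=24, both=18)
    print(f"estimated N = {total:.1f}, estimated undetected = {missed:.1f}")
```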
This thesis describes a Bayesian method to determine the number of samples needed to estimate a proportion or probability with 95% confidence when prior bounds are placed on that proportion. It uses the Uniform (a,b) distribution as the prior, and develops a computer program and tables to find the sample size. Tables and examples are also given to compare these results with other approaches to finding sample size. The improvement that can be obtained with this method's fewer samples, and...
Topics: DTIC Archive, Floropoulos, Theodore C, NAVAL POSTGRADUATE SCHOOL MONTEREY CA, *BAYES THEOREM,...
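A minimal numerical sketch of the idea in the thesis abstract above: with a Uniform(a, b) prior on the proportion, scan sample sizes until the 95% central posterior interval is narrow enough. The grid posterior, the worst-case-over-outcomes stopping rule, and the numbers below are illustrative assumptions, not the thesis's tables or exact criterion.

```python
import numpy as np
from scipy.stats import binom

def interval_width(n, k, a, b, grid=2001):
    """Width of the 95% central posterior interval for p given k successes in n
    trials under a Uniform(a, b) prior (computed on a grid)."""
    p = np.linspace(a, b, grid)
    post = binom.pmf(k, n, p)              # flat prior on [a, b] cancels out
    post /= post.sum()
    cdf = np.cumsum(post)
    return p[np.searchsorted(cdf, 0.975)] - p[np.searchsorted(cdf, 0.025)]

def sample_size(a, b, half_width):
    """Smallest n whose interval is narrow enough for every possible outcome k."""
    n = 1
    while max(interval_width(n, k, a, b) for k in range(n + 1)) > 2 * half_width:
        n += 1
    return n

if __name__ == "__main__":
    print(sample_size(a=0.6, b=0.9, half_width=0.1))
```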
People's strategies on probabilistic inference word problems were investigated in an attempt to determine which of three theories explains their neglect of base rate information when estimating the probability of a hypothesis. These word problems present a base rate or prior probability (p(h)), some evidence (e) that typically conflicts with the prior expectation, and information on the reliability of the evidence, which is stated as p(e|h), the conditional probability of the evidence being...
Topics: DTIC Archive, Hamm, Robert M, COLORADO UNIV AT BOULDER INST OF COGNITIVE SCIENCE, *HYPOTHESES,...
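For concreteness, the posterior these word problems ask for follows from Bayes' rule; with hypothetical values p(h) = 0.15, p(e|h) = 0.80, and p(e|not-h) = 0.20 (numbers chosen here only to illustrate why the base rate matters, not taken from the study):

$$
p(h \mid e) \;=\; \frac{p(e \mid h)\,p(h)}{p(e \mid h)\,p(h) + p(e \mid \neg h)\,p(\neg h)}
\;=\; \frac{0.80 \times 0.15}{0.80 \times 0.15 + 0.20 \times 0.85} \;\approx\; 0.41,
$$

well below the 0.80 that a reader who ignores the base rate might report.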
This report develops and demonstrates algorithms for representing and displaying similarity data using three established cognitive models. The first representational model, multidimensional scaling, represents objects as points in a coordinate space so that similar objects lie near each other. The second representational model, the additive tree, represents objects as terminal nodes in a tree so that the similarity of two objects is modelled by the length of the path between them. The third...
Topics: DTIC Archive, Lee, Michael D., DEFENCE SCIENCE AND TECHNOLOGY ORGANISATION SALISBURY (AUSTRALIA),...
This project used analytical and experimental techniques derived from signal detection theory to quantify the decision making performance of individuals and teams. The basic decision task was to decide on the presence or absence of signals in noise. The project's experiments studied how individual and team performance depends on member signal-to-noise ratio, correlation among member inputs, efficiency of member updating of likelihood estimates, and constraints on member interaction and...
Topics: DTIC Archive, Fischler, Ira, FLORIDA UNIV GAINESVILLE DEPT OF PSYCHOLOGY, *DECISION MAKING,...
In the past three years we have investigated computerized methods for analyses and interpretation of breast MR images. We investigated an automatic method for correcting intensity inhomogeneity artifacts in breast MR images. We developed a fuzzy c-means (FCM) based method for 3D lesion segmentation, which is a key procedure in computerized interpretation of breast MR images including differential diagnosis and assessment of response to therapy. The computerized segmentation yielded 07% of 121...
Topics: DTIC Archive, Chen, Weijie, CHICAGO UNIV IL, *IMAGES, *BREAST CANCER, DATA BASES, NEURAL NETS,...
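For concreteness, a bare-bones fuzzy c-means (FCM) clustering step is sketched below on 1-D intensities; the inhomogeneity correction, the 3D lesion pipeline, and the evaluation in the report are not reproduced, and the function, parameter values, and toy data are illustrative assumptions.

```python
import numpy as np

def fcm(x, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    """Cluster 1-D intensities x into c fuzzy clusters; returns (centers, memberships)."""
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(c), size=len(x))      # memberships, each row sums to 1
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)        # fuzzy-weighted cluster centers
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        inv = d ** (-2.0 / (m - 1))
        u_new = inv / inv.sum(axis=1, keepdims=True) # standard FCM membership update
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    return centers, u

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    intensities = np.concatenate([rng.normal(0.2, 0.05, 200),   # "background" voxels
                                  rng.normal(0.8, 0.05, 50)])   # "lesion" voxels
    centers, memberships = fcm(intensities)
    print("cluster centers:", np.round(np.sort(centers), 3))
```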
Thurstone's Law of Comparative Judgment provides a method to convert subjective paired comparisons into one-dimensional quality scores. Applications include judging the quality of different image reconstructions, or different products, or different web search results, etc. This tutorial covers the popular Thurstone-Mosteller Case V model and the Bradley-Terry logistic variant. We describe three approaches to model-fitting: standard least-squares, maximum likelihood, and Bayesian approaches. This...
Topics: DTIC Archive, WASHINGTON UNIV SEATTLE DEPT OF ELECTRICAL ENGINEERING, *SOFTWARE TOOLS, *SCORING,...
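A minimal sketch of the least-squares route named in the tutorial abstract above (Thurstone-Mosteller Case V): convert pairwise win proportions to probit z-scores and average. The maximum-likelihood and Bayesian fits are not shown, and the win matrix below is made up.

```python
import numpy as np
from scipy.stats import norm

def case_v_scores(wins):
    """wins[i, j] = number of times item i was preferred to item j."""
    wins = np.asarray(wins, dtype=float)
    trials = wins + wins.T
    p = np.where(trials > 0, wins / np.where(trials > 0, trials, 1), 0.5)
    p = np.clip(p, 0.01, 0.99)          # guard against infinite z-scores
    z = norm.ppf(p)                      # probit transform of win proportions
    scores = z.mean(axis=1)              # Case V least-squares solution
    return scores - scores.mean()        # fix the arbitrary origin

if __name__ == "__main__":
    wins = [[0, 8, 9],
            [2, 0, 6],
            [1, 4, 0]]
    print(np.round(case_v_scores(wins), 3))
```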
The anomalous gravity field of Venus shows high correlation with surface features revealed by radar. We extract gravity models from the Doppler tracking data from the Pioneer Venus Orbiter (PVO) by means of a two-step process. In the first step, we solve the nonlinear spacecraft state estimation problem using a Kalman filter-smoother. The Kalman filter was evaluated through simulations. This evaluation and some unusual features of the filter are discussed. In the second step, we perform a...
Topics: NASA Technical Reports Server (NTRS), DOPPLER RADAR, GRAVITY ANOMALIES, MATHEMATICAL MODELS,...
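A minimal linear Kalman filter sketch, included to make the filter-smoother step described above concrete; the PVO orbit-determination problem is nonlinear and far larger, so treat this purely as a schematic, and note that all matrices and the constant-velocity toy model below are illustrative assumptions.

```python
import numpy as np

def kalman_filter(zs, F, H, Q, R, x0, P0):
    """Filter a sequence of measurements zs; returns the filtered state estimates."""
    x, P = x0, P0
    out = []
    for z in zs:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        out.append(x)
    return np.array(out)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity model
    H = np.array([[1.0, 0.0]])               # position-only measurements
    Q = 0.01 * np.eye(2)
    R = np.array([[0.25]])
    zs = [np.array([t + rng.normal(0, 0.5)]) for t in range(20)]
    xs = kalman_filter(zs, F, H, Q, R, x0=np.zeros(2), P0=np.eye(2))
    print(np.round(xs[-1], 2))
```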
DTS is a decision-theoretic scheduler, built on top of a flexible toolkit -- this paper focuses on how the toolkit might be reused in future NASA mission schedulers. The toolkit includes a user-customizable scheduling interface, and a 'Just-For-You' optimization engine. The customizable interface is built on two metaphors: objects and dynamic graphs. Objects help to structure problem specifications and related data, while dynamic graphs simplify the specification of graphical schedule editors...
Topics: NASA Technical Reports Server (NTRS), APPLICATIONS PROGRAMS (COMPUTERS), DATA BASES, DECISION...
There are no author-identified significant results in this report.
Topics: NASA Technical Reports Server (NTRS), BAYES THEOREM, COMPUTER PROGRAMS, IMAGE PROCESSING, LARGE...
The influence of physical constraints that may be approximately satisfied by the Earth's liquid core on models of the geomagnetic main field and its secular variation is investigated. A previous report describes the methodology used to incorporate nonlinear equations of constraint into the main field model. The application of that methodology to the GSFC 12/83 field model to test the frozen-flux hypothesis and the usefulness of incorporating magnetohydrodynamic constraints for obtaining...
Topics: NASA Technical Reports Server (NTRS), CONSTRAINTS, GEOMAGNETISM, MAGNETIC FLUX,...
A program to demonstrate the long term reliability of NiH2 cells in low Earth orbits (LEO) and support use in mid-altitude orbits (MAO) was initiated. Both 3.5 and 4.5 inch diameter nickel hydrogen cells are included in the test plan. Cells from all U.S. vendors are to be tested. The tests will be performed at -5 and 10 C at 40 and 60% DOD for LEO orbit and 10 C and 80% DOD for MAO orbit simulations. The goals of the testing are 20,000 cycles at 60% DOD and 30,000 cycles at 40% DOD. Cells are...
Topics: NASA Technical Reports Server (NTRS), COMPONENT RELIABILITY, LONG TERM EFFECTS, LOW EARTH ORBITS,...
The problem of learning in pattern recognition using imperfectly labeled patterns is considered. The performance of the Bayes and nearest neighbor classifiers with imperfect labels is discussed using a probabilistic model for the mislabeling of the training patterns. Schemes for training the classifier using both parametric and nonparametric techniques are presented. Methods for the correction of imperfect labels were developed. To gain an understanding of the learning process, expressions are...
Topics: NASA Technical Reports Server (NTRS), LEARNING THEORY, PATTERN RECOGNITION, TRANSFER OF TRAINING,...
There are no author-identified significant results in this report.
Topics: NASA Technical Reports Server (NTRS), CALIFORNIA, FORESTS, IMAGE PROCESSING, LAND USE, MOUNTAINS,...
A Bayesian technique for stratified proportion estimation and a sampling scheme based on minimizing the mean squared error of this estimator were developed and tested on LANDSAT multispectral scanner data, using the beta density function to model the prior distribution in the two-class case. An extension of this procedure to the k-class case is considered. A generalization of the beta function is shown to be a density function for the general case, which allows the procedure to be extended.
Topics: NASA Technical Reports Server (NTRS), AGRISTARS PROJECT, BAYES THEOREM, CROP INVENTORIES, GRAINS...
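A minimal sketch of the two-class Beta-prior estimator described above, combined across strata by weights; the sampling-allocation step that minimizes the mean squared error and the k-class generalization are not shown, and the prior parameters, counts, and weights below are made up.

```python
# Illustrative stratified Bayesian proportion estimate with Beta(a, b) priors.
def stratified_estimate(strata, a=1.0, b=1.0):
    """strata: list of (weight, successes, trials) tuples; returns the weighted posterior mean."""
    total_w = sum(w for w, _, _ in strata)
    est = 0.0
    for w, k, n in strata:
        post_mean = (a + k) / (a + b + n)   # Beta posterior mean in the two-class case
        est += (w / total_w) * post_mean
    return est

if __name__ == "__main__":
    strata = [(0.5, 120, 300), (0.3, 40, 150), (0.2, 10, 80)]
    print(round(stratified_estimate(strata), 4))
```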
Liquid rocket engine technology has been characterized by the development of complex systems containing a large number of subsystems, components, and parts. The trend toward even larger and more complex systems is continuing. Liquid rocket engineers have been focusing mainly on performance-driven designs to increase the payload delivery of a launch vehicle for a given mission. In other words, although the failure of a single inexpensive part or component may cause the failure of the system, reliability...
Topics: NASA Technical Reports Server (NTRS), EVALUATION, LAUNCH VEHICLES, LIQUID PROPELLANT ROCKET...
The long-term goals of this work are to improve ocean acoustic reverberation modeling and sonar performance predictions in shallow waters by developing inversion procedures to estimate seabed scattering and geoacoustic properties with uncertainties, as well as investigating the importance of various scattering processes. Important issues include: investigating the angular and frequency dependence of scattering (defining the scattering kernel), determining the dependence of scattering on...
Topics: DTIC Archive, PENNSYLVANIA STATE UNIV STATE COLLEGE APPLIED RESEARCH LAB, *BAYES THEOREM,...
This paper lays out the rationale and implementation of student modeling and updating in the HYDRIVE intelligent tutoring system (ITS) for aircraft hydraulic systems. An epistemic level of modeling concerns the plans and goals students are using to guide their problem-solving, as inferred from specific actions in specific contexts. These results update a student model constructed around more broadly defined aspects of system understanding, strategic knowledge, and procedural skills. Meant to...
Topics: DTIC Archive, Gitomer, Drew H, EDUCATIONAL TESTING SERVICE PRINCETON NJ, *SKILLS, *TEACHING...
Problems of the robustness of adaptive control procedures against deviations of the empirical phenomena from the assumed models are of high practical and theoretical importance. The present paper studies the robustness of Bayes adaptive control of two-echelon inventory systems relative to erroneous assumptions concerning (i) the initial control parameters; (ii) the stationarity of the demand distribution and (iii) the nature of the demand distribution. (Author)
Topics: DTIC Archive, Zacks, S, GEORGE WASHINGTON UNIV WASHINGTON D C PROGRAM IN LOGISTICS, *ADAPTIVE...
In electro-neurophysiology, single-trial brain responses to a sensory stimulus or a motor act are commonly assumed to result from the linear superposition of a stereotypic event-related signal (e.g. the event-related potential or ERP) that is invariant across trials and some ongoing brain activity often referred to as noise. To extract the signal, one performs an ensemble average of the brain responses over many identical trials to attenuate the noise. To date, this simple signal-plus-noise...
Topics: NASA Technical Reports Server (NTRS), BRAIN, BAYES THEOREM, NEUROPHYSIOLOGY, NEUROLOGY, MENTAL...
This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics, that can...
Topics: NASA Technical Reports Server (NTRS), STRUCTURAL HEALTH MONITORING, DAMAGE ASSESSMENT, STRUCTURAL...
The implementation and evaluation of an efficient method for estimating safe aircraft maneuvering envelopes are discussed. A Bayesian approach is used to produce a deterministic algorithm for estimating aerodynamic system parameters from existing noisy sensor measurements, which are then used to estimate the trim envelope through efficient high-fidelity model-based computations of attainable equilibrium sets. The safe maneuverability limitations are extended beyond the trim envelope through a...
Topics: NASA Technical Reports Server (NTRS), FLIGHT SIMULATORS, CIVIL AVIATION, OPTIMAL CONTROL,...
A formulation of the problem of making decisions concerning the state of nonstationary stochastic processes is given. An optimal decision rule, for the case in which the stochastic process is independent of the decisions made, is derived. It is shown that this rule is a generalization of the Bayesian likelihood ratio test; and an analog to Wald's sequential likelihood ratio test is given, in which the optimal thresholds may vary with time.
Topics: NASA Technical Reports Server (NTRS), DECISION MAKING, STOCHASTIC PROCESSES, BAYES THEOREM, MISSILE...
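For reference, the fixed-threshold Wald sequential likelihood ratio test that the time-varying rule above generalizes can be stated as follows (this is the standard textbook form, not the report's derivation):

$$
\Lambda_k \;=\; \prod_{j=1}^{k} \frac{f_1(x_j)}{f_0(x_j)}, \qquad
\Lambda_k \ge B:\ \text{accept } H_1, \quad
\Lambda_k \le A:\ \text{accept } H_0, \quad
A < \Lambda_k < B:\ \text{continue sampling},
$$

with $A \approx \beta/(1-\alpha)$ and $B \approx (1-\beta)/\alpha$ for target error probabilities $\alpha$ and $\beta$; in the nonstationary setting described above the optimal thresholds become time-varying sequences $A_k$ and $B_k$.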
The long-term goal of this project is to develop methods to characterize and display the uncertainty in target state estimates that result from uncertainty in environmental estimates. The objectives of this project for FY02 were (1) to work with ARL/UT to develop methods for applying the likelihood functions to the Echo Tracker Classifier (ETC) process and to determine the effect of environmental uncertainty on the likelihood ratio functions so produced, and (2) to develop methods for...
Topics: DTIC Archive, METRON INC RESTON VA, *SONAR, BAYES THEOREM, TARGETING, UNCERTAINTY
Discovery of new knowledge, that is, knowledge that we do not already possess, is the focus of this research. This problem can be formulated as an inverse problem, where the new knowledge can be represented by the parameters of a black box model. The solution can then be viewed as the culmination of a sequence of problem solving steps: search, composition, integration and discovery. A well designed cognitive agent capable of learning, adaptation and optimization can accomplish this task.
Topics: DTIC Archive, Vemuri, V R, CALIFORNIA UNIV DAVIS, *LEARNING MACHINES, *INFORMATION RETRIEVAL, *DATA...
Airspace encounter models, covering close encounter situations that may occur after standard separation assurance has been lost, are a critical component in the safety assessment of aviation procedures and collision avoidance systems. Of particular relevance to unmanned aircraft systems (UAS) is the potential for encountering general aviation aircraft that are flying under visual flight rules (VFR) and which may not be in contact with air traffic control. In response to the need to develop a...
Topics: DTIC Archive, Kochenderfer, M J, MASSACHUSETTS INST OF TECH LEXINGTON LINCOLN LAB, *AERONAUTICS,...
This project explored several problems in the areas of reinforcement learning, probabilistic planning, and transfer learning. In particular, it studied Bayesian Optimization for model-based and model-free reinforcement learning, transfer in the context of model-free reinforcement learning based on a hierarchical Bayesian framework, probabilistic planning based on Monte Carlo tree search, and new algorithms for learning task hierarchies. The algorithms were empirically evaluated in real-time...
Topics: DTIC Archive, OREGON STATE UNIV CORVALLIS, *BAYES THEOREM, *LEARNING, *PLANNING, ALGORITHMS,...
We analyze the Swift/BAT sample of short gamma-ray bursts, using an objective Bayesian Block procedure to extract temporal descriptors of the bursts' initial pulse complexes (IPCs). The sample comprises 12 and 41 bursts with and without extended emission (EE) components, respectively. IPCs of non-EE bursts are dominated by single pulse structures, while EE bursts tend to have two or more pulse structures. The medians of characteristic timescales - durations, pulse structure widths, and peak...
Topics: NASA Technical Reports Server (NTRS), GAMMA RAY BURSTS, HETEROGENEITY, PULSE DURATION, GAMMA RAY...
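A minimal sketch of a Bayesian Blocks segmentation of event (photon arrival) times, assuming the standard Scargle algorithm as implemented in astropy rather than the paper's own Swift/BAT pipeline; the simulated arrival times and the p0 prior below are illustrative.

```python
import numpy as np
from astropy.stats import bayesian_blocks

# made-up arrival times: a short pulse on top of a low, flat background
rng = np.random.default_rng(1)
background = rng.uniform(0.0, 10.0, 200)
pulse = rng.normal(2.0, 0.1, 300)
times = np.sort(np.concatenate([background, pulse]))

edges = bayesian_blocks(times, fitness='events', p0=0.01)   # change-point edges
print("block edges (s):", np.round(edges, 2))
print("number of blocks:", len(edges) - 1)
```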
by Wang, Xiao Yen; Chang, Sin-Chung; Jorgenson, Philip C. E
The space-time conservation element and solution element (CE/SE) method is used to study the sound-shock interaction problem. The order of accuracy of numerical schemes is investigated. The linear model problem, governed by the 1-D scalar convection equation, the sound-shock interaction problem governed by the 1-D Euler equations, and the 1-D shock-tube problem, which involves moving shock waves and contact surfaces, are solved to investigate the order of accuracy of numerical schemes. It is concluded...
Topics: STRUCTURAL DESIGN, FAULT DETECTION, ERROR ANALYSIS, AIRCRAFT STRUCTURES, PROBABILITY THEORY,...
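A minimal sketch of the grid-refinement check behind an order-of-accuracy study like the one above: compare errors at successively halved mesh spacings and estimate the observed order. The error values below are made up for illustration; no CE/SE solver is reproduced here.

```python
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed order p from errors at two grids whose spacing differs by `refinement`."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

if __name__ == "__main__":
    errors = [4.1e-2, 1.1e-2, 2.8e-3, 7.1e-4]   # hypothetical L1 errors as h is halved
    for e1, e2 in zip(errors, errors[1:]):
        print(f"p ~ {observed_order(e1, e2):.2f}")
```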
An explicit expression was developed for a compression matrix T of smallest possible left dimension K consistent with preserving the n-variate normal Bayes assignment of X to a given one of a finite number of populations and the K-variate Bayes assignment of TX to that population. The Bayes population assignments of X and TX were shown to be equivalent for a compression matrix T explicitly calculated as a function of the means and covariances of the given populations.
Topics: NASA Technical Reports Server (NTRS), BAYES THEOREM, DIMENSIONAL ANALYSIS, STATISTICAL ANALYSIS,...
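As an illustrative special case of the kind of result summarized above (a standard fact, not the report's explicit formula), suppose the populations are n-variate normal with common covariance $\Sigma$ and means $\mu_1, \ldots, \mu_g$. Then a compression matrix with at most $g-1$ rows that preserves the Bayes assignment can be taken as

$$
T \;=\; M^{\mathsf T}\,\Sigma^{-1}, \qquad M \;=\; \bigl[\,\mu_2-\mu_1,\;\ldots,\;\mu_g-\mu_1\,\bigr],
$$

because every pairwise log-posterior difference $\log p(i \mid x) - \log p(1 \mid x) = (\mu_i-\mu_1)^{\mathsf T}\Sigma^{-1}x + c_i$ depends on $x$ only through $Tx$; the smallest achievable left dimension $K$ is the rank of $M$. The report's expression is stated more generally in terms of the means and covariances of the given populations.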
This document is intended as an introduction to a set of common signal processing learning methods that may be used in the software portion of a functional crew state monitoring system. This includes overviews of both the theory of the methods involved, as well as examples of implementation. Practical considerations are discussed for implementing modular, flexible, and scalable processing and classification software for a multi-modal, multi-channel monitoring system. Example source code is also...
Topics: NASA Technical Reports Server (NTRS), MACHINE LEARNING, DATA PROCESSING EQUIPMENT, SIGNAL...
Nonparametric estimation of the hazard rate or failure rate is a frequent topic of investigation in the statistical literature because of its practical importance. Until quite recently, hazard rate estimation had been based on complete samples of independent identically distributed lifetimes. However, observations may be censored or truncated in many life testing situations. This occurs often in medical trials when the patients may enter treatment at different times and then either die from the...
Topics: DTIC Archive, McNichols, D T, SOUTH CAROLINA UNIV COLUMBIA DEPT OF MATHEMATICS AND STATISTICS,...
We develop an efficient, Bayesian Uncertainty Quantification framework using a novel treed Gaussian process model. The tree is adaptively constructed using information conveyed by the observed data about the length scales of the underlying process. On each leaf of the tree, we utilize Bayesian Experimental Design techniques in order to learn a multi-output Gaussian process. The constructed surrogate can provide analytical point estimates, as well as error bars, for the statistics of interest....
Topics: DTIC Archive, CORNELL UNIV ITHACA NY MATERIALS PROCESS DESIGN AND CONTROL LABORATORY (MPDC), *BAYES...
This document contains a listing of 42 technical reports on a wide variety of topics in applied statistics and applied probability that were developed under this contract. In addition, a number of technical memoranda were developed in response to technical queries from defense department agencies, principally the National Security Agency. A listing of memoranda on 20 topics is included.
Topics: DTIC Archive, Solomon, Herbert, STANFORD UNIV CA DEPT OF STATISTICS, *PROBABILITY, *MILITARY...
Information warfare (IW) has developed into a significant threat to the national security of the United States. Our critical infrastructures, linked together by information systems, are increasingly vulnerable to information attack. This study seeks to understand some of those factors which affect the ability of an individual to make accurate decisions in an IW environment. The study used game theory to analyze the behavior of decision-makers within an IW simulation. The IW game model is based...
Topics: DTIC Archive, Tait, Steven W, AIR FORCE INST OF TECH WRIGHT-PATTERSON AFB OH, *NATIONAL SECURITY,...
The conventional approach to GMTI uses narrowband signals and a short coherent processing interval (CPI). In this talk, we examine some of the fundamental theoretical issues involved in GMTI with wideband signals and long CPIs (WL-GMTI). The possibility of wideband long CPI GMTI has received some attention in recent years and there are a number of potential benefits: 1) Improved minimum detectable velocity (MDV). 2) Detection of targets with zero radial velocity (but non-zero tangential...
Topics: DTIC Archive, Yegulalp, Ali, MASSACHUSETTS INST OF TECH LEXINGTON LINCOLN LAB, *SYNTHETIC APERTURE...
In this paper we describe a software framework to enable heterogeneous, distributed data fusion of disparate information sources. The framework is agent-based and consists of three main elements. The first is a generalization of the target state to a container of arbitrary, uncertain attributes. The structure of this estimate can vary both across time and across different nodes in the same network. The second is the development of composable process and observation models. These make it...
Topics: DTIC Archive, ITT INDUSTRIES AND NAVAL RESEARCH LAB WASHINGTON DC, *DATA FUSION, REPRINTS,...
The multilayer perceptron was extensively analyzed. A technique for analyzing the multilayer perceptron, the saliency measure, was developed which provides a measure of the importance of inputs. The method was compared to the conventional statistical technique of best features and shown to provide similar rankings of the input. Using the saliency measure, it is shown that the multilayer perceptron effectively ignores useless inputs and that whether it is trained using backpropagation or...
Topics: DTIC Archive, Ruck, Dennis W, AIR FORCE INST OF TECH WRIGHT-PATTERSON AFB OH SCHOOL OF ENGINEERING,...
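A minimal sketch of a derivative-style input-saliency measure in the spirit of the analysis above: rank inputs by how strongly a trained network's outputs respond to small input perturbations, accumulated over the data. The tiny random network, the finite-difference sensitivity, and the data are illustrative assumptions, not the report's exact definition.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    h = np.tanh(x @ W1 + b1)            # single hidden layer
    return np.tanh(h @ W2 + b2)

def saliency(X, params, eps=1e-4):
    """Mean absolute output sensitivity to each input feature."""
    base = forward(X, *params)
    sal = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += eps
        sal[j] = np.mean(np.abs(forward(Xp, *params) - base)) / eps
    return sal

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))
    W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
    W1[3, :] = 0.0                       # make the last input useless
    print(np.round(saliency(X, (W1, b1, W2, b2)), 3))
```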
This report is organized into seven sections. The first deals with issues raised by the effect of time on discrete event modeling, and is known as the frame problem. The second section describes a classical control theory approach to multiple target tracking, which relies exclusively upon Bayes theorem and a Kalman filter to track objects which change spatial position over time. The third section on justification-based truth maintenance systems introduces the concept of retraction to correct...
Topics: DTIC Archive, Cronin, T, ARMY CECOM SIGNALS WARFARE DIRECTORATE VINT HILL FARMS STATION VA, *DATA...
This dissertation focuses on sequential techniques for detecting a change, or disorder, in the statistics of a random process. First, the minimax robust quickest detector is derived for the case when the underlying noise models are only partially known. It is shown that when the robust processor is used, the minimax asymptotic performance measure is equal to the Kullback-Leibler divergence, and that the least favorable densities are those that minimize this quantity. The robust quickest...
Topics: DTIC Archive, Crow, R. W., PRINCETON UNIV NJ, *MATHEMATICAL MODELS, *MULTIVARIATE ANALYSIS,...
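A minimal sketch of a standard quickest-detection procedure (CUSUM for a known mean shift in Gaussian noise), included only to make the change-point setting above concrete; the dissertation's minimax-robust detectors for partially known noise models are not reproduced, and the parameters and simulated data below are illustrative.

```python
import numpy as np

def cusum(x, mu0=0.0, mu1=1.0, sigma=1.0, threshold=8.0):
    """Return the first index at which the CUSUM statistic crosses the threshold, or None."""
    llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2)   # per-sample log-likelihood ratio
    s = 0.0
    for k, step in enumerate(llr):
        s = max(0.0, s + step)
        if s >= threshold:
            return k
    return None

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(1.0, 1.0, 100)])
    print("alarm at sample:", cusum(x))
```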