Results 1-10 of 223
A Survey of Computational Complexity Results in Systems and Control
, 2000
Cited by 188 (21 self)
Abstract
The purpose of this paper is twofold: (a) to provide a tutorial introduction to some key concepts from the theory of computational complexity, highlighting their relevance to systems and control theory, and (b) to survey the relatively recent research activity lying at the interface between these fields. We begin with a brief introduction to models of computation, the concepts of undecidability, polynomial-time algorithms, NP-completeness, and the implications of intractability results. We then survey a number of problems that arise in systems and control theory, some of them classical, some of them related to current research. We discuss them from the point of view of computational complexity and also point out many open problems. In particular, we consider problems related to stability or stabilizability of linear systems with parametric uncertainty, robust control, time-varying linear systems, nonlinear and hybrid systems, and stochastic optimal control.
Computing Variance for Interval Data is NP-Hard
, 2002
Cited by 69 (50 self)
Abstract
When we have only interval ranges [x̲_i, x̄_i] of sample values x_1, …, x_n, what is the interval [V̲, V̄] of possible values for the variance V of these values? We prove that the problem of computing the upper bound V̄ is NP-hard. We provide a feasible (quadratic-time) algorithm for computing the lower bound V̲ on the variance of interval data. We also provide a feasible algorithm that computes V̄ under reasonable, easily verifiable conditions.
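The two bounds in this abstract behave very differently, and a small sketch shows why. The variance is a convex function of the sample, so its maximum over the box of intervals is always attained at a vertex (every x_i at an endpoint); enumerating all 2^n vertices is exact but exponential, consistent with the NP-hardness of V̄. This is our own brute-force illustration, not the paper's algorithm, and the function names are hypothetical:

```python
from itertools import product

def variance(xs):
    # Population variance of a list of numbers.
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

def upper_variance(intervals):
    # Variance is convex in the sample, so its maximum over a box of
    # intervals is attained at a vertex (each x_i at an endpoint).
    # Enumerating all 2^n vertices is exponential in n -- consistent
    # with the paper's NP-hardness result for the upper bound.
    return max(variance(list(v)) for v in product(*intervals))

# Three copies of [0, 1]: the maximum variance 2/9 is reached at
# vertex patterns like (0, 0, 1).
print(upper_variance([(0.0, 1.0)] * 3))
```

The lower bound V̲ has no such vertex argument (the minimizer can lie inside the box, e.g. zero variance whenever all intervals overlap), which is why the paper needs a separate quadratic-time algorithm for it.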
Towards combining probabilistic and interval uncertainty in engineering calculations: algorithms for computing statistics under interval uncertainty, and their computational complexity
 Reliable Computing
, 2006
Cited by 47 (45 self)
Abstract
In many engineering applications, we have to combine probabilistic and interval uncertainty. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as the mean, variance, autocorrelation, and correlation with other measurements. In environmental measurements, we often only measure the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data. In this paper, we provide a survey of algorithms for computing various statistics under interval uncertainty and their computational complexity. The survey includes both known and new algorithms.
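Not every statistic in such a survey is hard under interval uncertainty. For a monotone statistic like the mean, the exact range over the box follows from plugging in all lower and then all upper endpoints. A minimal sketch (the function name is ours, not from the survey):

```python
def interval_mean(intervals):
    # The sample mean is increasing in every x_i, so its range over
    # the box of intervals is obtained by evaluating it at all lower
    # endpoints (for the minimum) and all upper endpoints (maximum).
    n = len(intervals)
    lo = sum(a for a, _ in intervals) / n
    hi = sum(b for _, b in intervals) / n
    return lo, hi

print(interval_mean([(0.0, 1.0), (2.0, 4.0)]))  # (1.0, 2.5)
```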
Experimental Uncertainty Estimation and Statistics for Data Having Interval Uncertainty
 11733, SAND2007-0939. hal-00839639, version 1, 28 Jun 2013
Cited by 40 (21 self)
Abstract
Sandia is a multiprogram laboratory operated by Sandia Corporation,
A New Cauchy-Based Black-Box Technique for Uncertainty in Risk Analysis
 in Risk Analysis, Reliability Engineering and Systems Safety
, 2002
Cited by 37 (18 self)
Abstract
Uncertainty is very important in risk analysis. A natural way to describe this uncertainty is to describe a set of possible values of each unknown quantity (this set is usually an interval), plus any additional information that we may have about the probability of different values within this set. Traditional statistical techniques deal with situations in which we have complete information about the probabilities; in real life, however, we often have only partial information about them. We therefore need to describe methods of handling such partial information in risk analysis. Several such techniques have been presented, often on a heuristic basis. The main goal of this paper is to provide a justification for a general formalism for handling different types of uncertainty, and to describe a new black-box technique for processing this type of uncertainty.
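One way to make the Cauchy-based idea concrete is the following simplified sketch (ours, not the paper's full algorithm): perturb each input of the black box by a Cauchy-distributed deviate whose scale is that input's interval half-width. If the box is approximately linear over the uncertainty region, the output deviation is again Cauchy, with scale equal to the linearly propagated output half-width, and since the median of |Cauchy(0, D)| equals D, a sample median recovers it. All names below are our illustration:

```python
import math
import random
import statistics

def cauchy_sample(rng, scale):
    # Inverse-CDF sampling of a Cauchy(0, scale) deviate.
    return scale * math.tan(math.pi * (rng.random() - 0.5))

def cauchy_error_estimate(f, x, deltas, n_samples=10000, seed=1):
    # Perturb each input by a Cauchy deviate with scale delta_i.  For
    # an approximately linear f, the deviation f(x + d) - f(x) is
    # Cauchy with scale D = sum_i |df/dx_i| * delta_i; the median of
    # the absolute deviations then estimates D.
    rng = random.Random(seed)
    y0 = f(x)
    devs = [abs(f([xi + cauchy_sample(rng, di)
                   for xi, di in zip(x, deltas)]) - y0)
            for _ in range(n_samples)]
    return statistics.median(devs)

# Linear black box 2*x1 + 3*x2 with half-widths 0.1 and 0.2: the
# exact output half-width is 2*0.1 + 3*0.2 = 0.8.
f = lambda v: 2 * v[0] + 3 * v[1]
print(cauchy_error_estimate(f, [1.0, 1.0], [0.1, 0.2]))
```

The attraction over naive Monte-Carlo is that the number of calls to f does not grow with the number of inputs n, only with the desired accuracy of the median estimate.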
Error Estimations For Indirect Measurements: Randomized Vs. Deterministic Algorithms For "Black-Box" Programs
 Handbook on Randomized Computing, Kluwer, 2001
, 2000
Cited by 33 (15 self)
Abstract
In many real-life situations, it is very difficult or even impossible to directly measure the quantity y in which we are interested: e.g., we cannot directly measure the distance to a distant galaxy or the amount of oil in a given well. Since we cannot measure such quantities directly, we can measure them indirectly: by first measuring some related quantities x_1, …, x_n, and then by using the known relation between the x_i and y to reconstruct the value of the desired quantity y. In practice, it is often very important to estimate the error of the resulting indirect measurement. In this paper, we describe and compare different deterministic and randomized algorithms for solving this problem in the situation when a program for transforming the estimates x̃_1, …, x̃_n for x_i into an estimate for y is only available as a black box (with no source code at hand). We consider this problem in two settings: statistical, when measurement errors Δx_i = x̃_i − x_i are independent …
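A standard deterministic approach to this problem, for a black box that is smooth near the measured point, is to estimate each partial derivative by finite differences and propagate the input half-widths linearly. This is a sketch under a linearization assumption, with names of our choosing; it costs 2n calls to f, one pair per input:

```python
def linearized_error(f, x, deltas, h=1e-6):
    # Estimate |df/dx_i| at the measured point by central finite
    # differences, then propagate the input half-widths linearly:
    # Delta_y ~= sum_i |df/dx_i| * Delta_i.  Uses 2n calls to f.
    total = 0.0
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        deriv = (f(xp) - f(xm)) / (2 * h)
        total += abs(deriv) * deltas[i]
    return total

f = lambda v: 2 * v[0] + 3 * v[1]
print(linearized_error(f, [1.0, 1.0], [0.1, 0.2]))  # ~0.8
```

The cost growing linearly in n is exactly what the randomized (e.g. Cauchy-based) alternatives surveyed in this line of work try to avoid.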
Efficient and safe global constraints for handling numerical constraint systems
 SIAM J. NUMER. ANAL
, 2005
Cited by 27 (9 self)
Abstract
Numerical constraint systems are often handled by branch-and-prune algorithms that combine splitting techniques, local consistencies, and interval methods. This paper first recalls the principles of Quad, a global constraint that works on a tight and safe linear relaxation of quadratic subsystems of constraints. Then, it introduces a generalization of Quad to polynomial constraint systems. It also introduces a method to get safe linear relaxations and shows how to compute safe bounds of the variables of the linear constraint system. Different linearization techniques are investigated to limit the number of generated constraints. QuadSolver, a new branch-and-prune algorithm that combines Quad, local consistencies, and interval methods, is introduced. QuadSolver has been evaluated on a variety of benchmarks from kinematics, mechanics, and robotics. On these benchmarks, it outperforms classical interval methods as well as constraint satisfaction problem solvers, and it compares well with state-of-the-art optimization solvers.
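A toy one-variable version of the branch-and-prune loop the abstract refers to, far simpler than Quad's linear relaxations (the enclosure function and all names are our illustration): discard any box whose interval enclosure of the constraint excludes zero, and bisect the rest until they are small.

```python
def branch_and_prune(enclose, lo, hi, eps=1e-8):
    # enclose(a, b) returns an interval (fa, fb) guaranteed to contain
    # f(x) for every x in [a, b].  Boxes whose enclosure excludes 0
    # are pruned; the rest are bisected until eps-small.
    stack, solutions = [(lo, hi)], []
    while stack:
        a, b = stack.pop()
        fa, fb = enclose(a, b)
        if fa > 0 or fb < 0:
            continue           # enclosure excludes 0: prune this box
        if b - a < eps:
            solutions.append((a, b))
        else:
            m = (a + b) / 2
            stack += [(a, m), (m, b)]
    return solutions

# Constraint x^2 - 2 = 0 on [0, 2]; for 0 <= a <= b, x^2 ranges over
# [a^2, b^2], so this enclosure is exact.
boxes = branch_and_prune(lambda a, b: (a * a - 2, b * b - 2), 0.0, 2.0)
print(boxes)  # tiny box(es) around sqrt(2)
```

Real solvers like the paper's QuadSolver add contractors (local consistencies, linear relaxations) between the prune and split steps so that far fewer boxes need bisecting.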
New methods for splice site recognition
, 2002
Cited by 24 (4 self)
Abstract
Splice sites are locations in DNA which separate protein-coding regions (exons) from non-coding regions (introns). Accurate splice site detectors thus form important components of computational gene finders. We pose splice site recognition as a classification problem with the classifier learnt from a labeled data set consisting of only local information around the potential splice site. Note that finding the correct position of splice sites without using global information is a rather hard task. We analyze the genomes of the nematode Caenorhabditis elegans and of humans using specially designed support vector kernels. One of the kernels is adapted from our previous work on detecting translation initiation sites in vertebrates and another uses an extension to the well-known Fisher kernel. We find excellent performance on both data sets.
Non-Destructive Testing of Aerospace Structures: Granularity and Data Mining Approach
 Proceedings of FUZZ-IEEE'2002
, 2002
Cited by 24 (20 self)
Abstract
For large aerospace structures, it is extremely important to detect faults, and non-destructive testing is the only practical way to do it. Based on measurements of ultrasonic waves, eddy currents, magnetic resonance, etc., we reconstruct the locations of the faults. The best (most efficient) known statistical methods for fault reconstruction are not perfect. We show that the use of expert knowledge-based granulation improves the quality of fault reconstruction.
Outlier Detection Under Interval Uncertainty: Algorithmic Solvability and Computational Complexity
 LargeScale Scientific Computing, Proceedings of the 4th International Conference LSSC’2003, Sozopol, Bulgaria, June 4–8, 2003, Springer Lecture Notes in Computer Science
Cited by 22 (13 self)
Abstract
In many application areas, it is important to detect outliers. The traditional engineering approach to outlier detection is that we start with some "normal" values x_1, …, x_n, compute the sample average E and the sample standard deviation σ, and then mark a value x as an outlier if x is outside the k_0-sigma interval [E − k_0·σ, E + k_0·σ] (for some preselected parameter k_0). In real life, we often have only interval ranges [x̲_i, x̄_i] for the normal values x_1, …, x_n. In this case, we only have intervals of …
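For reference, the classical rule this abstract starts from, before interval uncertainty enters (function and variable names are ours):

```python
import statistics

def outliers(normals, candidates, k0=3.0):
    # Classical engineering rule: compute E and sigma from the
    # "normal" sample and flag any candidate x falling outside
    # [E - k0*sigma, E + k0*sigma].
    E = statistics.mean(normals)
    sigma = statistics.pstdev(normals)
    lo, hi = E - k0 * sigma, E + k0 * sigma
    return [x for x in candidates if not (lo <= x <= hi)]

print(outliers([1, 2, 3, 2, 2, 1, 3, 2], [0, 2, 10]))  # [10]
```

Under interval uncertainty, E and σ themselves become intervals, which is exactly the complication the paper analyzes.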