Results 1-10 of 1,405
Precise Robustness Analysis of Time Petri Nets with Inhibitor Arcs
Buenos Aires, Argentina, 2013
"... Context: Verifying Complex Timed Systems. Need for early bug detection: bugs discovered in final testing are expensive ..."
Measuring the information content of stock trades
Journal of Finance, 1991
"... This paper suggests that the interactions of security trades and quote revisions be modeled as a vector autoregressive system. Within this framework, a trade's information effect may be meaningfully measured as the ultimate price impact of the trade innovation. Estimates for a sample of NYSE is ..."
Cited by 469 (11 self)
significant for smaller firms. CENTRAL TO THE ANALYSIS of market microstructure is the notion that in a market with asymmetrically informed agents, trades convey information and therefore cause a persistent impact on the security price. The magnitude of the price effect for a given trade size is generally held
Robust Classification for Imprecise Environments
1989
"... In real-world environments it is usually difficult to specify target operating conditions precisely. This uncertainty makes building robust classification systems problematic. We present a method for the comparison of classifier performance that is robust to imprecise class distributions and misclas ..."
Cited by 341 (15 self)
Iterative hard thresholding for compressed sensing
Appl. Comp. Harm. Anal.
"... Compressed sensing is a technique to sample compressible signals below the Nyquist rate, whilst still allowing near optimal reconstruction of the signal. In this paper we present a theoretical analysis of the iterative hard thresholding algorithm when applied to the compressed sensing recovery probl ..."
Cited by 329 (18 self)
problem. We show that the algorithm has the following properties (made more precise in the main text of the paper):
• It gives near-optimal error guarantees.
• It is robust to observation noise.
• It succeeds with a minimum number of observations.
• It can be used with any sampling operator for which
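The properties listed above concern the basic iterative hard thresholding update x ← H_s(x + Φᵀ(y − Φx)), where H_s keeps the s largest-magnitude entries. A minimal NumPy sketch of that iteration (variable names and the Gaussian test operator are illustrative, not from the paper; convergence here assumes the operator is scaled to spectral norm below 1):

```python
import numpy as np

def iht(y, Phi, s, n_iter=800):
    """Iterative hard thresholding: x <- H_s(x + Phi^T (y - Phi x)).
    H_s keeps the s largest-magnitude entries and zeroes the rest.
    Assumes ||Phi||_2 < 1 so the gradient step is non-expansive."""
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        x = x + Phi.T @ (y - Phi @ x)       # gradient step on ||y - Phi x||^2 / 2
        small = np.argsort(np.abs(x))[:-s]  # indices of all but the s largest
        x[small] = 0.0                      # hard threshold
    return x
```

With a well-conditioned random operator and enough measurements, the iteration typically recovers the sparse signal exactly up to numerical precision, which is the regime the abstract's error guarantees describe.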
Bounded geometries, fractals, and low-distortion embeddings
"... The doubling constant of a metric space (X, d) is the smallest value λ such that every ball in X can be covered by λ balls of half the radius. The doubling dimension of X is then defined as dim(X) = log2 λ. A metric (or sequence of metrics) is called doubling precisely when its doubling dimension is ..."
Cited by 198 (40 self)
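The covering definition above is directly computable for a finite point set. A brute-force sketch under the Euclidean metric (hypothetical function name; it restricts ball centres to data points and uses a greedy cover, so it only upper-bounds the true doubling constant):

```python
import numpy as np

def doubling_constant_ub(points):
    """Greedy upper bound on the doubling constant of a finite Euclidean
    point set: for each ball B(x, r) occurring in the data, count how many
    radius-r/2 balls (centred at data points) a greedy cover uses."""
    pts = np.asarray(points, dtype=float)
    D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    lam = 1
    for i in range(len(pts)):
        for r in np.unique(D[i])[1:]:  # skip r = 0
            uncovered = [j for j in range(len(pts)) if D[i, j] <= r]
            count = 0
            while uncovered:
                c = uncovered[0]  # greedy: cover from the next uncovered point
                uncovered = [j for j in uncovered if D[c, j] > r / 2]
                count += 1
            lam = max(lam, count)
    return lam  # doubling dimension estimate: log2(lam)
```

A uniform grid on a line, for instance, yields a small constant, matching the intuition that one-dimensional sets have low doubling dimension.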
Robustness Analysis of Finite Precision Implementations
"... Abstract. A desirable property of control systems is robustness to inputs: small perturbations of the inputs of a system should cause only small perturbations of its outputs. This property should be maintained at the implementation level, where close inputs can lead to different execution paths. The ..."
Cited by 3 (0 self)
rely on the stable test hypothesis, yielding unsound error bounds when the conditional block is not robust to uncertainties. We propose a new abstract-interpretation-based error analysis of finite precision implementations, which is sound in the presence of unstable tests, by bounding the discontinuity
On k-anonymity and the curse of dimensionality
In VLDB, 2005
"... In recent years, the wide availability of personal data has made the problem of privacy-preserving data mining an important one. A number of methods have recently been proposed for privacy-preserving data mining of multidimensional data records. One of the methods for privacy-preserving data mining ..."
Cited by 171 (4 self)
is that of anonymization, in which a record is released only if it is indistinguishable from k other entities in the data. We note that methods such as k-anonymity are highly dependent upon spatial locality in order to effectively implement the technique in a statistically robust way. In high-dimensional space the data
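The anonymization condition described above reduces to a group-and-count check: under the standard reading, a table is k-anonymous when every combination of quasi-identifier values appears in at least k records. A sketch (column names and generalized values are made up for illustration):

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True iff every quasi-identifier combination is shared by >= k records,
    i.e. each record hides in a group of at least k indistinguishable rows."""
    keys = [tuple(rec[col] for col in quasi_identifiers) for rec in records]
    return all(count >= k for count in Counter(keys).values())
```

The curse of dimensionality the title refers to shows up here directly: as more columns join the quasi-identifier set, the groups shrink toward singletons and the check fails unless values are generalized heavily.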
Whom You Know Matters: Venture Capital Networks and Investment Performance
Journal of Finance, 2007
"... Abstract. Many financial markets are characterized by strong relationships and networks, rather than arm's-length, spot-market transactions. We examine the performance consequences of this organizational choice in the context of relationships established when VCs syndicate portfolio company inv ..."
Cited by 138 (8 self)
centrality measure by dividing by the maximum possible degree in an n-actor network (i.e., n − 1). While we normalize the centrality measures used in the empirical analysis, we note that all our results are robust to using non-normalized network centrality measures instead. B. Closeness. While degree counts
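The normalization described above is degree divided by n − 1, the maximum possible degree among n actors. A sketch for an undirected network given as a 0/1 adjacency matrix (hypothetical helper name, not from the paper):

```python
def normalized_degree_centrality(adj):
    """Degree centrality of each actor in an undirected 0/1 adjacency
    matrix, normalized by the maximum possible degree, n - 1."""
    n = len(adj)
    return [sum(row) / (n - 1) for row in adj]
```

An actor connected to everyone scores 1.0 regardless of network size, which is exactly what makes the measure comparable across syndication networks of different sizes.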
Precise robustness analysis of time Petri nets with inhibitor arcs
In FORMATS, volume 8053 of LNCS, 2013
"... Abstract. Quantifying the robustness of a real-time system consists in measuring the maximum extension of the timing delays such that the system still satisfies its specification. In this work, we introduce a more precise notion of robustness, measuring the allowed variability of the timing delays i ..."
Cited by 3 (2 self)
Robustness and regularization of support vector machines
"... We consider regularized support vector machines (SVMs) and show that they are precisely equivalent to a new robust optimization formulation. We show that this equivalence of robust optimization and regularization has implications for both algorithms and analysis. In terms of algorithms, the equival ..."
Cited by 45 (7 self)
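The regularized SVM the abstract refers to combines a hinge-loss data term with an L2 penalty on the weights; the equivalence result reinterprets that penalty as protection against norm-bounded perturbations of the inputs. A sketch of the objective only (names and the regularization weight are illustrative; no training loop):

```python
import numpy as np

def svm_objective(w, X, y, reg=1.0):
    """Regularized SVM objective: mean hinge loss plus an L2 penalty.
    X: (m, d) inputs, y: (m,) labels in {-1, +1}, w: (d,) weights."""
    margins = y * (X @ w)                    # signed margins y_i * w^T x_i
    hinge = np.maximum(0.0, 1.0 - margins)   # hinge loss per example
    return hinge.mean() + reg * np.dot(w, w)
```

Minimizing this over w is the standard regularized SVM; the paper's contribution is showing the same minimizer solves a robust optimization problem over perturbed inputs.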