Results 1–10 of 21
Global Optimization of Statistical Functions with Simulated Annealing
 Journal of Econometrics
, 1994
"... Many statistical methods rely on numerical optimization to estimate a model’s parameters. Unfortunately, conventional algorithms sometimes fail. Even when they do converge, there is no assurance that they have found the global, rather than a local, optimum. We test a new optimization algorithm, simu ..."
Abstract

Cited by 281 (2 self)
Many statistical methods rely on numerical optimization to estimate a model’s parameters. Unfortunately, conventional algorithms sometimes fail. Even when they do converge, there is no assurance that they have found the global, rather than a local, optimum. We test a new optimization algorithm, simulated annealing, on four econometric problems and compare it to three common conventional algorithms. Not only can simulated annealing find the global optimum, it is also less likely to fail on difficult functions because it is a very robust algorithm. The promise of simulated annealing is demonstrated on the four econometric problems.
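The mechanism the abstract describes, accepting occasional uphill moves so the search can escape local optima, can be sketched in a few lines. This is a generic one-dimensional simulated annealing loop, not the specific variant the paper tests; the function names, cooling schedule, and parameter values are illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, cooling=0.999, steps=5000, step=0.5, seed=0):
    """Minimize f with simulated annealing: always accept downhill moves,
    and accept uphill moves with probability exp(-delta/T), so the search
    can climb out of local minima while the temperature T is still high."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# A multimodal test function: local minima near every integer, global minimum at 0.
g = lambda x: x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))
best, fbest = simulated_annealing(g, x0=4.7)
```

A conventional gradient-based method started at x0 = 4.7 would stall in the nearest local minimum; the annealing loop's uphill acceptances are what let it keep exploring.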
Multiscale MAP filtering of SAR images
 IEEE Transactions on Image Processing
"... Abstract—Synthetic aperture radar (SAR) images are disturbed by a multiplicative noise depending on the signal (the ground reflectivity) due to the radar wave coherence. Images have a strong variability from one pixel to another reducing essentially the efficiency of the algorithms of detection an ..."
Abstract

Cited by 30 (6 self)
Abstract—Synthetic aperture radar (SAR) images are disturbed by a multiplicative noise depending on the signal (the ground reflectivity), due to the coherence of the radar wave. The strong pixel-to-pixel variability of the images substantially reduces the efficiency of detection and classification algorithms. In this study, we propose to filter this noise with a multiresolution analysis of the image. The wavelet coefficients of the reflectivity are estimated with a Bayesian model, maximizing the a posteriori probability density function. The different probability density functions are modeled with the Pearson system of distributions. The resulting filter combines the classical adaptive approach with wavelet decomposition, where the local variance of the high-frequency images is used to segment and filter the wavelet coefficients. Index Terms—Adaptive filtering, synthetic aperture radar (SAR), speckle, wavelet.
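The wavelet-domain shrinkage idea can be illustrated with a toy one-level Haar decomposition and hard thresholding. This is a deliberately simplified stand-in for the paper's Bayesian MAP estimation with Pearson-system priors: the same structure (decompose, shrink small detail coefficients, reconstruct), with the statistical model replaced by a fixed threshold.

```python
def haar_step(signal):
    """One level of the (unnormalized) Haar wavelet transform of an
    even-length signal: pairwise averages (approximation) and pairwise
    half-differences (detail)."""
    approx = [(signal[2*i] + signal[2*i + 1]) / 2 for i in range(len(signal) // 2)]
    detail = [(signal[2*i] - signal[2*i + 1]) / 2 for i in range(len(signal) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_step."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def denoise(signal, threshold):
    """Zero out small detail coefficients (treated as noise), keep large
    ones (edges and targets), then reconstruct."""
    approx, detail = haar_step(signal)
    detail = [d if abs(d) > threshold else 0.0 for d in detail]
    return haar_inverse(approx, detail)
```

In the paper the shrinkage is adaptive, driven by the local variance of the high-frequency sub-images, rather than a single global threshold as here.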
Transparent Anonymization: Thwarting Adversaries Who Know the Algorithm
, 2010
"... Numerous generalization techniques have been proposed for privacypreserving data publishing. Most existing techniques, however, implicitly assume that the adversary knows little about the anonymization algorithm adopted by the data publisher. Consequently, they cannot guard against privacy attacks ..."
Abstract

Cited by 11 (0 self)
Numerous generalization techniques have been proposed for privacy-preserving data publishing. Most existing techniques, however, implicitly assume that the adversary knows little about the anonymization algorithm adopted by the data publisher. Consequently, they cannot guard against privacy attacks that exploit various characteristics of the anonymization mechanism. This article provides a practical solution to this problem. First, we propose an analytical model for evaluating disclosure risks when an adversary knows everything in the anonymization process except the sensitive values. Based on this model, we develop a privacy principle, transparent l-diversity, which ensures privacy protection against such powerful adversaries. We identify three algorithms that achieve transparent l-diversity, and verify their effectiveness and efficiency through extensive experiments with real data.
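For context on the privacy principle being strengthened, here is a check of plain distinct l-diversity, the baseline notion, not the transparent l-diversity of the article: every equivalence class of the published table must contain at least l distinct sensitive values. The table contents are hypothetical.

```python
from collections import defaultdict

def is_l_diverse(records, quasi_attrs, sensitive_attr, l):
    """Distinct l-diversity: every equivalence class (records sharing the
    same generalized quasi-identifier values) must contain at least l
    distinct sensitive values."""
    groups = defaultdict(set)
    for rec in records:
        key = tuple(rec[a] for a in quasi_attrs)
        groups[key].add(rec[sensitive_attr])
    return all(len(vals) >= l for vals in groups.values())

# Hypothetical anonymized microdata: two equivalence classes of two rows each.
table = [
    {"zip": "476**", "age": "2*", "disease": "heart"},
    {"zip": "476**", "age": "2*", "disease": "flu"},
    {"zip": "479**", "age": "3*", "disease": "flu"},
    {"zip": "479**", "age": "3*", "disease": "cancer"},
]
ok = is_l_diverse(table, ["zip", "age"], "disease", l=2)
```

The article's point is that satisfying such a check on the output table alone is not enough when the adversary can also reason about how the anonymization algorithm chose that output.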
Cramér-Rao Bounds for 2-D Target Shape Estimation in Nonlinear Inverse Scattering Problems with Application to Passive Radar
 IEEE Transactions on Antennas and Propagation
, 2001
"... We present new methods for computing fundamental performance limits for twodimensional (2D) parametric shape estimation in nonlinear inverse scattering problems with an application to passive radar imaging. We evaluate CramrRao lower bounds (CRB) on shape estimation accuracy using the domain der ..."
Abstract

Cited by 9 (4 self)
We present new methods for computing fundamental performance limits for two-dimensional (2-D) parametric shape estimation in nonlinear inverse scattering problems, with an application to passive radar imaging. We evaluate Cramér-Rao lower bounds (CRB) on shape estimation accuracy using the domain derivative technique from nonlinear inverse scattering theory. The CRB provides an unbeatable performance limit for any unbiased estimator and, under fairly mild regularity conditions, is asymptotically achieved by the maximum likelihood estimator (MLE). The resulting CRBs are used to define an asymptotic global confidence region, centered around the true boundary, in which the boundary estimate lies with a prescribed probability. These global confidence regions conveniently display the uncertainty in various geometric parameters such as shape, size, orientation, and position of the estimated target, and facilitate geometric inferences. Numerical simulations are performed using the layer approach and the Nyström method for computation of domain derivatives, and using Fourier descriptors for target shape parameterization. This analysis demonstrates the accuracy and generality of the proposed methods. Index Terms—Cramér-Rao bounds, Fourier descriptors, global confidence regions, nonlinear inverse scattering, passive radar imaging, shape estimation.
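The CRB concepts above can be made concrete in the textbook scalar case rather than the paper's shape-estimation setting: for n iid N(mu, sigma^2) observations with known sigma, the Fisher information for mu is n/sigma^2, the bound is sigma^2/n, and the MLE (the sample mean) attains it. A Monte Carlo check:

```python
import random
import statistics

def crb_gaussian_mean(sigma, n):
    """Cramer-Rao bound for the mean of n iid N(mu, sigma^2) samples with
    known sigma: Fisher information is n / sigma^2, so the bound is sigma^2 / n."""
    return sigma ** 2 / n

# Monte Carlo check that the MLE (the sample mean) attains the bound.
rng = random.Random(42)
sigma, n, trials = 2.0, 50, 4000
estimates = [statistics.fmean(rng.gauss(5.0, sigma) for _ in range(n))
             for _ in range(trials)]
mle_var = statistics.pvariance(estimates)
bound = crb_gaussian_mean(sigma, n)  # mle_var should be close to bound
```

The paper's contribution is computing the analogous bound when the "parameter" is a boundary curve and the forward map is a nonlinear scattering operator, which requires the domain derivative machinery.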
Quantitative Statistical Assessment of Conditional Models for Synthetic Aperture Radar
, February 2004
"... Abstract—Many applications of object recognition in the presence of pose uncertainty rely on statistical models—conditioned on pose—for observations. The image statistics of threedimensional (3D) objects are often assumed to belong to a family of distributions with unknown model parameters that va ..."
Abstract

Cited by 3 (0 self)
Abstract—Many applications of object recognition in the presence of pose uncertainty rely on statistical models—conditioned on pose—for observations. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov–Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level. Index Terms—Conditional models, MSTAR, statistical hypothesis testing, statistical model assessment, synthetic aperture radar.
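The reporting device in the last sentence, the rejection fraction as a function of the significance level, is easy to sketch. This is only the bookkeeping step, not the paper's small-sample test statistics themselves; the uniform p-values below simulate a correctly specified model.

```python
import random

def rejection_fraction(p_values, alpha):
    """Fraction of per-sample tests rejected at significance level alpha."""
    return sum(p <= alpha for p in p_values) / len(p_values)

# Under a correct model, p-values are uniform on [0, 1], so the rejection
# fraction tracks the diagonal (fraction ~ alpha); systematic excess over
# alpha across many small samples signals lack of fit.
rng = random.Random(1)
null_pvals = [rng.random() for _ in range(10000)]
frac = rejection_fraction(null_pvals, 0.05)
```

Plotting this fraction against alpha over many pose/pixel samples gives the functional relationship the abstract describes.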
IBM SPSS Exact Tests
"... information under Notices on page 213. This edition applies to IBM ® SPSS ® Exact Tests 20 and to all subsequent releases and modifications until otherwise indicated in new editions. Microsoft product screenshots reproduced with permission from Microsoft Corporation. ..."
Abstract

Cited by 2 (0 self)
information under Notices on page 213. This edition applies to IBM ® SPSS ® Exact Tests 20 and to all subsequent releases and modifications until otherwise indicated in new editions. Microsoft product screenshots reproduced with permission from Microsoft Corporation.
Input Feature Extraction for Multilayered Perceptrons Using Supervised Principal Component Analysis
"... : A method is proposed for constructing salient features from a set of features that are given as input to a feedforward neural network used for supervised learning. Combinations of the original features are formed that maximize the sensitivity of the network's outputs with respect to variation ..."
Abstract

Cited by 1 (0 self)
A method is proposed for constructing salient features from a set of features that are given as input to a feedforward neural network used for supervised learning. Combinations of the original features are formed that maximize the sensitivity of the network's outputs with respect to variations of its inputs. The method exhibits some similarity to Principal Component Analysis, but also takes into account the supervised character of the learning task. It is applied to classification problems, leading to improved generalization ability originating from the alleviation of the curse of dimensionality. Key words: feature extraction, feature selection, multilayered perceptron, principal components, saliency
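The underlying sensitivity idea can be sketched with finite differences. This toy scores each input feature by the mean squared output derivative over a sample of inputs; the paper's method goes further and diagonalizes the full input-sensitivity matrix to form linear combinations of features, which this sketch omits. The `net` function here is a stand-in, not a trained network.

```python
def saliency(f, inputs, eps=1e-4):
    """Mean squared finite-difference sensitivity of the scalar output f
    to each input feature, averaged over a sample of input vectors.
    Features with near-zero scores barely influence the output and are
    candidates for removal."""
    n_features = len(inputs[0])
    scores = [0.0] * n_features
    for x in inputs:
        for i in range(n_features):
            xp, xm = list(x), list(x)
            xp[i] += eps
            xm[i] -= eps
            g = (f(xp) - f(xm)) / (2 * eps)  # central difference d f / d x_i
            scores[i] += g * g
    return [s / len(inputs) for s in scores]

# Hypothetical "network": the output depends only on the first feature.
net = lambda x: 3.0 * x[0]
scores = saliency(net, [[0.1, 0.5], [0.3, -0.2]])
```

For a trained multilayer perceptron the derivatives would come from backpropagation rather than finite differences, but the saliency bookkeeping is the same.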
A Comparative Analysis of the Bootstrap versus Traditional Statistical Procedures Applied to Digital Analysis Based on Benford’s Law
, 2010
"... Many fraud detection problems involve large numbers of financial transactions such as those associated with credit cards, accounts receivables, payments to vendors, payrolls, or other expense accounts (Panigrahi 2006; Bolton and Hand 2002). Computer Assisted Auditing Tools and Techniques (CAATTs) e. ..."
Abstract

Cited by 1 (1 self)
Many fraud detection problems involve large numbers of financial transactions such as those associated with credit cards, accounts receivable, payments to vendors, payrolls, or other expense accounts (Panigrahi 2006; Bolton and Hand 2002). Computer Assisted Auditing Tools and Techniques (CAATTs), e.g., ACL (2006), allow auditors to perform digital analysis based on Benford’s Law (Benford 1938) for scrutinizing high volumes of complex financial data and detecting unintentional errors or fraud (AICPA 2008; Panigrahi 2006; Coderre 1999). However, the advantages that these software packages offer for assessing data conformity to Benford’s Law are limited due to problems associated with the underlying traditional statistical procedures. Specifically, previous studies (e.g., Cho and Gaines 2007; Geyer and Williamson 2004; Nigrini 2000) have issued caveats on the use of traditional statistical tests such as chi-square goodness-of-fit or Z-tests in the context of Benford’s Law, because applications of these tests to large data sets may falsely lead auditors to believe that evidence of fraud exists when in fact there is none. Nigrini (2000) defines this situation as the problem of “excessive power” (p. 75). Similarly, Geyer and Williamson (2004) note that “…one has to be careful in such situations…where the sample size may be very large, for this [Z] test is almost certain to reject
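The "excessive power" problem is easy to demonstrate directly: for a fixed deviation from Benford proportions, the chi-square statistic grows linearly with the sample size, so a huge data set gets rejected even when its digit distribution is essentially unchanged. A minimal sketch (not the bootstrap procedure the paper compares against):

```python
import math

# Benford's law: P(leading digit = d) = log10(1 + 1/d)
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit(v):
    """First significant digit of a nonzero number."""
    v = abs(v)
    while v < 1:
        v *= 10
    while v >= 10:
        v /= 10
    return int(v)

def benford_chi_square(values):
    """Pearson chi-square statistic of the leading-digit counts against
    Benford's law (8 degrees of freedom)."""
    n = len(values)
    counts = {d: 0 for d in range(1, 10)}
    for v in values:
        counts[leading_digit(v)] += 1
    return sum((counts[d] - n * BENFORD[d]) ** 2 / (n * BENFORD[d])
               for d in range(1, 10))

data = [2 ** k for k in range(200)]   # powers of 2 follow Benford closely
s1 = benford_chi_square(data)
s10 = benford_chi_square(data * 10)   # identical proportions, 10x the sample size
# s10 is exactly 10 * s1: the statistic scales with n although the fit is unchanged
```

This is the scaling behavior behind the caveats of Cho and Gaines (2007) and Geyer and Williamson (2004) quoted above.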
Adaptive Distributed MIMO Multihop Networks with Optimized Resource Allocation
"... Recently, there has been an increasing interest in applying traditional pointtopoint multipleinput multipleoutput (MIMO) techniques to multihop wireless relaying networks to support higher endtoend (e2e) data rates and to provide a better user experience [1]–[5]. By the concept of virtual anten ..."
Abstract
Recently, there has been increasing interest in applying traditional point-to-point multiple-input multiple-output (MIMO) techniques to multihop wireless relaying networks to support higher end-to-end (e2e) data rates and to provide a better user experience [1]–[5]. Through the concept of the virtual antenna array (VAA), spatially separated relaying nodes can utilize the capacity improvements offered by MIMO transmission techniques. For example, the application of distributed space-time codes was proven in [5] to significantly improve the data rate in multihop networks. Figure 1 depicts a distributed MIMO multihop network, where one source communicates with one destination via a number of relaying VAAs in multiple hops. Spatially adjacent nodes in a VAA receive data from the previous VAA and relay the message to the consecutive VAA until the destination is reached. The general concept of distributed MIMO multihop communication systems has been analyzed in [4] and [5]. (Digital Object Identifier 10.1109/MVT.2009.933472)
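Why resource allocation matters in such a chain can be seen from a simple bottleneck model: under a decode-and-forward assumption, the e2e rate of a multihop chain is limited by its weakest hop, so resources should be steered toward the worst link. This is an illustrative idealization, not the article's optimization; the hop model treats a MIMO link as independent spatial streams at a common SNR.

```python
import math

def hop_rate(bandwidth_hz, snr, streams=1):
    """Idealized Shannon rate of one hop, modeling the MIMO link as
    `streams` independent spatial channels at linear SNR `snr`."""
    return streams * bandwidth_hz * math.log2(1 + snr)

def e2e_rate(hop_rates):
    """A decode-and-forward multihop chain is bottlenecked by its weakest hop."""
    return min(hop_rates)

# Two hops over 1 MHz with 2 spatial streams each; the low-SNR hop dominates,
# so extra power or bandwidth helps only if allocated to that hop.
rates = [hop_rate(1e6, 15, streams=2), hop_rate(1e6, 3, streams=2)]
bottleneck = e2e_rate(rates)
```

Optimized resource allocation in this picture means equalizing hop rates so no single link wastes the capacity of the others.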