Results 1–10 of 252
Logit models and logistic regressions for social networks: I. An introduction to Markov graphs and p*
 Psychometrika
, 1996
A Guide to the Literature on Learning Probabilistic Networks From Data
, 1996
Abstract

Cited by 172 (0 self)
This literature review discusses different methods under the general rubric of learning Bayesian networks from data, and includes some overlapping work on more general probabilistic networks. Connections are drawn between the statistical, neural network, and uncertainty communities, and between the different methodological communities, such as Bayesian, description length, and classical statistics. Basic concepts for learning and Bayesian networks are introduced and methods are then reviewed. Methods are discussed for learning parameters of a probabilistic network, for learning the structure, and for learning hidden variables. The presentation avoids formal definitions and theorems, as these are plentiful in the literature, and instead illustrates key concepts with simplified examples. Keywords: Bayesian networks, graphical models, hidden variables, learning, learning structure, probabilistic networks, knowledge discovery.
Model-based Geostatistics
 Applied Statistics
, 1998
Abstract

Cited by 96 (4 self)
Conventional geostatistical methodology solves the problem of predicting the realised value of a linear functional of a Gaussian spatial stochastic process, S(x), based on observations Y_i = S(x_i) + Z_i at sampling locations x_i, where the Z_i are mutually independent, zero-mean Gaussian random variables. We describe two spatial applications for which Gaussian distributional assumptions are clearly inappropriate. The first concerns the assessment of residual contamination from nuclear weapons testing on a South Pacific island, in which the sampling method generates spatially indexed Poisson counts conditional on an unobserved spatially varying intensity of radioactivity; we conclude that a conventional geostatistical analysis oversmooths the data and underestimates the spatial extremes of the intensity. The second application provides a description of spatial variation in the risk of campylobacter infections relative to other enteric infections in part of North Lancashire and South C...
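The generating model in this abstract can be sketched in a few lines. This is a minimal, hypothetical simulation: the locations, the exponential covariance parameters, and the log-link for the Poisson intensity are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D sampling locations
x = np.linspace(0.0, 10.0, 50)

# Gaussian spatial process S(x) with an (assumed) exponential covariance:
# cov(S(x_i), S(x_j)) = sigma2 * exp(-|x_i - x_j| / phi)
sigma2, phi = 1.0, 2.0
d = np.abs(x[:, None] - x[None, :])
S = rng.multivariate_normal(np.zeros_like(x), sigma2 * np.exp(-d / phi))

# Conventional Gaussian model: Y_i = S(x_i) + Z_i with iid zero-mean Z_i
tau2 = 0.25
Y_gauss = S + rng.normal(0.0, np.sqrt(tau2), size=x.size)

# Model-based alternative for count data (as in the contamination example):
# Poisson counts conditional on an unobserved log-Gaussian intensity
Y_pois = rng.poisson(np.exp(S))

print(Y_gauss.shape, Y_pois.shape)
```

The point of the second sampling step is exactly the paper's motivation: the counts are non-Gaussian, so kriging them directly oversmooths the latent intensity.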
Computer Experiments
, 1996
Abstract

Cited by 67 (5 self)
Introduction Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gases in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, and so on. Some of the most widely used computer models, and the ones that lead us to work in this area, arise in the design of the semiconductors used in the computers themselves. A process simulator starts with a data structure representing an unprocessed piece of silicon and simulates the steps such as oxidation, etching and ion injection that produce a semiconductor device such as a transistor. A device simulator takes a description of such a device and simulates the flow of current through it under varying conditions to determine properties of the device such as its switching speed and the critical voltage at which it switches. A circuit simulator takes a list of devices and the
Bayesian and Regularization Methods for Hyperparameter Estimation in Image Restoration
 IEEE Trans. Image Processing
, 1999
Abstract

Cited by 65 (26 self)
In this paper, we propose the application of the hierarchical Bayesian paradigm to the image restoration problem. We derive expressions for the iterative evaluation of the two hyperparameters applying the evidence and maximum a posteriori (MAP) analysis within the hierarchical Bayesian paradigm. We show analytically that the analysis provided by the evidence approach is more realistic and appropriate than the MAP approach for the image restoration problem. We furthermore study the relationship between the evidence and an iterative approach resulting from the set theoretic regularization approach for estimating the two hyperparameters, or their ratio, defined as the regularization parameter. Finally the proposed algorithms are tested experimentally.
Spatial Econometrics
 PALGRAVE HANDBOOK OF ECONOMETRICS: VOLUME 1, ECONOMETRIC THEORY
, 2001
Abstract

Cited by 64 (5 self)
Spatial econometric methods deal with the incorporation of spatial interaction and spatial structure into regression analysis. The field has seen a recent and rapid growth spurred both by theoretical concerns and by the need to be able to apply econometric models to emerging large geocoded data bases. The review presented in this chapter outlines the basic terminology and discusses in some detail the specification of spatial effects, estimation of spatial regression models, and specification tests for spatial effects.
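As a hedged illustration of what "incorporating spatial interaction into regression" means, the sketch below simulates a first-order spatial lag model, y = rho*W*y + X*beta + eps, with a hypothetical chain-neighbour weights matrix. All parameter values are invented for illustration; the chapter itself covers many more specifications and estimators.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 25

# Hypothetical row-standardized spatial weights: each unit's chain neighbours
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            W[i, j] = 1.0
W /= W.sum(axis=1, keepdims=True)

# Spatial lag model: y = rho * W y + X beta + eps
rho, beta = 0.5, np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = rng.normal(scale=0.1, size=n)

# Reduced form: y = (I - rho W)^{-1} (X beta + eps)
y = np.linalg.solve(np.eye(n) - rho * W, X @ beta + eps)

# Ignoring the spatial lag and fitting plain OLS gives biased estimates,
# which is what the specification tests mentioned above are meant to detect
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_ols)
```

The solve-based reduced form is only viable for small n; large geocoded data sets are precisely why specialized estimation methods are needed.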
Multiresolution optimal interpolation and statistical analysis of TOPEX/POSEIDON satellite altimetry
 IEEE Trans. Geosci. Remote Sensing
, 1995
Abstract

Cited by 58 (33 self)
A recently developed multiresolution estimation framework offers the possibility of highly efficient statistical analysis, interpolation, and smoothing of extremely large data sets in a multiscale fashion. This framework enjoys a number of advantages not shared by other statistically based methods. In particular, the algorithms resulting from this framework have complexity that scales only linearly with problem size, yielding constant complexity load per grid point independent of problem size. Furthermore, these algorithms directly provide interpolated estimates at multiple resolutions, accompanying error variance statistics of use in assessing resolution/accuracy tradeoffs and in detecting statistically significant anomalies, and maximum likelihood estimates of parameters such as spectral power law coefficients. Moreover, the efficiency of these algorithms is completely insensitive to irregularities in the sampling or spatial distribution of measurements and to heterogeneities in measurement errors or model parameters. For these reasons this approach has the potential of being an effective tool in a variety of remote sensing problems. In this paper, we demonstrate a realization of this potential by applying the multiresolution framework to a problem of considerable current interest: the interpolation and statistical analysis of ocean surface data from the TOPEX/POSEIDON altimeter.
Density biased sampling: an improved method for data mining and clustering
 Proceedings of the 2000 ACM SIGMOD international conference on Management of data, pp.82–92, 2000
Abstract

Cited by 57 (4 self)
Data mining in large data sets often requires a sampling or summarization step to form an in-core representation of the data that can be processed more efficiently. Uniform random sampling is frequently used in practice and also frequently criticized because it will miss small clusters. Many natural phenomena are known to follow Zipf’s distribution, and the inability of uniform sampling to find small clusters is of practical concern. Density Biased Sampling is proposed to probabilistically undersample dense regions and oversample light regions. A weighted sample is used to preserve the densities of the original data. Density biased sampling naturally includes uniform sampling as a special case. A memory-efficient algorithm is proposed that approximates density biased sampling using only a single scan of the data. We empirically evaluate density biased sampling using synthetic data sets that exhibit varying cluster size distributions, finding up to a factor of six improvement over uniform sampling.
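A minimal sketch of the idea, assuming a simple in-memory histogram density estimate in place of the paper's hash-based single-scan method; the bin count, target sample size, and cluster parameters are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1-D data: one large dense cluster and one small sparse cluster
data = np.concatenate([rng.normal(0.0, 0.1, 10_000),
                       rng.normal(5.0, 0.1, 100)])

# Estimate local density by binning (stand-in for the single-scan method)
bins = np.linspace(data.min(), data.max(), 51)
idx = np.clip(np.digitize(data, bins) - 1, 0, 49)
counts = np.bincount(idx, minlength=50).astype(float)

# Density biased inclusion probability: aim for an equal expected sample
# size per occupied bin, so dense bins are undersampled and sparse bins
# oversampled; probabilities are clipped at 1
target = 200  # desired expected sample size (assumed)
occupied = counts > 0
p_bin = np.zeros(50)
p_bin[occupied] = (target / occupied.sum()) / counts[occupied]
p_bin = np.minimum(p_bin, 1.0)

keep = rng.random(data.size) < p_bin[idx]
sample = data[keep]
weights = 1.0 / p_bin[idx[keep]]  # inverse-probability weights preserve densities

# The small cluster near x = 5 survives, which a uniform sample of the
# same size would likely miss
print(keep.sum(), (sample > 4).sum())
```

With p_bin set to a constant, the scheme reduces to uniform sampling, matching the special-case claim in the abstract.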
Analysis and Decomposition of Spatial Variation in Integrated Circuit Processes and Devices
 IEEE Transactions on Semiconductor Manufacturing
, 1997
Abstract

Cited by 54 (5 self)
Variation is a key concern in semiconductor manufacturing and is manifest in several forms. Spatial variation across each wafer results from equipment or process limitations, and variation within each die may be exacerbated further by complex pattern dependencies. Spatial variation information is important not only for process optimization and control, but also for design of circuits that are robust to such variation. Systematic and random components of the variation must be identified, and models relating the spatial variation to specific process and pattern causes are needed. In this work, extraction and modeling methods are described for wafer-level, die-level, and wafer-die interaction contributions to spatial variation. Wafer-level estimation methods include filtering, spline, and regression based approaches. Die-level (or intra-die) variation can be extracted using spatial Fourier transform methods; important issues include spectral interpolation and sampling requirements. Finally, the interaction between wafer- and die-level effects is important to fully capture and separate systematic versus random variation; spline and frequency-based methods are proposed for this modeling. Together, these provide an effective collection of methods to identify and model spatial variation for future use in process control to reduce systematic variation, and in process/device design to produce more robust circuits.
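A toy one-dimensional version of the wafer-level/die-level decomposition described above, using moving-average filtering for the smooth wafer-level trend and a Fourier transform to locate the periodic die-level component. The profile shape, die pitch, and filter width are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 256

# Hypothetical scan across a wafer: smooth wafer-level trend, a periodic
# die-level (pattern-dependent) component, and random noise
pos = np.linspace(-1.0, 1.0, n)
wafer = 0.5 * pos**2                       # wafer-level bowl shape
die = 0.1 * np.sin(2 * np.pi * 16 * pos)   # 32 cycles across the scan
noise = rng.normal(scale=0.02, size=n)
signal = wafer + die + noise

# Wafer-level extraction by moving-average filtering (one of the
# filtering/spline/regression options the abstract mentions)
k = 31
kernel = np.ones(k) / k
wafer_hat = np.convolve(np.pad(signal, k // 2, mode="edge"), kernel, mode="valid")

# Die-level extraction from the residual via the Fourier transform:
# the dominant nonzero frequency bin identifies the die repetition rate
resid = signal - wafer_hat
spectrum = np.fft.rfft(resid)
peak = np.argmax(np.abs(spectrum[1:])) + 1

print(peak)
```

What remains after subtracting both components is the random part, which is the systematic-versus-random separation the abstract is after.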
Chain Graph Models and their Causal Interpretations
 B
, 2001
Abstract

Cited by 48 (4 self)
Chain graphs are a natural generalization of directed acyclic graphs (DAGs) and undirected graphs. However, the apparent simplicity of chain graphs belies the subtlety of the conditional independence hypotheses that they represent. There are a number of simple and apparently plausible, but ultimately fallacious interpretations of chain graphs that are often invoked, implicitly or explicitly. These interpretations also lead to flawed methods for applying background knowledge to model selection. We present a valid interpretation by showing how the distribution corresponding to a chain graph may be generated as the equilibrium distribution of dynamic models with feedback. These dynamic interpretations lead to a simple theory of intervention, extending the theory developed for DAGs. Finally, we contrast chain graph models under this interpretation with simultaneous equation models which have traditionally been used to model feedback in econometrics. Keywords: Causal model; cha...