Results 1–10 of 2,445,275
Graphical models, exponential families, and variational inference
2008
"... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields, including bioinformatics, communication theory, statistical physics, combinatorial optimization, signal and image processing, information retrieval and statistical machine learning. Many problems that arise in specific instances, including the key problems of computing marginals and modes, ..."
Cited by 800 (26 self)
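The "key problems of computing marginals and modes" mentioned in the abstract can be made concrete with a toy example (our illustration, not code from the survey): on a three-variable chain with hypothetical pairwise potentials, the marginal of the middle variable computed by brute-force summation agrees with the one obtained by passing sum-product messages inward from each end.

```python
import itertools

# Toy binary chain x1 - x2 - x3 with a single (hypothetical) pairwise
# potential psi shared by both edges; the joint is psi(x1,x2)*psi(x2,x3).
psi = {(a, b): [[1.0, 0.5], [0.5, 2.0]][a][b] for a in (0, 1) for b in (0, 1)}

def brute_marginal(i):
    """Marginal of variable i by summing over every joint configuration."""
    p = [0.0, 0.0]
    for x in itertools.product((0, 1), repeat=3):
        p[x[i]] += psi[(x[0], x[1])] * psi[(x[1], x[2])]
    z = sum(p)
    return [v / z for v in p]

def sumproduct_marginal_mid():
    """Marginal of the middle variable via forward/backward messages."""
    fwd = [sum(psi[(a, b)] for a in (0, 1)) for b in (0, 1)]  # x1 -> x2
    bwd = [sum(psi[(b, c)] for c in (0, 1)) for b in (0, 1)]  # x3 -> x2
    p = [fwd[b] * bwd[b] for b in (0, 1)]
    z = sum(p)
    return [v / z for v in p]
```

On a chain the message-passing route touches each factor once, which is what makes exact inference linear in the chain length rather than exponential in the number of variables.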
Approximate Signal Processing
1997
"... It is increasingly important to structure signal processing algorithms and systems to allow for trading off between the accuracy of results and the utilization of resources in their implementation. In any particular context, there are typically a variety of heuristic approaches to managing these trade-offs ..."
Cited by 516 (2 self)
Non-Uniform Random Variate Generation
1986
"... This is a survey of the main methods in non-uniform random variate generation, and highlights recent research on the subject. Classical paradigms such as inversion, rejection, guide tables, and transformations are reviewed. We provide information on the expected time complexity of various algorithms ..."
Cited by 1006 (25 self)
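As a concrete illustration of the inversion paradigm this survey reviews (our sketch, not the book's code): when the target CDF inverts in closed form, one uniform variate yields one sample. For the exponential distribution, F(x) = 1 - exp(-lam*x), so F^(-1)(u) = -ln(1-u)/lam.

```python
import math
import random

def sample_exponential(lam, rng=random.random):
    """Inversion method: if U ~ Uniform(0,1), then F^{-1}(U) has CDF F.
    log1p(-u) computes ln(1-u) accurately for u near 0."""
    u = rng()
    return -math.log1p(-u) / lam
```

Inversion needs exactly one uniform per sample; the rejection paradigm also covered by the survey trades that determinism for applicability to densities whose CDF has no closed-form inverse.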
A Sense of Self for Unix Processes
In Proceedings of the 1996 IEEE Symposium on Security and Privacy, 1996
"... A method for anomaly detection is introduced in which "normal" is defined by short-range correlations in a process's system calls. Initial experiments suggest that the definition is stable during normal behavior for standard UNIX programs. Further, it is able to detect several common ..."
Cited by 684 (29 self)
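A minimal sketch of the kind of detector the abstract describes, under the assumption that "short-range correlations" are captured as fixed-length sliding windows of system calls (the scheme used in this line of work); the function names and trace data are our own illustration.

```python
def build_profile(traces, k=3):
    """Record every length-k window of system calls seen in normal runs."""
    normal = set()
    for trace in traces:
        for i in range(len(trace) - k + 1):
            normal.add(tuple(trace[i:i + k]))
    return normal

def mismatches(trace, normal, k=3):
    """Count windows of a new trace that never occurred in the profile;
    a high count suggests anomalous behavior."""
    return sum(
        1
        for i in range(len(trace) - k + 1)
        if tuple(trace[i:i + k]) not in normal
    )
```

A single substituted call perturbs up to k overlapping windows, which is why even short local windows can flag an intrusion that leaves most of the trace unchanged.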
Controlled and automatic human information processing: I. Detection, search, and attention
Psychological Review, 1977
"... A two-process theory of human information processing is proposed and applied to detection, search, and attention phenomena. Automatic processing is activation of a learned sequence of elements in long-term memory that is initiated by appropriate inputs and then proceeds automatically, without subject ..."
Cited by 841 (15 self)
An introduction to variational methods for graphical models
To appear in: M. I. Jordan (Ed.), Learning in Graphical Models
A Signal Processing Approach To Fair Surface Design
1995
"... In this paper we describe a new tool for interactive free-form fair surface design. By generalizing classical discrete Fourier analysis to two-dimensional discrete surface signals (functions defined on polyhedral surfaces of arbitrary topology), we reduce the problem of surface smoothing, or fairing, to low-pass filtering. We describe a very simple surface signal low-pass filter algorithm that applies to surfaces of arbitrary topology. As opposed to other existing optimization-based fairing methods, which are computationally more expensive, this is a linear time and space complexity algorithm. With this algorithm, fairing very large surfaces, such as those obtained from volumetric medical data, becomes affordable. By combining this algorithm with surface subdivision methods we obtain a very effective fair surface design technique. We then extend the analysis, and modify the algorithm accordingly, to accommodate different types of constraints. Some constraints can be imposed without any modification of the algorithm, while others require the solution of a small associated linear system of equations. In particular, vertex location constraints, vertex normal constraints, and surface normal discontinuities across curves embedded in the surface, can be imposed with this technique.
CR Categories and Subject Descriptors: I.3.3 [Computer Graphics]: Picture/image generation - display algorithms; I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling - curve, surface, solid, and object representations; J.6 [Computer Applications]: Computer-Aided Engineering - computer-aided design.
General Terms: Algorithms, Graphics."
Cited by 668 (15 self)
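The linear-time low-pass smoothing the abstract describes can be sketched with a simple neighbor-averaging step; alternating a positive (shrinking) and a negative (inflating) step weight is one standard way to smooth without the shrinkage of plain Laplacian averaging. This is an illustrative sketch on a toy neighbor structure, not the paper's implementation, and the weight values are assumptions.

```python
def smooth_step(points, neighbors, weight):
    """One smoothing pass: move each vertex by `weight` times the vector
    toward the average of its neighbors (linear in mesh size)."""
    out = []
    for i, p in enumerate(points):
        nbrs = neighbors[i]
        avg = [sum(points[j][d] for j in nbrs) / len(nbrs) for d in range(len(p))]
        out.append([p[d] + weight * (avg[d] - p[d]) for d in range(len(p))])
    return out

def lowpass_smooth(points, neighbors, passes=10, lam=0.5, mu=-0.53):
    """Alternate shrinking (lam > 0) and inflating (mu < 0) steps so that
    high-frequency detail is attenuated while overall shape is kept."""
    for _ in range(passes):
        points = smooth_step(points, neighbors, lam)
        points = smooth_step(points, neighbors, mu)
    return points
```

Each pass touches every vertex once per step, which matches the abstract's point that a filter-based approach stays linear in time and space where optimization-based fairing is not.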
Singularity Detection And Processing With Wavelets
IEEE Transactions on Information Theory, 1992
"... Most of a signal's information is often found in irregular structures and transient phenomena. We review the mathematical characterization of singularities with Lipschitz exponents. The main theorems that estimate local Lipschitz exponents of functions from the evolution across scales of their wavelet transform are explained. We then prove that the local maxima of a wavelet transform detect the location of irregular structures and provide numerical procedures to compute their Lipschitz exponents. The wavelet transform of singularities with fast oscillations has a different behavior that we study separately. We show that the size of the oscillations can be measured from the wavelet transform local maxima. It has been shown that one- and two-dimensional signals can be reconstructed from the local maxima of their wavelet transform [14]. As an application, we develop an algorithm that removes white noises by discriminating the noise and the signal singularities through an analysis of their ..."
Cited by 590 (13 self)
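A rough illustration of the idea that detail coefficients localize irregular structures, using a plain Haar-style average/difference split in place of a full wavelet transform; this is our sketch, not the paper's algorithm, and the function names are ours.

```python
def haar_details(signal, levels=3):
    """Repeatedly split into (averages, differences); the difference
    (detail) coefficients play the role of wavelet coefficients."""
    details, approx = [], list(signal)
    for _ in range(levels):
        avg = [(approx[2 * i] + approx[2 * i + 1]) / 2 for i in range(len(approx) // 2)]
        det = [(approx[2 * i] - approx[2 * i + 1]) / 2 for i in range(len(approx) // 2)]
        details.append(det)
        approx = avg
    return details

def singularity_index(signal, levels=3):
    """Return the sample position of the largest finest-scale detail
    coefficient; for a step discontinuity this lands at the jump."""
    det0 = haar_details(signal, levels)[0]
    k = max(range(len(det0)), key=lambda i: abs(det0[i]))
    return 2 * k
```

Smooth regions produce near-zero detail coefficients at every scale, so the coefficients that stay large across scales point at the irregular structures the abstract is concerned with.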
Static Scheduling of Synchronous Data Flow Programs for Digital Signal Processing
IEEE Transactions on Computers, 1987
"... Large grain data flow (LGDF) programming is natural and convenient for describing digital signal processing (DSP) systems, but its runtime overhead is costly in real-time or cost-sensitive applications. In some situations, designers are not willing to squander computing resources for the sake of pro ..."
Cited by 592 (37 self)
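The condition that makes static scheduling of a synchronous data flow graph possible can be sketched as solving the balance equations rate[src] * produce == rate[dst] * consume for a repetition vector, the smallest integer firing counts under which every edge is in balance. This is an illustrative sketch (actor names and the function are ours), assuming a connected, consistently rated graph.

```python
from fractions import Fraction
from math import gcd, lcm

def repetition_vector(edges):
    """edges: list of (src, dst, produce, consume) tuples. Propagate
    rational firing rates from an arbitrary seed actor, check every
    edge for balance, then scale to the smallest integer vector."""
    rate = {edges[0][0]: Fraction(1)}
    changed = True
    while changed:
        changed = False
        for src, dst, p, c in edges:
            if src in rate and dst not in rate:
                rate[dst] = rate[src] * p / c
                changed = True
            elif dst in rate and src not in rate:
                rate[src] = rate[dst] * c / p
                changed = True
    for src, dst, p, c in edges:
        if rate[src] * p != rate[dst] * c:
            raise ValueError("inconsistent rates: no periodic static schedule")
    scale = lcm(*(r.denominator for r in rate.values()))
    ints = {a: int(r * scale) for a, r in rate.items()}
    g = gcd(*ints.values())
    return {a: v // g for a, v in ints.items()}
```

An inconsistent graph (one with no rational solution) would either accumulate tokens without bound or deadlock, which is why the check raises rather than returning a vector.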
Modeling Strategic Relationships for Process Reengineering
1995
"... Existing models for describing a process (such as a business process or a software development process) tend to focus on the "what" or the "how" of the process. For example, a health insurance claim process would typically be described in terms of a number of steps for assessing and approving ..."
Cited by 545 (40 self)