Results 1–10 of 2,725,061
Markov chain sampling methods for Dirichlet process mixture models
 JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS
, 2000
Incorporating nonlocal information into information extraction systems by Gibbs sampling
 IN ACL
, 2005
"... Most current statistical natural language processing models use only local features so as to permit dynamic programming in inference, but this makes them unable to fully account for the long distance structure that is prevalent in language use. We show how to solve this dilemma with Gibbs sampling, ..."
Cited by 725 (25 self)
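The abstract above hinges on Gibbs sampling: repeatedly resampling one variable from its conditional distribution given all the others. A minimal sketch of that update rule, on a toy chain of binary labels rather than the paper's CRF-style model (the function name and the `coupling` parameter are illustrative, not from the paper):

```python
import math
import random

def gibbs_chain(n, coupling=1.0, sweeps=200, seed=0):
    """Gibbs-sample binary labels x_1..x_n from an unnormalized chain
    model p(x) ∝ exp(coupling * #{i : x_i == x_{i+1}}), by resampling
    each x_i from its full conditional given its neighbors."""
    rng = random.Random(seed)
    x = [rng.choice([0, 1]) for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            def weight(v):
                # Only the two neighboring factors involve x_i.
                s = 0.0
                if i > 0:
                    s += coupling * (v == x[i - 1])
                if i < n - 1:
                    s += coupling * (v == x[i + 1])
                return math.exp(s)
            p1 = weight(1) / (weight(0) + weight(1))
            x[i] = 1 if rng.random() < p1 else 0
    return x
```

Each update is local, but repeated sweeps let information propagate across the whole sequence, which is the point the abstract makes against purely local dynamic-programming inference.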
Tinydb: An acquisitional query processing system for sensor networks
 ACM Trans. Database Syst
, 2005
"... We discuss the design of an acquisitional query processor for data collection in sensor networks. Acquisitional issues are those that pertain to where, when, and how often data is physically acquired (sampled) and delivered to query processing operators. By focusing on the locations and costs of acq ..."
Cited by 625 (8 self)
Static Scheduling of Synchronous Data Flow Programs for Digital Signal Processing
 IEEE TRANSACTIONS ON COMPUTERS
, 1987
"... Large grain data flow (LGDF) programming is natural and convenient for describing digital signal processing (DSP) systems, but its runtime overhead is costly in real-time or cost-sensitive applications. In some situations, designers are not willing to squander computing resources for the sake of pro ..."
Cited by 599 (37 self)
Synchronous data flow (SDF) differs from traditional data flow in that the amount of data produced and consumed by a data flow node is specified a priori for each input and output. This is equivalent to specifying the relative sample rates in a signal processing system. This means that the scheduling of SDF nodes need
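The a priori rates are what make static scheduling possible: for each arc, the balance equation `produce * src_fires == consume * sink_fires` fixes the relative firing counts at compile time. A minimal sketch for a single arc (the function name is mine, not the paper's):

```python
from math import gcd

def arc_repetitions(produce, consume):
    """For one SDF arc where the source emits `produce` tokens per firing
    and the sink absorbs `consume` per firing, return the smallest firing
    counts (src_fires, sink_fires) that balance the arc.  Because rates
    are fixed a priori, these counts can be computed before runtime."""
    lcm = produce * consume // gcd(produce, consume)
    return lcm // produce, lcm // consume
```

For example, `arc_repetitions(2, 3)` returns `(3, 2)`: firing the source three times and the sink twice moves exactly six tokens, so a periodic schedule with bounded buffers exists.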
Some methods for classification and analysis of multivariate observations
 In 5th Berkeley Symposium on Mathematical Statistics and Probability
, 1967
"... The main purpose of this paper is to describe a process for partitioning an N-dimensional population into k sets on the basis of a sample. The process, which is called 'k-means,' appears to give partitions which are reasonably ..."
Cited by 3054 (3 self)
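As a concrete illustration of the partitioning process the abstract describes (alternate between assigning each point to its nearest center and moving each center to its cluster's mean), here is a toy sketch in the simplified modern formulation, not MacQueen's exact 1967 procedure:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Partition `points` (a list of (x, y) tuples) into k clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: (p[0] - centers[j][0]) ** 2
                                  + (p[1] - centers[j][1]) ** 2)
            clusters[i].append(p)
        # Update step: each center moves to its cluster's mean.
        for j, c in enumerate(clusters):
            if c:
                centers[j] = (sum(p[0] for p in c) / len(c),
                              sum(p[1] for p in c) / len(c))
    return clusters
```

On well-separated data the alternation converges in a handful of iterations; in general it only reaches a local optimum, which is consistent with the abstract's "reasonably" hedge.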
On the Use of Windows for Harmonic Analysis With the Discrete Fourier Transform
 Proc. IEEE
, 1978
"... This paper makes available a concise review of data windows and their effect on the detection of harmonic signals in the presence of broad-band noise and of nearby strong harmonic interference. The compromise consists of applying windows to the sampled data set, or equivalently, smoothing the spectral samples. The two operations to which we subject ..."
Cited by 668 (0 self)
There is much signal processing devoted to detection and estimation. Detection is the task of determining if a specific signal set is present in an observation, while estimation is the task of obtaining the values of the parameters
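The core trade-off the paper surveys, applying a window to the sampled data to suppress spectral leakage at the cost of main-lobe width, can be seen in a few lines. The Hann window and the naive DFT below are standard; the specific tone and bin ranges are my own illustration:

```python
import cmath
import math

def hann(n):
    """Hann window coefficients (one standard choice among those surveyed)."""
    return [0.5 - 0.5 * math.cos(2 * math.pi * k / (n - 1)) for k in range(n)]

def dft_mag(x):
    """Magnitudes of the DFT of a real sequence (naive O(n^2) form)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                    for t in range(n))) for f in range(n)]

n = 64
# A tone that does NOT fall on a DFT bin (5.3 cycles per record),
# so with a rectangular window its energy leaks into distant bins.
tone = [math.sin(2 * math.pi * 5.3 * t / n) for t in range(n)]
rect = dft_mag(tone)
windowed = dft_mag([w * s for w, s in zip(hann(n), tone)])

# Total magnitude in bins far from both spectral images of the tone:
leak_rect = sum(rect[12:53])
leak_hann = sum(windowed[12:53])
```

Comparing `leak_hann` to `leak_rect` shows the windowed spectrum leaking far less energy away from the tone, at the price of a broader main lobe around it.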
On the Resemblance and Containment of Documents
 In Compression and Complexity of Sequences (SEQUENCES'97)
, 1997
"... Given two documents A and B we define two mathematical notions: their resemblance r(A, B) and their containment c(A, B) that seem to capture well the informal notions of "roughly the same" and "roughly contained." The basic idea is to reduce these issues to set intersection probl ..."
Cited by 504 (6 self)
problems that can be easily evaluated by a process of random sampling that can be done independently for each document. Furthermore, the resemblance can be evaluated using a fixed size sample for each document.
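The fixed-size sampling idea can be sketched as min-wise hashing: hash every shingle under several seeded hash functions and keep only each function's minimum, so the fraction of agreeing minima estimates the resemblance r(A, B). This sketch assumes word 3-grams as shingles and MD5 as the hash family (both choices are mine, for illustration):

```python
import hashlib

def shingles(text, w=3):
    """The set of contiguous word w-grams (shingles) of a document."""
    words = text.split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def sketch(shingle_set, num_hashes=64):
    """Fixed-size sample: for each of `num_hashes` seeded hash functions,
    keep the minimum hash value over the (nonempty) shingle set."""
    return [min(int(hashlib.md5(f"{seed}:{s}".encode()).hexdigest(), 16)
                for s in shingle_set)
            for seed in range(num_hashes)]

def estimated_resemblance(sk_a, sk_b):
    """Fraction of hash functions whose minima agree; this estimates
    |S(A) ∩ S(B)| / |S(A) ∪ S(B)|, i.e. the resemblance r(A, B)."""
    return sum(a == b for a, b in zip(sk_a, sk_b)) / len(sk_a)
```

Each document is sampled independently, so sketches can be computed once per document and compared pairwise later, which is the property the snippet highlights.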
A Sequential Algorithm for Training Text Classifiers
, 1994
"... The ability to cheaply train text classifiers is critical to their use in information retrieval, content analysis, natural language processing, and other tasks involving data which is partly or fully textual. An algorithm for sequential sampling during machine learning of statistical classifiers was ..."
Cited by 630 (10 self)
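The sequential-sampling loop can be sketched generically: at each round, label the pool example the current classifier is least certain about. Here `score` and `oracle` are hypothetical callbacks (a probability-of-relevance estimate and a human labeler), not names from the paper:

```python
def uncertainty_sampling(pool, labeled, score, oracle, rounds=5):
    """Sequential (uncertainty) sampling sketch.  `score(labeled, x)`
    returns the current classifier's P(relevant | x) given the labeled
    data so far; `oracle(x)` returns the true label.  Each round labels
    the example whose score is closest to 0.5 (maximum uncertainty)."""
    pool = list(pool)
    for _ in range(rounds):
        if not pool:
            break
        x = min(pool, key=lambda d: abs(score(labeled, d) - 0.5))
        pool.remove(x)
        labeled.append((x, oracle(x)))
    return labeled
```

The point, matching the abstract, is that labeling effort concentrates where the classifier is unsure, which tends to need far fewer labels than uniform sampling.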
The Lumigraph
 IN PROCEEDINGS OF SIGGRAPH 96
, 1996
"... This paper discusses a new method for capturing the complete appearance of both synthetic and real world objects and scenes, representing this information, and then using this representation to render images of the object from new camera positions. Unlike the shape capture process traditionally used ..."
Cited by 1025 (39 self)
in computer vision and the rendering process traditionally used in computer graphics, our approach does not rely on geometric representations. Instead we sample and reconstruct a 4D function, which we call a Lumigraph. The Lumigraph is a subset of the complete plenoptic function that describes the flow
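At its core the representation is a sampled 4D function that is later reconstructed at arbitrary coordinates. A toy version with nearest-neighbor reconstruction; the paper's actual ray-space parameterization and interpolation filters are more involved, and these names are mine:

```python
def make_lumigraph(ns, nt, nu, nv, f):
    """Tabulate a function f(s, t, u, v) on a regular 4D grid, a
    discretized stand-in for the continuous plenoptic subset."""
    return [[[[f(s, t, u, v) for v in range(nv)]
              for u in range(nu)]
             for t in range(nt)]
            for s in range(ns)]

def sample_nearest(L, s, t, u, v):
    """Reconstruct the sampled 4D function at fractional coordinates by
    nearest-neighbor lookup (a real renderer would interpolate)."""
    clamp = lambda x, n: min(n - 1, max(0, round(x)))
    return L[clamp(s, len(L))][clamp(t, len(L[0]))] \
            [clamp(u, len(L[0][0]))][clamp(v, len(L[0][0][0]))]
```

Sampling on capture and reconstruction on render is exactly the pipeline the snippet contrasts with geometry-based approaches.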
A Metrics Suite for Object Oriented Design
, 1994
"... Given the central role that software development plays in the delivery and application of information technology, managers are increasingly focusing on process improvement in the software development area. This demand has spurred the provision of a number of new and/or improved approaches to softwa ..."
Cited by 1104 (3 self)
of measurement principles. An automated data collection tool was then developed and implemented to collect an empirical sample of these metrics at two field sites in order to demonstrate their feasibility and suggest ways in which managers may use these metrics for process improvement.