Results 11 – 20 of 1,427,973
The Laplacian Pyramid as a Compact Image Code
, 1983
"... We describe a technique for image encoding in which local operators of many scales but identical shape serve as the basis functions. The representation differs from established techniques in that the code elements are localized in spatial frequency as well as in space. Pixeltopixel correlations a ..."
Abstract

Cited by 1364 (12 self)
 Add to MetaCart
are first removed by subtracting a lowpass filtered copy of the image from the image itself. The result is a net data compression since the difference, or error, image has low variance and entropy, and the lowpass filtered image may represented at reduced sample density. Further data compression
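The encoding step quoted above maps directly onto a few lines of array code. The sketch below is a minimal illustration, not the authors' implementation; it assumes a NumPy/SciPy environment, and the 2x decimation and Gaussian blur width are arbitrary illustrative choices.

    import numpy as np
    from scipy.ndimage import gaussian_filter, zoom

    def laplacian_pyramid(image, levels=4, sigma=1.0):
        """Each band-pass level is the difference between an image and a
        low-pass filtered, subsampled-then-expanded copy of itself."""
        pyramid, current = [], image.astype(float)
        for _ in range(levels):
            low = gaussian_filter(current, sigma)        # low-pass filtered copy
            reduced = low[::2, ::2]                       # reduced sample density
            expanded = zoom(reduced, 2.0, order=1)[:current.shape[0], :current.shape[1]]
            pyramid.append(current - expanded)            # low-variance error image
            current = reduced
        pyramid.append(current)                           # final coarse residual
        return pyramid

Reconstruction simply adds each error image back onto the expanded next-coarser level, so the pyramid is an exact, if slightly overcomplete, code.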
Discovery of Grounded Theory
, 1967
"... Abstract: This paper outlines my concerns with Qualitative Data Analysis ’ (QDA) numerous remodelings of Grounded Theory (GT) and the subsequent eroding impact. I cite several examples of the erosion and summarize essential elements of classic GT methodology. It is hoped that the article will clarif ..."
Abstract

Cited by 2498 (12 self)
 Add to MetaCart
Abstract: This paper outlines my concerns with Qualitative Data Analysis ’ (QDA) numerous remodelings of Grounded Theory (GT) and the subsequent eroding impact. I cite several examples of the erosion and summarize essential elements of classic GT methodology. It is hoped that the article
Learning the Kernel Matrix with Semidefinite Programming
, 2002
"... Kernelbased learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information ..."
Abstract

Cited by 767 (21 self)
 Add to MetaCart
Kernelbased learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information
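The paper goes on to learn the kernel matrix itself with semidefinite programming; the fragment below only illustrates the premise stated in the abstract, that the embedding is specified implicitly through pairwise inner products. It is a hypothetical sketch using a Gaussian (RBF) kernel, which is not a choice taken from the paper.

    import numpy as np

    def rbf_gram_matrix(X, gamma=1.0):
        """K[i, j] = exp(-gamma * ||x_i - x_j||^2): the inner product of points
        i and j in the implicit embedding space, computed without ever
        constructing the embedding itself."""
        sq = np.sum(X * X, axis=1)
        sq_dists = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
        return np.exp(-gamma * np.maximum(sq_dists, 0.0))

    X = np.random.randn(6, 3)
    K = rbf_gram_matrix(X)
    # Any valid kernel matrix is symmetric positive semidefinite -- exactly the
    # constraint a semidefinite program must preserve while learning K.
    print(np.all(np.linalg.eigvalsh(K) >= -1e-9))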
Reconstruction and Representation of 3D Objects with Radial Basis Functions
 Computer Graphics (SIGGRAPH ’01 Conf. Proc.), pages 67–76. ACM SIGGRAPH
, 2001
"... We use polyharmonic Radial Basis Functions (RBFs) to reconstruct smooth, manifold surfaces from pointcloud data and to repair incomplete meshes. An object's surface is defined implicitly as the zero set of an RBF fitted to the given surface data. Fast methods for fitting and evaluating RBFs al ..."
Abstract

Cited by 500 (1 self)
 Add to MetaCart
allow us to model large data sets, consisting of millions of surface points, by a single RBFpreviously an impossible task. A greedy algorithm in the fitting process reduces the number of RBF centers required to represent a surface and results in significant compression and further computational
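As a rough illustration of the fitting step described above (and not the paper's fast or greedy method), an RBF interpolant can be obtained by solving one dense linear system. The biharmonic kernel phi(r) = r and the omission of the usual low-degree polynomial term are simplifications made here for brevity.

    import numpy as np

    def fit_rbf(centers, values):
        """Solve for weights w so that f(x) = sum_i w_i * phi(||x - c_i||),
        with phi(r) = r, interpolates the prescribed values at the centers."""
        r = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
        return np.linalg.solve(r, values)

    def eval_rbf(x, centers, weights):
        return weights @ np.linalg.norm(x[None, :] - centers, axis=-1)

    # Surface samples get the value 0 and off-surface points along the normals a
    # small positive value, so the surface is recovered as the zero set of f.
    pts = np.random.randn(30, 3)
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)     # samples of a unit sphere
    centers = np.vstack([pts, 1.1 * pts])
    values = np.concatenate([np.zeros(30), np.full(30, 0.1)])
    w = fit_rbf(centers, values)
    print(eval_rbf(np.array([0.0, 0.0, 1.0]), centers, w))  # near zero close to the surface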
Greedy Function Approximation: A Gradient Boosting Machine
 Annals of Statistics
, 2000
"... Function approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions and steepest{descent minimization. A general gradient{descent \boosting" paradigm is developed for additi ..."
Abstract

Cited by 951 (12 self)
 Add to MetaCart
for additive expansions based on any tting criterion. Specic algorithms are presented for least{squares, least{absolute{deviation, and Huber{M loss functions for regression, and multi{class logistic likelihood for classication. Special enhancements are derived for the particular case where the individual
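The “boosting as gradient descent in function space” idea reduces, in the least-squares case, to repeatedly fitting a weak learner to the current residuals. The sketch below assumes scikit-learn regression trees as the weak learners; it illustrates the general paradigm rather than Friedman's exact algorithm.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def gradient_boost_ls(X, y, n_stages=200, learning_rate=0.1, max_depth=2):
        """Stagewise additive expansion F_m = F_{m-1} + lr * h_m, where each h_m
        is fit to the negative gradient of the loss. For squared error the
        negative gradient at stage m is simply the residual y - F_{m-1}(x)."""
        F0 = float(np.mean(y))                       # optimal constant start
        F = np.full(len(y), F0)
        learners = []
        for _ in range(n_stages):
            residual = y - F                         # pseudo-response (negative gradient)
            h = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
            F += learning_rate * h.predict(X)
            learners.append(h)
        return F0, learners

    def predict(F0, learners, X, learning_rate=0.1):
        return F0 + learning_rate * sum(h.predict(X) for h in learners)

Swapping the loss only changes the pseudo-response: absolute error uses the sign of the residual, and the logistic likelihood uses class-probability residuals.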
Mixtures of Probabilistic Principal Component Analysers
, 1998
"... Principal component analysis (PCA) is one of the most popular techniques for processing, compressing and visualising data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a com ..."
Abstract

Cited by 526 (6 self)
 Add to MetaCart
Principal component analysis (PCA) is one of the most popular techniques for processing, compressing and visualising data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a
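The building block of the mixture is probabilistic PCA, which has a closed-form maximum-likelihood solution. The sketch below fits that single local model only; the paper combines many of them via EM over soft assignments, which is not shown here.

    import numpy as np

    def ppca_ml(X, q):
        """Maximum-likelihood PPCA (Tipping & Bishop's closed form): the noise
        variance is the mean of the discarded eigenvalues of the sample
        covariance, and the loadings are scaled leading eigenvectors."""
        Xc = X - X.mean(axis=0)
        eigval, eigvec = np.linalg.eigh(Xc.T @ Xc / len(X))
        eigval, eigvec = eigval[::-1], eigvec[:, ::-1]          # descending order
        sigma2 = eigval[q:].mean()                               # lost variance -> noise
        W = eigvec[:, :q] * np.sqrt(np.maximum(eigval[:q] - sigma2, 0.0))
        return W, sigma2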
Beyond Market Baskets: Generalizing Association Rules To Dependence Rules
, 1998
"... One of the more wellstudied problems in data mining is the search for association rules in market basket data. Association rules are intended to identify patterns of the type: “A customer purchasing item A often also purchases item B. Motivated partly by the goal of generalizing beyond market bask ..."
Abstract

Cited by 620 (6 self)
 Add to MetaCart
One of the more wellstudied problems in data mining is the search for association rules in market basket data. Association rules are intended to identify patterns of the type: “A customer purchasing item A often also purchases item B. Motivated partly by the goal of generalizing beyond market
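For readers unfamiliar with the market-basket setting, the toy example below shows the classical support/confidence view of an association rule; the transactions are invented for illustration, and the paper's contribution is precisely to replace this view with statistically grounded dependence tests.

    # Hypothetical transactions, made up for illustration only.
    transactions = [
        {"bread", "milk"},
        {"bread", "diapers", "beer"},
        {"milk", "diapers", "beer"},
        {"bread", "milk", "diapers"},
        {"bread", "milk", "beer"},
    ]

    def support(itemset):
        """Fraction of baskets containing every item in the itemset."""
        return sum(itemset <= t for t in transactions) / len(transactions)

    def confidence(antecedent, consequent):
        """Estimate of P(consequent | antecedent): how often a customer buying
        the antecedent items also buys the consequent items."""
        return support(antecedent | consequent) / support(antecedent)

    print(confidence({"diapers"}, {"beer"}))   # rule "diapers -> beer"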
Data Compression Using Adaptive Coding and Partial String Matching
 IEEE TRANSACTIONS ON COMMUNICATIONS
, 1984
"... The recently developed technique of arithmetic coding, in conjunction with a Markov model of the source, is a powerful method of data compression in situations where a linear treatment is inappropriate. Adaptive coding allows the model to be constructed dynamically by both encoder and decoder during ..."
Abstract

Cited by 439 (20 self)
 Add to MetaCart
The recently developed technique of arithmetic coding, in conjunction with a Markov model of the source, is a powerful method of data compression in situations where a linear treatment is inappropriate. Adaptive coding allows the model to be constructed dynamically by both encoder and decoder
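The adaptive-model idea in the abstract, that both ends build the statistics on the fly so no model has to be transmitted, can be shown with a tiny order-1 model. The arithmetic coder itself and the escape-to-shorter-context mechanism that gives partial string matching (PPM) its power are omitted; the class below is only an illustrative sketch.

    from collections import defaultdict

    class AdaptiveOrder1Model:
        """Symbol statistics conditioned on the previous symbol. Encoder and
        decoder start from identical (uniform) counts and apply identical
        updates after each symbol, so their models never diverge."""
        def __init__(self, alphabet):
            self.alphabet = list(alphabet)
            self.counts = defaultdict(lambda: {s: 1 for s in self.alphabet})

        def probability(self, context, symbol):
            ctx = self.counts[context]
            return ctx[symbol] / sum(ctx.values())

        def update(self, context, symbol):
            self.counts[context][symbol] += 1

    model = AdaptiveOrder1Model("ab")
    print(model.probability("a", "b"))   # 0.5 before any evidence
    model.update("a", "b")
    print(model.probability("a", "b"))   # grows as "ab" pairs are observed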
Photobook: Content-Based Manipulation of Image Databases
, 1995
"... We describe the Photobook system, which is a set of interactive tools for browsing and searching images and image sequences. These query tools differ from those used in standard image databases in that they make direct use of the image content rather than relying on text annotations. Direct search o ..."
Abstract

Cited by 534 (0 self)
 Add to MetaCart
on image content is made possible by use of semanticspreserving image compression, which reduces images to a small set of perceptuallysignificant coefficients. We describe three types of Photobook descriptions in detail: one that allows search based on appearance, one that uses 2D shape, and a third
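The abstract's key point is that search runs over compact, content-derived coefficients rather than text tags. The fragment below is a generic stand-in: it compresses each image to a fixed block of low-frequency FFT magnitudes and ranks by distance. Photobook's actual descriptions are appearance-, shape-, and texture-specific models, so treat this purely as a hypothetical illustration.

    import numpy as np

    def coefficient_signature(image, block=8):
        """Reduce an image to a small, fixed set of coefficients: here the
        low-frequency corner of the 2-D FFT magnitude spectrum."""
        spectrum = np.abs(np.fft.fft2(image))
        return spectrum[:block, :block].ravel()

    def best_match(query, database_images):
        """Rank stored images by signature distance instead of text annotations."""
        q = coefficient_signature(query)
        dists = [np.linalg.norm(q - coefficient_signature(img)) for img in database_images]
        return int(np.argmin(dists))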
Quantization
 IEEE TRANS. INFORM. THEORY
, 1998
"... The history of the theory and practice of quantization dates to 1948, although similar ideas had appeared in the literature as long ago as 1898. The fundamental role of quantization in modulation and analogtodigital conversion was first recognized during the early development of pulsecode modula ..."
Abstract

Cited by 871 (12 self)
 Add to MetaCart
provide a theory for quantization as analogtodigital conversion and as data compression. Beginning with these three papers of fifty years ago, we trace the history of quantization from its origins through this decade, and we survey the fundamentals of the theory and many of the popular and promising
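The survey's subject can be grounded with the simplest case, a uniform scalar quantizer. The snippet below also checks the classical high-resolution approximation that the quantization error variance approaches step**2 / 12 as the step shrinks; the Gaussian test signal is an arbitrary choice.

    import numpy as np

    def uniform_quantize(x, step):
        """Round each sample to the nearest level of a uniform grid: the basic
        operation behind analog-to-digital conversion and lossy compression."""
        return step * np.round(x / step)

    x = np.random.randn(100_000)
    for step in (1.0, 0.5, 0.1):
        err = x - uniform_quantize(x, step)
        print(step, err.var(), step ** 2 / 12)   # error variance vs. step^2/12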