Results 1–10 of 2,960,535
Using Daily Stock Returns: The Case of Event Studies
Journal of Financial Economics, 1985
Cited by 763 (2 self)
"... This paper examines properties of daily stock returns and how the particular characteristics of these data affect event study methodologies. Daily data generally present few difficulties for event studies. Standard procedures are typically well-specified even when special daily data characteristics are ignored. However, recognition of autocorrelation in daily excess returns and changes in their variance conditional on an event can sometimes be advantageous. In addition, tests ignoring cross-sectional dependence can be well-specified and have higher power than tests which account for potential dependence. ..."
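
A minimal sketch of the kind of daily-data event study the abstract discusses: market-model excess returns are estimated over a pre-event window, and a cross-sectional t-test is applied to the event-day abnormal returns. All data, window lengths, and the injected effect are synthetic illustrations, not figures from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n_firms, n_days, event_day = 50, 250, 239    # illustrative sample sizes

    market = rng.normal(0.0005, 0.01, n_days)
    betas = rng.uniform(0.5, 1.5, n_firms)
    returns = 0.0002 + np.outer(betas, market) + rng.normal(0, 0.02, (n_firms, n_days))
    returns[:, event_day] += 0.01                # hypothetical event effect

    est = slice(0, 200)                          # estimation window
    abnormal = np.empty(n_firms)
    for i in range(n_firms):
        # OLS market model: R_i = alpha + beta * R_m + eps
        beta, alpha = np.polyfit(market[est], returns[i, est], 1)
        abnormal[i] = returns[i, event_day] - (alpha + beta * market[event_day])

    # Cross-sectional t-test of the mean event-day abnormal return.
    t_stat = abnormal.mean() / (abnormal.std(ddof=1) / np.sqrt(n_firms))
    print(f"mean AR = {abnormal.mean():.4f}, t = {t_stat:.2f}")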
Compressed sensing
IEEE Trans. Inf. Theory, 2006
Cited by 3600 (24 self)
"... We study the notion of Compressed Sensing (CS) as put forward in [14] and related work [20, 3, 4]. The basic idea behind CS is that a signal or image, unknown but supposed to be compressible by a known transform (e.g. wavelet or Fourier), can be subjected to fewer measurements than the nominal number ... norm. We perform a series of numerical experiments which validate in general terms the basic idea proposed in [14, 3, 5], in the favorable case where the transform coefficients are sparse in the strong sense that the vast majority are zero. We then consider a range of less-favorable cases, in which ..."
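
A minimal sketch of the sparse-recovery idea behind CS, assuming the standard basis-pursuit formulation (minimize the l1 norm of x subject to Ax = y) rewritten as a linear program; the dimensions and Gaussian sensing matrix are illustrative, not taken from the paper's experiments.

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(1)
    n, m, k = 128, 48, 5                         # signal length, measurements, sparsity
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

    A = rng.normal(0, 1 / np.sqrt(m), (m, n))    # random Gaussian sensing matrix
    y = A @ x_true

    # Basis pursuit: min ||x||_1 s.t. A x = y, as an LP with x = u - v, u, v >= 0.
    c = np.ones(2 * n)
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=[(0, None)] * (2 * n), method="highs")
    x_hat = res.x[:n] - res.x[n:]
    print("recovery error:", np.linalg.norm(x_hat - x_true))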
Bayesian Interpolation
Neural Computation, 1991
Cited by 721 (17 self)
"... Although Bayesian analysis has been in use since Laplace, the Bayesian method of model-comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularisation and model-comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other problems. Regularising constants are set by examining their posterior probability distribution. Alternative regularisers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them ..."
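
A minimal sketch of evidence-framework regularisation in the spirit of this paper: the regularising constant alpha and the noise precision beta are re-estimated by the standard fixed-point updates derived from their posterior. The polynomial basis and data below are made up for illustration.

    import numpy as np

    rng = np.random.default_rng(2)
    x = np.linspace(-1, 1, 30)
    t = np.sin(np.pi * x) + rng.normal(0, 0.1, x.size)
    Phi = np.vander(x, 8, increasing=True)       # design matrix of basis functions

    alpha, beta = 1.0, 1.0                       # prior precision, noise precision
    for _ in range(50):
        # Posterior over weights for the current (alpha, beta).
        A = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
        m = beta * np.linalg.solve(A, Phi.T @ t)
        # Fixed-point re-estimation of the regularising constants.
        eigvals = beta * np.linalg.eigvalsh(Phi.T @ Phi)
        gamma = np.sum(eigvals / (eigvals + alpha))   # effective parameter count
        alpha = gamma / (m @ m)
        beta = (x.size - gamma) / np.sum((t - Phi @ m) ** 2)

    print(f"alpha = {alpha:.3f}, beta = {beta:.3f}, gamma = {gamma:.2f}")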
Implementing data cubes efficiently
In SIGMOD, 1996
Cited by 545 (1 self)
"... Decision support applications involve complex queries on very large databases. Since response times should be small, query optimization is critical. Users typically view the data as multidimensional data cubes. Each cell of the data cube is a view consisting of an aggregation of interest, like total ... to materialize. The greedy algorithm performs within a small constant factor of optimal under a variety of models. We then consider the most common case of the hypercube lattice and examine the choice of materialized views for hypercubes in detail, giving some good tradeoffs between the space used ..."
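
A minimal sketch of greedy view selection over a lattice, the kind of algorithm the abstract refers to: repeatedly materialize the view giving the largest total reduction in query cost, with each view answered from its cheapest materialized ancestor. The three-attribute lattice and the view sizes are toy assumptions.

    def greedy_views(sizes, descendants, top, k):
        """Pick k views to materialize besides the top view (greedy benefit)."""
        chosen = {top}

        def cost(v):
            # Cheapest materialized view that can answer v.
            return min(sizes[u] for u in chosen if v in descendants[u])

        for _ in range(k):
            best, best_gain = None, 0
            for w in sizes:
                if w in chosen:
                    continue
                gain = sum(max(0, cost(v) - sizes[w]) for v in descendants[w])
                if gain > best_gain:
                    best, best_gain = w, gain
            if best is None:
                break
            chosen.add(best)
        return chosen

    # Toy hypercube lattice keyed by attribute subsets, with made-up row counts.
    views = {"abc": 100, "ab": 50, "ac": 80, "bc": 90, "a": 20, "b": 30, "c": 40, "": 1}
    desc = {v: {u for u in views if set(u) <= set(v)} for v in views}
    print(greedy_views(views, desc, "abc", k=3))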
Satisfaction and Comparison Income
Journal of Public Economics, 1995
Cited by 616 (55 self)
"... This paper is an attempt to test the hypothesis that utility depends on income relative to a 'comparison' or reference level. Using data on 5,000 British workers, it provides two findings. First, workers' reported satisfaction levels are shown to be inversely related to their comparison wage rates. Second, holding income constant, satisfaction levels are shown to be strongly declining in the level of education. More generally, the paper tries to help begin the task of constructing an economics of job satisfaction. ..."
Discovery of Grounded Theory
1967
Cited by 2485 (12 self)
"... Abstract: This paper outlines my concerns with Qualitative Data Analysis' (QDA) numerous remodelings of Grounded Theory (GT) and the subsequent eroding impact. I cite several examples of the erosion and summarize essential elements of classic GT methodology. It is hoped that the article will clarify my concerns with the continuing enthusiasm but misunderstood embrace of GT by QDA methodologists and serve as a preliminary guide to novice researchers who wish to explore the fundamental principles of GT. Key words: grounded theory, qualitative data analysis, constant comparative method ..."
Marching cubes: A high resolution 3D surface construction algorithm
Computer Graphics, 1987
Cited by 2675 (4 self)
"... We present a new algorithm, called marching cubes, that creates triangle models of constant density surfaces from 3D medical data. Using a divide-and-conquer approach to generate inter-slice connectivity, we create a case table that defines triangle topology. The algorithm processes the 3D medical data ..."
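
A minimal sketch of the case-table indexing at the heart of marching cubes: the eight corner values of each cell are classified against the iso-value and packed into an 8-bit index that would select one of the 256 triangulations. The edge-interpolation and triangle tables are omitted, and the volume is synthetic.

    import numpy as np

    def cube_index(corner_values, iso):
        """Pack inside/outside flags of the 8 cube corners into an index 0..255."""
        index = 0
        for bit, value in enumerate(corner_values):
            if value < iso:
                index |= 1 << bit
        return index

    CORNERS = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
               (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]

    volume = np.random.default_rng(3).random((16, 16, 16))   # toy scalar field
    iso, cases = 0.5, np.zeros(256, dtype=int)
    for x in range(15):
        for y in range(15):
            for z in range(15):
                corners = [volume[x + dx, y + dy, z + dz] for dx, dy, dz in CORNERS]
                cases[cube_index(corners, iso)] += 1
    print("cells intersecting the surface:", int(cases[1:255].sum()))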
Private Information Retrieval
1997
Cited by 559 (14 self)
"... Publicly accessible databases are an indispensable resource for retrieving up-to-date information. But they also pose a significant risk to the privacy of the user, since a curious database operator can follow the user's queries and infer what the user is after. Indeed, in cases where the user ..."
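
A minimal sketch of the basic two-server XOR scheme from this line of work: two non-colluding servers hold the same database, each answers a query set that on its own is uniformly random (so reveals nothing about the target index), and the XOR of the two answers yields the requested bit. This building block has linear communication; the paper's schemes achieve sublinear cost.

    import secrets

    def xor_answer(db, query):
        """Server side: XOR of the database bits selected by the query set."""
        acc = 0
        for j in query:
            acc ^= db[j]
        return acc

    def two_server_pir(db, i):
        """Retrieve db[i] from two non-colluding replicas without revealing i."""
        n = len(db)
        s1 = {j for j in range(n) if secrets.randbits(1)}    # uniform random subset
        s2 = s1 ^ {i}                                        # flip membership of i
        return xor_answer(db, s1) ^ xor_answer(db, s2)

    db = [1, 0, 1, 1, 0, 0, 1, 0]
    assert all(two_server_pir(db, i) == db[i] for i in range(len(db)))
    print("all positions recovered privately")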
The Capacity of Low-Density Parity-Check Codes Under Message-Passing Decoding
2001
Cited by 569 (9 self)
"... In this paper, we present a general method for determining the capacity of low-density parity-check (LDPC) codes under message-passing decoding when used over any binary-input memoryless channel with discrete or continuous output alphabets. Transmitting at rates below this capacity, a randomly chosen ... exponentially fast in the length of the code with arbitrarily small loss in rate.) Conversely, transmitting at rates above this capacity the probability of error is bounded away from zero by a strictly positive constant which is independent of the length of the code and of the number of iterations performed. ..."
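
A minimal sketch of the threshold computation on the binary erasure channel, the simplest instance of the capacity analysis the abstract describes: for a regular (dv, dc) ensemble, density evolution reduces to a scalar recursion in the erasure probability, and the threshold is located by bisection. The (3, 6) example is illustrative.

    def converges(eps, dv, dc, iters=5000, tol=1e-9):
        """Does the density-evolution recursion drive the erasure rate to zero?"""
        x = eps
        for _ in range(iters):
            x = eps * (1 - (1 - x) ** (dc - 1)) ** (dv - 1)
            if x < tol:
                return True
        return False

    def bec_threshold(dv, dc):
        lo, hi = 0.0, 1.0
        for _ in range(50):                  # bisection on the channel erasure rate
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if converges(mid, dv, dc) else (lo, mid)
        return lo

    # The (3, 6) ensemble has rate 1/2; its threshold (~0.429) sits below the
    # Shannon limit of 0.5 for that rate.
    print(f"BEC threshold for (3,6): {bec_threshold(3, 6):.4f}")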
Optimal Aggregation Algorithms for Middleware
In PODS, 2001
Cited by 701 (4 self)
"... Assume that each object in a database has m grades, or scores, one for each of m attributes. For example, an object can have a color grade, that tells how red it is, and a shape grade, that tells how round it is. For each attribute, there is a sorted list, which lists each object and its grade under ... must access every object in the database, to find its grade under each attribute. Fagin has given an algorithm (“Fagin’s Algorithm”, or FA) that is much more efficient. For some monotone aggregation functions, FA is optimal with high probability in the worst case. We analyze an elegant and remarkably ..."
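
A minimal sketch of a threshold-style top-k algorithm of the kind analyzed in this paper: sorted access proceeds in lockstep down the m lists, each newly seen object is fully scored by random access, and scanning stops once the current top k all meet the threshold computed from the last grades seen. The toy lists and sum aggregation are illustrative.

    import heapq

    def threshold_topk(lists, agg, k):
        """Top-k objects under a monotone aggregation of per-attribute grades.
        lists: one (object, grade) list per attribute, sorted by grade descending."""
        m = len(lists)
        random_access = [dict(lst) for lst in lists]   # object -> grade, per attribute
        seen, top = set(), []                          # min-heap of (score, object)
        for depth in range(len(lists[0])):
            last = []
            for a in range(m):
                obj, grade = lists[a][depth]           # one sorted access per list
                last.append(grade)
                if obj not in seen:
                    seen.add(obj)
                    score = agg([random_access[b][obj] for b in range(m)])
                    heapq.heappush(top, (score, obj))
                    if len(top) > k:
                        heapq.heappop(top)
            # Stop when every kept object scores at least the threshold.
            if len(top) == k and top[0][0] >= agg(last):
                break
        return sorted(top, reverse=True)

    l1 = [("a", 0.9), ("b", 0.8), ("c", 0.5), ("d", 0.3)]
    l2 = [("b", 0.95), ("c", 0.9), ("a", 0.4), ("d", 0.2)]
    print(threshold_topk([l1, l2], sum, k=2))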