Results 1-10 of 60,810
On an Implementation of the Solovay-Kitaev Algorithm
"... In quantum computation we are given a finite set of gates and we have to perform a desired operation as a product of them. The corresponding computational problem is approximating an arbitrary unitary as a product in a topological generating set of SU(d). The problem is known to be solvable in time ..."
Cited by 5 (0 self)
In this paper we present methods which make the implementation of the existing algorithms easier. We present heuristic methods which make a time-length trade-off in the preparatory step. We decrease the running time and the memory used to polynomial in d, but the length of the products approximating the desired
QUANTUM CIRCUIT SYNTHESIS USING SOLOVAY-KITAEV ALGORITHM AND OPTIMIZATION TECHNIQUES by
"... Quantum circuit synthesis is one of the major areas of current research in the field of quantum computing. Analogous to its Boolean counterpart, the task involves constructing arbitrary quantum gates using only those available within a small set of universal gates that can be realized physically. Ho ..."
on the Solovay-Kitaev algorithm. The crux of the Solovay-Kitaev algorithm is the use of a procedure to decompose a given quantum gate into a pair of group commutators, with the pair being synthesized separately. The Solovay-Kitaev algorithm involves group commutator decomposition in a recursive manner, with a
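The recursion sketched in these two entries can be stated compactly. The following is a standard textbook formulation of the Solovay-Kitaev step (not taken from the papers above): each level refines an approximation of a target unitary U by decomposing the residual error into a balanced group commutator and recursing on its two factors.

```latex
% One level of the Solovay-Kitaev recursion, for a target unitary U
% with current approximation U_{n-1} accurate to distance eps_{n-1}:
\begin{aligned}
U U_{n-1}^{\dagger} &\approx V W V^{\dagger} W^{\dagger}
  && \text{(balanced group-commutator decomposition of the residual)} \\
U_n &= V_{n-1} W_{n-1} V_{n-1}^{\dagger} W_{n-1}^{\dagger} U_{n-1}
  && \text{(recurse on } V, W \text{ to obtain } V_{n-1}, W_{n-1}\text{)} \\
\varepsilon_n &\le c\,\varepsilon_{n-1}^{3/2}
  && \text{(error contracts provided } \varepsilon_0 < 1/c^{2}\text{)}
\end{aligned}
```

Because the exponent 3/2 exceeds 1, the error falls doubly exponentially in the recursion depth n while the gate count grows only exponentially in n, which is what yields polylogarithmic circuit length in 1/ε.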
Stochastic Inversion Transduction Grammars and Bilingual Parsing of Parallel Corpora
, 1997
"... ..."
Trade Policy and Economic Growth: A Skeptic's Guide to the Cross-National Evidence
 Macroeconomics Annual 2000, Ben Bernanke and
, 2000
"... Andrew Warner for generously sharing their data with us. We are particularly grateful to Ben-David, Frankel, Romer, Sachs, Warner and Romain Wacziarg for helpful e-mail exchanges. We have benefited greatly from discussions in seminars at the University of California at Berkeley, ..."
Cited by 1013 (25 self)
Efficient similarity search in sequence databases
, 1994
"... We propose an indexing method for time sequences for processing similarity queries. We use the Discrete Fourier Transform (DFT) to map time sequences to the frequency domain, the crucial observation being that, for most sequences of practical interest, only the first few frequencies are strong. Anot ..."
Cited by 505 (21 self)
Another important observation is Parseval's theorem, which specifies that the Fourier transform preserves the Euclidean distance in the time or frequency domain. Having thus mapped sequences to a lower-dimensionality space by using only the first few Fourier coefficients, we use R-trees to index
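The lower-bounding property this entry relies on is easy to check numerically. Below is a minimal sketch (assuming NumPy; the `feature` helper and the choice k = 3 are illustrative, not from the paper) showing that distances between truncated DFT features never exceed the true Euclidean distance, so index-level filtering produces no false dismissals:

```python
import numpy as np

def feature(seq, k=3):
    """Map a time sequence to its first k DFT coefficients.

    With the orthonormal DFT, Parseval's theorem gives
    ||fft(x) - fft(y)|| == ||x - y||, so keeping only the first
    k coefficients can only shrink the distance.
    """
    return np.fft.fft(seq, norm="ortho")[:k]

rng = np.random.default_rng(0)
x = rng.standard_normal(64)
y = rng.standard_normal(64)

true_dist = np.linalg.norm(x - y)
approx_dist = np.linalg.norm(feature(x) - feature(y))

# Truncation only discards energy, so the feature-space distance
# lower-bounds the true distance (the no-false-dismissals guarantee).
assert approx_dist <= true_dist + 1e-9
```

Queries are then answered by searching the R-tree in the low-dimensional feature space and verifying the surviving candidates against the full sequences.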
On the impossibility of informationally efficient markets
 AMERICAN ECONOMIC REVIEW
, 1980
"... ..."
Expectations and the Neutrality of Money
 JOURNAL OF ECONOMIC THEORY
, 1972
"... This paper provides a simple example of an economy in which equilibrium prices and quantities exhibit what may be the central feature of the modern business cycle: a systematic relation between the rate of change in nominal prices and the level of real output. The relationship, ..."
Cited by 858 (5 self)
Compressive sampling
, 2006
"... Conventional wisdom and common practice in acquisition and reconstruction of images from frequency data follow the basic principle of the Nyquist density sampling theory. This principle states that to reconstruct an image, the number of Fourier samples we need to acquire must match the desired res ..."
Cited by 1427 (15 self)
Conventional wisdom and common practice in acquisition and reconstruction of images from frequency data follow the basic principle of the Nyquist density sampling theory. This principle states that to reconstruct an image, the number of Fourier samples we need to acquire must match the desired resolution of the image, i.e. the number of pixels in the image. This paper surveys an emerging theory which goes by the name of “compressive sampling” or “compressed sensing,” and which says that this conventional wisdom is inaccurate. Perhaps surprisingly, it is possible to reconstruct images or signals of scientific interest accurately and sometimes even exactly from a number of samples which is far smaller than the desired resolution of the image/signal, e.g. the number of pixels in the image. It is believed that compressive sampling has far-reaching implications. For example, it suggests the possibility of new data acquisition protocols that translate analog information into digital form with fewer sensors than what was considered necessary. This new sampling theory may come to underlie procedures for sampling and compressing data simultaneously. In this short survey, we provide some of the key mathematical insights underlying this new theory, and explain some of the interactions between compressive sampling and other fields such as statistics, information theory, coding theory, and theoretical computer science.
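The claim that a sparse signal can be recovered from far fewer measurements than its length can be illustrated with a small basis-pursuit experiment. This is a sketch assuming NumPy and SciPy; the problem sizes and the linear-programming formulation (splitting x into positive and negative parts) are illustrative choices, not taken from the survey:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m, k = 20, 10, 2                     # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
b = A @ x_true                                  # m << n measurements

# Basis pursuit: minimize ||x||_1 subject to Ax = b, written as an LP
# over x = xp - xm with xp, xm >= 0 and objective sum(xp) + sum(xm).
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]

# The recovered signal satisfies the measurements exactly, and its
# l1 norm cannot exceed that of x_true (which is itself feasible).
assert res.success
assert np.allclose(A @ x_hat, b, atol=1e-6)
```

With a Gaussian sensing matrix and sufficiently many measurements relative to the sparsity, the l1 minimizer typically coincides with the true sparse signal, which is the phenomenon the survey formalizes.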
Noise Trader Risk in Financial Markets
 Journal of Political Economy
, 1990
"... We present a simple overlapping generations model of an asset market in which irrational noise traders with erroneous stochastic beliefs both affect prices and earn higher expected returns. The unpredictability of noise traders' beliefs creates a risk in the price of the asset that deters rational ..."
Cited by 858 (23 self)
We present a simple overlapping generations model of an asset market in which irrational noise traders with erroneous stochastic beliefs both affect prices and earn higher expected returns. The unpredictability of noise traders' beliefs creates a risk in the price of the asset that deters rational arbitrageurs from aggressively betting against them. As a result, prices can diverge significantly from fundamental values even in the absence of fundamental risk. Moreover, bearing a disproportionate amount of risk that they themselves create enables noise traders to earn a higher expected return than do rational investors. The model sheds light on a number of financial anomalies, including the excess volatility of asset prices, the mean reversion of stock returns, the underpricing of closed-end mutual funds, and the Mehra-Prescott equity premium puzzle. “If the reader interjects that there must surely be large profits to be gained... in the long run by a skilled individual who... purchase[s] investments on the best genuine long-term expectation he can frame, he must be answered... that there are such serious-minded individuals and that it makes a vast difference to an investment market whether or not they predominate... But we must also add that there are several factors which jeopardise the predominance of such individuals in modern investment markets. Investment based on genuine long-term expectation is so difficult... as to be