Results 1 - 10 of 76,910
Decision-Theoretic Planning: Structural Assumptions and Computational Leverage
JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 1999
"... Planning under uncertainty is a central problem in the study of automated sequential decision making, and has been addressed by researchers in many different fields, including AI planning, decision analysis, operations research, control theory and economics. While the assumptions and perspectives ..."
Cited by 510 (4 self)
Timing Attacks on Implementations of Diffie-Hellman, RSA, DSS, and Other Systems
, 1996
"... By carefully measuring the amount of time required to perform private key operations, attackers may be able to find fixed DiffieHellman exponents, factor RSA keys, and break other cryptosystems. Against a vulnerable system, the attack is computationally inexpensive and often requires only known cip ..."
Cited by 644 (3 self)
ciphertext. Actual systems are potentially at risk, including cryptographic tokens, network-based cryptosystems, and other applications where attackers can make reasonably accurate timing measurements. Techniques for preventing the attack for RSA and Diffie-Hellman are presented. Some cryptosystems will need
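The leak described in this entry exists because a naive implementation's running time depends on the secret exponent's bits. A minimal illustrative sketch (not the paper's attack; the modulus and exponents are made-up demo values) showing how textbook square-and-multiply exponentiation runs measurably longer for exponents with more 1-bits:

    import time

    def modexp_branchy(base, exponent, modulus):
        # Textbook left-to-right square-and-multiply: the extra multiply runs
        # only for 1-bits of the exponent, so total time leaks key information.
        result = 1
        for bit in bin(exponent)[2:]:
            result = (result * result) % modulus
            if bit == '1':                    # data-dependent branch
                result = (result * base) % modulus
        return result

    if __name__ == "__main__":
        modulus = (1 << 1279) - 1             # arbitrary large odd modulus for the demo
        light = int('1' + '0' * 1000, 2)      # 1001-bit exponent with a single 1-bit
        heavy = (1 << 1001) - 1               # 1001-bit exponent that is all 1-bits
        for name, exp in (("low Hamming weight ", light), ("high Hamming weight", heavy)):
            t0 = time.perf_counter()
            for _ in range(50):
                modexp_branchy(12345, exp, modulus)
            print(name, round(time.perf_counter() - t0, 3), "s")

Countermeasures of the kind the abstract refers to aim to remove this data-dependent work, for example by blinding the input or by making the exponentiation constant-time.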
Algorithms for Quantum Computation: Discrete Logarithms and Factoring
, 1994
"... A computer is generally considered to be a universal computational device; i.e., it is believed able to simulate any physical computational device with a increase in computation time of at most a polynomial factor. It is not clear whether this is still true when quantum mechanics is taken into consi ..."
Cited by 1103 (7 self)
of steps which is polynomial in the input size, e.g., the number of digits of the integer to be factored. These two problems are generally considered hard on a classical computer and have been used as the basis of several proposed cryptosystems. (We thus give the first examples of quantum cryptanalysis.)
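The quantum speedup in this entry comes from an order-finding subroutine; the reduction from order-finding to factoring is purely classical. The sketch below is illustrative only: it brute-forces the order r of a random a modulo N (the step a quantum computer would perform efficiently) and then applies the classical gcd reduction, so it works only for small composites such as the demo values.

    import math
    import random

    def find_order(a, n):
        # Stand-in for the quantum order-finding subroutine: smallest r > 0
        # with a**r == 1 (mod n), found here by brute force.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_style_factor(n):
        # Classical reduction from order-finding to factoring (illustration).
        while True:
            a = random.randrange(2, n)
            g = math.gcd(a, n)
            if g > 1:
                return g                      # lucky guess already shares a factor
            r = find_order(a, n)
            if r % 2:
                continue                      # need an even order
            y = pow(a, r // 2, n)
            if y == n - 1:
                continue                      # trivial square root of 1; retry
            return math.gcd(y - 1, n)         # nontrivial factor of n

    print(shor_style_factor(15))              # 3 or 5
    print(shor_style_factor(21))              # 3 or 7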
Good Error-Correcting Codes based on Very Sparse Matrices
, 1999
"... We study two families of errorcorrecting codes defined in terms of very sparse matrices. "MN" (MacKayNeal) codes are recently invented, and "Gallager codes" were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. The ..."
Cited by 741 (23 self)
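A hedged sketch of the basic object in this entry: a code defined by a very sparse parity-check matrix H, where a word c is a codeword exactly when Hc = 0 over GF(2). The tiny matrix below is chosen only for illustration; it is not one of the paper's MN or Gallager constructions, which use much larger random sparse matrices and iterative decoding.

    import numpy as np

    # Toy sparse parity-check matrix H (3 checks x 6 bits), each row of low weight.
    H = np.array([
        [1, 1, 0, 1, 0, 0],
        [0, 1, 1, 0, 1, 0],
        [1, 0, 1, 0, 0, 1],
    ], dtype=np.uint8)

    def syndrome(word):
        # A word is a codeword of the code defined by H iff its syndrome is all zeros.
        return H.dot(word) % 2

    codeword = np.array([1, 1, 0, 0, 1, 1], dtype=np.uint8)
    noisy = codeword.copy()
    noisy[2] ^= 1                             # flip one bit

    print(syndrome(codeword))                 # [0 0 0] -> satisfies every check
    print(syndrome(noisy))                    # nonzero -> the sparse checks expose the error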
Identity-Based Encryption from the Weil Pairing
, 2001
"... We propose a fully functional identitybased encryption scheme (IBE). The scheme has chosen ciphertext security in the random oracle model assuming an elliptic curve variant of the computational DiffieHellman problem. Our system is based on bilinear maps between groups. The Weil pairing on elliptic ..."
Cited by 1699 (29 self)
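This entry rests on a bilinear map (pairing) between groups; the snippet names the Weil pairing but does not spell out the scheme. The LaTeX fragment below records only the standard bilinearity property and, as an assumption about the construction, the usual identity-based key-extraction step in which a master secret s turns an identity string into a private key.

    % Bilinearity of a pairing e : G_1 x G_1 -> G_2 over a group of prime order q:
    \[
      e(aP,\, bQ) \;=\; e(P, Q)^{ab} \qquad \text{for all } P, Q \in G_1,\ a, b \in \mathbb{Z}_q .
    \]
    % Assumed (standard) identity-based key extraction, with master secret s and a
    % public hash H_1 mapping identity strings into G_1:
    \[
      Q_{\mathrm{ID}} = H_1(\mathrm{ID}), \qquad d_{\mathrm{ID}} = s\, Q_{\mathrm{ID}},
      \qquad P_{\mathrm{pub}} = s\, P .
    \]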
A Digital Signature Scheme Secure Against Adaptive Chosen-Message Attacks
, 1995
"... We present a digital signature scheme based on the computational diculty of integer factorization. The scheme possesses the novel property of being robust against an adaptive chosenmessage attack: an adversary who receives signatures for messages of his choice (where each message may be chosen in a ..."
Cited by 985 (43 self)
were considered in the folklore to be contradictory. More generally, we show how to construct a signature scheme with such properties based on the existence of a "claw-free" pair of permutations, a potentially weaker assumption than the intractability of integer factorization. The new scheme
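The snippet mentions a "claw-free" pair of permutations without defining it. As a clarifying note (the standard definition, not quoted from the paper): a pair of permutations f_0, f_1 over a common domain is claw-free when it is computationally infeasible to find a claw, i.e. a pair (x, y) with

    \[
      f_0(x) \;=\; f_1(y).
    \]
    % One common factoring-based instantiation (an assumption here, not a quotation
    % from the paper): over the quadratic residues modulo a Blum integer N, the maps
    % x -> x^2 mod N and x -> 4x^2 mod N form such a pair when factoring N is hard,
    % which is how constructions of this kind connect to integer factorization.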
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
IN BAYESIAN STATISTICS, 1992
"... Data augmentation and Gibbs sampling are two closely related, samplingbased approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accurac ..."
Cited by 583 (14 self)
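The difficulty this entry addresses is that Gibbs-sampler draws are dependent, so a naive i.i.d. standard error for an estimated posterior moment is too optimistic. A minimal illustrative sketch follows (a toy bivariate-normal Gibbs sampler with a batch-means error estimate; the parameters are made up, and batch means is a simpler stand-in for the paper's spectral-density-based numerical standard error):

    import numpy as np

    rng = np.random.default_rng(0)

    def gibbs_bivariate_normal(n_draws, rho=0.9):
        # Toy Gibbs sampler targeting a standard bivariate normal with correlation rho:
        # each coordinate is redrawn from its conditional given the other.
        x = y = 0.0
        draws = np.empty(n_draws)
        sd = np.sqrt(1.0 - rho * rho)
        for i in range(n_draws):
            x = rng.normal(rho * y, sd)
            y = rng.normal(rho * x, sd)
            draws[i] = x
        return draws

    def batch_means_se(samples, n_batches=50):
        # Crude numerical standard error for a posterior mean computed from dependent
        # draws: split the chain into batches and use the spread of the batch means.
        usable = samples[: len(samples) // n_batches * n_batches]
        means = usable.reshape(n_batches, -1).mean(axis=1)
        return means.std(ddof=1) / np.sqrt(n_batches)

    chain = gibbs_bivariate_normal(50_000)
    print("posterior mean estimate:", chain.mean())
    print("naive i.i.d. s.e.:      ", chain.std(ddof=1) / np.sqrt(len(chain)))
    print("batch-means s.e.:       ", batch_means_se(chain))

The batch-means figure comes out several times larger than the naive one, which is the kind of gap between apparent and actual numerical accuracy that diagnostics of this sort are designed to quantify.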
Graph-based algorithms for Boolean function manipulation
IEEE TRANSACTIONS ON COMPUTERS, 1986
"... In this paper we present a new data structure for representing Boolean functions and an associated set of manipulation algorithms. Functions are represented by directed, acyclic graphs in a manner similar to the representations introduced by Lee [1] and Akers [2], but with further restrictions on th ..."
Cited by 3499 (47 self)
In this paper we present a new data structure for representing Boolean functions and an associated set of manipulation algorithms. Functions are represented by directed, acyclic graphs in a manner similar to the representations introduced by Lee [1] and Akers [2], but with further restrictions on the ordering of decision variables in the graph. Although a function requires, in the worst case, a graph of size exponential in the number of arguments, many of the functions encountered in typical applications have a more reasonable representation. Our algorithms have time complexity proportional to the sizes of the graphs being operated on, and hence are quite efficient as long as the graphs do not grow too large. We present experimental results from applying these algorithms to problems in logic design verification that demonstrate the practicality of our approach.
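A minimal sketch of the data structure this abstract describes, under stated assumptions (a fixed global variable order, hash-consing for subgraph sharing, and a memoised apply; this is a toy reduced ordered BDD, not the paper's full algorithm suite):

    # Terminals are the Python booleans False and True; internal nodes are tuples
    # (var, low, high) interned in _unique so that isomorphic subgraphs are shared.
    _unique = {}

    def mk(var, low, high):
        if low is high:                       # redundant-test reduction rule
            return low
        key = (var, low, high)
        if key not in _unique:                # sharing reduction rule (hash-consing)
            _unique[key] = key
        return _unique[key]

    def bdd_var(i):
        return mk(i, False, True)

    def apply_op(op, u, v, _cache=None):
        # Combine two BDDs under a Boolean operator, memoised per node pair so the
        # work stays bounded by the product of the two graph sizes.
        if _cache is None:
            _cache = {}
        if isinstance(u, bool) and isinstance(v, bool):
            return op(u, v)
        key = (id(u), id(v))
        if key in _cache:
            return _cache[key]
        uvar = u[0] if not isinstance(u, bool) else float("inf")
        vvar = v[0] if not isinstance(v, bool) else float("inf")
        var = min(uvar, vvar)
        u_lo, u_hi = (u[1], u[2]) if uvar == var else (u, u)
        v_lo, v_hi = (v[1], v[2]) if vvar == var else (v, v)
        res = mk(var, apply_op(op, u_lo, v_lo, _cache), apply_op(op, u_hi, v_hi, _cache))
        _cache[key] = res
        return res

    def evaluate(node, assignment):
        while not isinstance(node, bool):
            node = node[2] if assignment[node[0]] else node[1]
        return node

    # Example: f = (x0 AND x1) OR x2, built by combining shared subgraphs.
    x0, x1, x2 = bdd_var(0), bdd_var(1), bdd_var(2)
    f = apply_op(lambda a, b: a or b, apply_op(lambda a, b: a and b, x0, x1), x2)
    print(evaluate(f, {0: True, 1: True, 2: False}))    # True
    print(evaluate(f, {0: True, 1: False, 2: False}))   # False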
Text Classification from Labeled and Unlabeled Documents using EM
MACHINE LEARNING, 1999
"... This paper shows that the accuracy of learned text classifiers can be improved by augmenting a small number of labeled training documents with a large pool of unlabeled documents. This is important because in many text classification problems obtaining training labels is expensive, while large qua ..."
Cited by 1033 (19 self)
quantities of unlabeled documents are readily available. We introduce an algorithm for learning from labeled and unlabeled documents based on the combination of Expectation-Maximization (EM) and a naive Bayes classifier. The algorithm first trains a classifier using the available labeled documents
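A compact sketch of the kind of procedure this entry describes, under stated assumptions: a multinomial naive Bayes model over word-count vectors, with unlabeled documents folded in by EM (E-step: class posteriors for the unlabeled pool; M-step: refit on soft counts). The data, vocabulary size, and iteration count below are placeholders for illustration, not the paper's experimental setup.

    import numpy as np

    def fit_nb(X, resp, alpha=1.0):
        # M-step: class priors and per-class word probabilities from (soft) responsibilities.
        log_priors = np.log((resp.sum(axis=0) + alpha) / (resp.sum() + alpha * resp.shape[1]))
        word_counts = resp.T @ X                                   # (classes, vocab)
        word_probs = (word_counts + alpha) / (word_counts.sum(axis=1, keepdims=True)
                                              + alpha * X.shape[1])
        return log_priors, np.log(word_probs)

    def posteriors(X, log_priors, log_word_probs):
        # E-step: p(class | document) under the current naive Bayes parameters.
        log_joint = X @ log_word_probs.T + log_priors              # (docs, classes)
        log_joint -= log_joint.max(axis=1, keepdims=True)
        p = np.exp(log_joint)
        return p / p.sum(axis=1, keepdims=True)

    def em_text_classifier(X_lab, y_lab, X_unlab, n_classes, n_iter=10):
        resp_lab = np.eye(n_classes)[y_lab]                        # labeled docs: one-hot
        params = fit_nb(X_lab, resp_lab)                           # initial classifier, labeled only
        for _ in range(n_iter):
            resp_unlab = posteriors(X_unlab, *params)              # E-step on the unlabeled pool
            params = fit_nb(np.vstack([X_lab, X_unlab]),
                            np.vstack([resp_lab, resp_unlab]))     # M-step on everything
        return params

    # Tiny synthetic demo: two topics, six-word vocabulary, six labeled documents
    # and a much larger unlabeled pool drawn from the same topics.
    rng = np.random.default_rng(0)
    topics = np.array([[.40, .30, .20, .05, .03, .02],
                       [.02, .03, .05, .20, .30, .40]])
    def draw(k, n): return rng.multinomial(30, topics[k], size=n).astype(float)
    X_lab = np.vstack([draw(0, 3), draw(1, 3)])
    y_lab = np.array([0] * 3 + [1] * 3)
    X_unlab = np.vstack([draw(0, 100), draw(1, 100)])
    params = em_text_classifier(X_lab, y_lab, X_unlab, n_classes=2)
    print(posteriors(draw(0, 1), *params).round(2))                # heavily favours class 0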
Short signatures from the Weil pairing
, 2001
"... Abstract. We introduce a short signature scheme based on the Computational DiffieHellman assumption on certain elliptic and hyperelliptic curves. The signature length is half the size of a DSA signature for a similar level of security. Our short signature scheme is designed for systems where signa ..."
Cited by 743 (28 self)
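The snippet does not spell out the construction, so the LaTeX fragment below is an assumption: the standard pairing-based form of such a short signature, written additively with generator P, a hash H into the curve group, and a bilinear map e.

    % Keys: secret x in Z_q, public key v = x P.
    % Sign: \sigma = x\, H(m), a single group element, hence the short signature.
    % Verify: accept iff
    \[
      e\bigl(\sigma,\, P\bigr) \;=\; e\bigl(H(m),\, v\bigr),
    \]
    % which holds because e(x H(m), P) = e(H(m), P)^x = e(H(m), x P); forging it is
    % tied to the Computational Diffie-Hellman assumption named in the abstract.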