Results 1–10 of 75
Quantum Circuit Complexity, 1993
Cited by 278 (1 self)
We study a complexity model of quantum circuits analogous to the standard (acyclic) Boolean circuit model. It is shown that any function computable in polynomial time by a quantum Turing machine has a polynomial-size quantum circuit. This result also enables us to construct a universal quantum computer which can simulate, with a polynomial factor slowdown, a broader class of quantum machines than that considered by Bernstein and Vazirani [BV93], thus answering an open question raised in [BV93]. We also develop a theory of quantum communication complexity, and use it as a tool to prove that the majority function does not have a linear-size quantum formula. Keywords: Boolean circuit complexity, communication complexity, quantum communication complexity, quantum computation. AMS subject classifications: 68Q05, 68Q15. This research was supported in part by the National Science Foundation under grant CCR-9301430.
1 Introduction
One of the most intriguing questions in computation theory ...
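The circuit model the abstract describes can be pictured with a toy simulator: a circuit is a list of unitary gates applied in sequence to a state vector. The gate set, names, and one-qubit restriction below are illustrative choices, not taken from the paper:

```python
import math

# One-qubit state as a pair of complex amplitudes (alpha, beta);
# a gate is a 2x2 unitary applied by matrix-vector multiplication.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard gate
X = [[0.0, 1.0], [1.0, 0.0]]                  # NOT gate

def apply_gate(gate, state):
    """Multiply the 2x2 gate into the state vector."""
    a, b = state
    return (gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b)

def run(circuit, state=(1.0, 0.0)):
    """Run a circuit (a list of gates) on an initial state, |0> by default."""
    for gate in circuit:
        state = apply_gate(gate, state)
    return state

# Two Hadamards cancel: the state returns to |0>.
final = run([H, H])
```

A polynomial-size circuit in this sense is just such a gate list whose length is polynomial in the input size; the paper's result is that quantum Turing machine computations compile into circuits of this kind.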
Software Protection and Simulation on Oblivious RAMs, 1993
Cited by 163 (13 self)
Software protection is one of the most important issues concerning computer practice. There exist many heuristics and ad-hoc methods for protection, but the problem as a whole has not received the theoretical treatment it deserves. In this paper we provide a theoretical treatment of software protection. We reduce the problem of software protection to the problem of efficient simulation on oblivious RAM. A machine is oblivious if the sequence in which it accesses memory locations is equivalent for any two inputs with the same running time. For example, an oblivious Turing Machine is one for which the movement of the heads on the tapes is identical for each computation. (Thus, it is independent of the actual input.) What is the slowdown in the running time of any machine, if it is required to be oblivious? In 1979 Pippenger and Fischer showed how a two-tape oblivious Turing Machine can simulate, online, a one-tape Turing Machine, with a logarithmic slowdown in the running time. We s...
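The obliviousness property defined above — the same access sequence for any two inputs of the same length — can be seen in a toy example. The linear-scan lookup below is a hypothetical illustration, not the paper's construction (it pays a linear slowdown, whereas the paper aims for polylogarithmic):

```python
def lookup_oblivious(memory, index, trace):
    """Read memory[index] by scanning every cell, so the sequence of
    accessed addresses (recorded in trace) is identical for every index."""
    result = None
    for addr in range(len(memory)):
        trace.append(addr)          # every address is touched, in order
        value = memory[addr]
        if addr == index:
            result = value
    return result

mem = [10, 20, 30, 40]
t1, t2 = [], []
v1 = lookup_oblivious(mem, 1, t1)
v2 = lookup_oblivious(mem, 3, t2)
# t1 == t2: the access pattern reveals nothing about which index was read.
```

An observer of the memory bus sees the same trace regardless of the input, which is exactly what makes the access pattern useless to someone trying to reverse-engineer the software.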
Checking the Correctness of Memories, Algorithmica, 1995
Cited by 97 (11 self)
We extend the notion of program checking to include programs which alter their environment. In particular, we consider programs which store and retrieve data from memory. The model we consider allows the checker a small amount of reliable memory. The checker is presented with a sequence of requests (online) to a data structure which must reside in a large but unreliable memory. We view the data structure as being controlled by an adversary. We want the checker to perform each operation in the input sequence using its reliable memory and the unreliable data structure so that any error in the operation of the structure will be detected by the checker with high probability. We present checkers for various data structures. We prove lower bounds of log n on the amount of reliable memory needed by these checkers, where n is the size of the structure. The lower bounds are information-theoretic and apply under various assumptions. We also show time-space tradeoffs for checking random access memories as a generalization of those for coherent functions.
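A minimal sketch of such a checker, in an offline style: tag every cell with a timestamp, log every read and write as an (addr, value, time) triple, and at the end verify that the multiset of triples read equals the multiset written. The class below stores the multisets explicitly for clarity; an actual checker would compress each into O(log n) reliable bits with an incremental hash, detecting errors only with high probability:

```python
from collections import Counter

class OfflineMemoryChecker:
    """Offline checker sketch: every cell holds (value, timestamp); the
    check passes iff the multiset of triples read equals that written."""

    def __init__(self, n):
        self.memory = [None] * n      # the large, *unreliable* storage
        self.reads = Counter()        # real checker: an O(log n)-bit hash
        self.writes = Counter()       # real checker: an O(log n)-bit hash
        self.clock = 0
        for addr in range(n):
            self._raw_write(addr, 0)  # initialize every cell

    def _raw_write(self, addr, value):
        self.clock += 1
        self.memory[addr] = (value, self.clock)
        self.writes[(addr, value, self.clock)] += 1

    def write(self, addr, value):
        old_value, old_time = self.memory[addr]
        self.reads[(addr, old_value, old_time)] += 1   # consume old triple
        self._raw_write(addr, value)

    def read(self, addr):
        value, time = self.memory[addr]
        self.reads[(addr, value, time)] += 1           # consume the triple
        self._raw_write(addr, value)                   # re-stamp the cell
        return value

    def check(self):
        """Final pass: read every cell, then compare the multisets."""
        for addr in range(len(self.memory)):
            value, time = self.memory[addr]
            self.reads[(addr, value, time)] += 1
        return self.reads == self.writes

checker = OfflineMemoryChecker(4)
checker.write(2, 7)
value = checker.read(2)
ok = checker.check()
```

If the adversary alters any cell, the checker eventually reads a triple that was never written, and the two multisets disagree.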
Robust PCPs of Proximity, Shorter PCPs and Applications to Coding, in Proc. 36th ACM Symp. on Theory of Computing, 2004
Cited by 80 (25 self)
We continue the study of the tradeoff between the length of PCPs and their query complexity, establishing the following main results (which refer to proofs of satisfiability of circuits of size n): 1. We present PCPs of length exp(Õ(log log n)) · n that can be verified by making o(log log n) Boolean queries.
Issues in multiagent resource allocation, INFORMATICA, 2006
Cited by 69 (17 self)
The allocation of resources within a system of autonomous agents that not only have preferences over alternative allocations of resources but also actively participate in computing an allocation is an exciting area of research at the interface of Computer Science and Economics. This paper is a survey of some of the most salient issues in Multiagent Resource Allocation. In particular, we review various languages to represent the preferences of agents over alternative allocations of resources as well as different measures of social welfare to assess the overall quality of an allocation. We also discuss pertinent issues regarding allocation procedures and present important complexity results. Our presentation of theoretical issues is complemented by a discussion of software packages for the simulation of agent-based marketplaces. We also introduce four major application areas for Multiagent Resource Allocation, namely industrial procurement, sharing of satellite resources, manufacturing control, and grid computing.
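Three standard social-welfare measures of the kind the survey reviews can be stated in a few lines; the agent names and utility values below are made up for illustration:

```python
def utilitarian_welfare(utilities):
    """Sum of individual utilities: total value created."""
    return sum(utilities.values())

def egalitarian_welfare(utilities):
    """Utility of the worst-off agent: a fairness-oriented measure."""
    return min(utilities.values())

def nash_product(utilities):
    """Product of utilities: a compromise between efficiency and fairness."""
    product = 1
    for u in utilities.values():
        product *= u
    return product

# Hypothetical utilities of three agents under one candidate allocation.
utilities = {"a1": 4, "a2": 1, "a3": 3}
```

An allocation maximizing the utilitarian sum may leave one agent with almost nothing, while the egalitarian measure sacrifices total value for the worst-off agent; which measure is appropriate depends on the application.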
On the Computational Complexity of Qualitative Coalitional Games, Artificial Intelligence, 2004
Cited by 46 (15 self)
We study coalitional games in which agents are each assumed to have a goal to be achieved, and where the characteristic property of a coalition is a set of choices, with each choice denoting a set of goals that would be achieved if the choice were made. Such qualitative coalitional games (QCGs) are a natural tool for modelling goal-oriented multiagent systems. After introducing and formally defining QCGs, we systematically formulate fourteen natural decision problems associated with them, and determine the computational complexity of these problems. For example, we formulate a notion of coalitional stability inspired by that of the core from conventional coalitional games, and prove that the problem of showing that the core of a QCG is non-empty is D^p_1-complete. (As an aside, we present what we believe is the first "natural" problem that is proven to be complete for D^p_2.) We conclude by discussing the relationship of our work to other research on coalitional reasoning in multiagent systems, and present some avenues for future research.
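A minimal sketch of the QCG setup described above, under simplifying assumptions (goals and choices as plain sets; a coalition counts as successful when some single available choice satisfies every member). The names are illustrative, and the paper's formal model has more structure:

```python
def successful(coalition, goals, choices):
    """A coalition is successful if some available choice -- a set of
    goals that would be achieved by making it -- satisfies every member,
    i.e. contains at least one goal of each member agent."""
    return any(
        all(goals[agent] & choice for agent in coalition)
        for choice in choices
    )

# Hypothetical agents, their goal sets, and the coalition's choices.
goals = {"a1": {"g1", "g2"}, "a2": {"g3"}}
choices = [{"g1"}, {"g2", "g3"}]
```

Deciding such properties naively means searching over choices and members, which is why the decision problems the paper formulates land in classes like D^p_1 rather than admitting obvious polynomial algorithms.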
Near-Optimal Plans, Tractability, and Reactivity, in Proc. of KR-94, 1994
Cited by 35 (2 self)
Many planning problems have recently been shown to be inherently intractable. For example, finding the shortest plan in the blocks-world domain is NP-hard, and so is planning in even some of the most limited STRIPS-style planning formalisms. We explore the question as to what extent these negative results can be attributed to the insistence on finding plans of minimal length. Using recent results from the theory of combinatorial optimization, we show that for domain-independent planning, one cannot efficiently generate any reasonable approximation of the optimal plan. Our result holds for a very restricted form of STRIPS. So, the negative complexity results for domain-independent planning are not just a consequence of searching for the optimal plans, because even finding reasonable approximations is hard. Next we consider domain-dependent planning. For blocks-world planning one can generate in polynomial time good approximations of the minimal plan, within a factor of two of optimal. W...
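The factor-two blocks-world approximation mentioned above is classically obtained by tearing down and rebuilding: every misplaced block costs at least one move in any plan, and this strategy spends at most two moves on each. The sketch below follows that idea under simplifying assumptions (it counts moves rather than fully ordering the teardown, and is not the paper's exact construction):

```python
def goal_height(block, goal_on):
    """Height of block's goal position above the table."""
    height = 1
    while goal_on[block] != "table":
        block = goal_on[block]
        height += 1
    return height

def naive_plan(on, goal_on):
    """Tear-down-and-rebuild: move every misplaced block to the table,
    then rebuild the goal towers bottom-up.  At most two moves per
    misplaced block, hence within a factor of two of optimal."""
    def well_placed(b):
        # b is well placed if it and everything beneath it match the goal
        while True:
            if on[b] != goal_on[b]:
                return False
            if on[b] == "table":
                return True
            b = on[b]

    misplaced = [b for b in on if not well_placed(b)]
    plan = [(b, "table") for b in misplaced if on[b] != "table"]
    rebuild = sorted((b for b in misplaced if goal_on[b] != "table"),
                     key=lambda b: goal_height(b, goal_on))
    plan += [(b, goal_on[b]) for b in rebuild]
    return plan

# B should sit on A; currently A sits on B and must be unstacked first.
on = {"A": "B", "B": "table", "C": "table"}
goal_on = {"A": "table", "B": "A", "C": "table"}
plan = naive_plan(on, goal_on)
```

Here the naive plan happens to be optimal (two moves); in general it only guarantees the factor-two bound.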
Hierarchical Knowledge Bases and Efficient Disjunctive Reasoning, 1989
Cited by 29 (3 self)
We combine ideas from relation-based data management with class hierarchies to obtain Hierarchical Knowledge Bases, which have greater expressive power while maintaining the benefits of predictable and efficient information processing. We then consider the problem of reasoning with certain limited forms of disjunctive information. We show that hierarchical knowledge bases can be used for efficient approximate reasoning with such information. The significant features of our approach include a well-conditioned trade-off between efficiency and accuracy, with a sound and complete limit case, and approximations guided by the structure of the domain theory. Because of the structure imposed on the knowledge base, it is possible to characterize the potential error in any approximation.
1 Introduction
It is fashionable to view a knowledge base (KB) as an integral utility invoked by a problem-solving program. To be useful in tasks such as robot guidance, such a KB subsystem must perform efficiently...
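One way to picture structure-guided approximation of disjunctive information: weaken a disjunction of classes to their least common superclass in the hierarchy, a sound but less precise conclusion that needs no case analysis. The hierarchy and the scheme below are illustrative assumptions, not the paper's formalism:

```python
# Hypothetical class hierarchy, child -> parent ("thing" is the root).
PARENT = {
    "penguin": "bird", "sparrow": "bird",
    "bird": "animal", "dog": "animal", "animal": "thing",
}

def ancestors(cls):
    """Chain of classes from cls up to the root of the hierarchy."""
    chain = [cls]
    while chain[-1] in PARENT:
        chain.append(PARENT[chain[-1]])
    return chain

def approximate(disjuncts):
    """Replace a disjunction of classes by their least common superclass:
    whatever holds of the superclass holds of every disjunct."""
    common = set(ancestors(disjuncts[0]))
    for d in disjuncts[1:]:
        common &= set(ancestors(d))
    for a in ancestors(disjuncts[0]):   # first shared ancestor is least
        if a in common:
            return a
```

The error of the approximation is bounded by how far the common superclass sits above the disjuncts, which is one way the hierarchy's structure lets the potential error be characterized.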