Results 1 – 9 of 9
Authenticated Data Structures for Graph and Geometric Searching
 IN CT-RSA
, 2001
Abstract

Cited by 49 (20 self)
Following in the spirit of data structure and algorithm correctness checking, authenticated data structures provide cryptographic proofs that their answers are as accurate as the author intended, even if the data structure is being maintained by a remote host. We present techniques for authenticating data structures that represent graphs and collections of geometric objects. We use a model where a data structure maintained by a trusted source is mirrored at distributed directories, with the directories answering queries made by users. When a user queries a directory, the user receives a cryptographic proof in addition to the answer, where the proof contains statements signed by the source. The user verifies the proof trusting only the statements signed by the source. We show how to efficiently authenticate data structures for fundamental problems on networks, such as path and connectivity queries, and on geometric objects, such as intersection and containment queries.
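The three-party model described above (trusted source, mirroring directory, querying user) can be illustrated with the simplest authenticated data structure, a Merkle hash tree. This is a generic sketch, not the paper's graph or geometric constructions; all function names are illustrative:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(leaves):
    """Source's side: build a hash tree; returns levels, leaves first."""
    level = [h(b"leaf:" + x) for x in leaves]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]  # pad odd levels with last node
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels  # levels[-1][0] is the root digest the source signs

def prove(levels, index):
    """Directory's side: collect sibling hashes from leaf to root."""
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        proof.append((index % 2, level[index ^ 1]))
        index //= 2
    return proof

def verify(root, leaf, proof):
    """User's side: recompute the root from answer plus proof, trusting
    only the source-signed root digest."""
    node = h(b"leaf:" + leaf)
    for is_right, sibling in proof:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root
```

For example, with `leaves = [b"a", b"b", b"c", b"d"]` the directory answers a query for element 2 with `(b"c", prove(levels, 2))`, and the user accepts only if `verify` succeeds against the signed root.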
Certifying Algorithms
, 2010
Abstract

Cited by 23 (6 self)
A certifying algorithm is an algorithm that produces, with each output, a certificate or witness (an easy-to-verify proof) that the particular output has not been compromised by a bug. A user of a certifying algorithm inputs x, receives the output y and the certificate w, and then checks, either manually or by use of a program, that w proves that y is a correct output for input x. In this way, he/she can be sure of the correctness of the output without having to trust the algorithm. We put forward the thesis that certifying algorithms are much superior to non-certifying algorithms, and that for complex algorithmic tasks, only certifying algorithms are satisfactory. Acceptance of this thesis would lead to a change of how algorithms are taught and how algorithms are researched. The widespread use of certifying algorithms would greatly enhance the reliability of algorithmic software. We survey the state of the art in certifying algorithms and add to it. In particular, we start a ...
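A minimal example of the x / y / w pattern described above (an illustration, not taken from the survey itself): an extended-Euclid gcd can emit its Bézout coefficients as the witness, and the checker then needs no number theory beyond divisibility.

```python
def certifying_gcd(a: int, b: int):
    """Return (g, (s, t)) for positive a, b, where g = gcd(a, b) and
    s*a + t*b == g. The pair (s, t) is the witness w."""
    old_r, r = a, b
    old_s, s = 1, 0
    old_t, t = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
        old_t, t = t, old_t - q * t
    return old_r, (old_s, old_t)

def check_gcd(a: int, b: int, g: int, witness) -> bool:
    """Checker: g divides both inputs, and g = s*a + t*b. Since any
    common divisor of a and b also divides s*a + t*b = g, g must be
    the greatest common divisor."""
    s, t = witness
    return g > 0 and a % g == 0 and b % g == 0 and s * a + t * b == g
```

The user runs `g, w = certifying_gcd(252, 105)` and accepts `g` only after `check_gcd(252, 105, g, w)` succeeds, without having to trust the gcd routine itself.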
Indexing Information for Data Forensics
, 2005
Abstract

Cited by 22 (5 self)
We introduce novel techniques for organizing the indexing structures of stored data so that alterations from an original version can be detected and the changed values specifically identified. We give forensic constructions for several fundamental data structures, including arrays, linked lists, binary search trees, skip lists, and hash tables. Some of our constructions are based on a new reduced-randomness construction for non-adaptive combinatorial group testing.
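A toy version of the locating idea (hypothetical, and much simpler than the paper's group-testing constructions): keep one checksum per bit position of the index space. If a single entry is later altered, the set of mismatching group checksums spells out the changed index in binary.

```python
import hashlib

def h(x) -> int:
    return int.from_bytes(hashlib.sha256(repr(x).encode()).digest()[:8], "big")

def fingerprint(data):
    """Forensic summary: a global XOR-checksum plus one checksum per
    index bit; group i covers entries whose index has bit i set."""
    nbits = max(1, (len(data) - 1).bit_length())
    groups = [0] * nbits
    total = 0
    for j, v in enumerate(data):
        cell = h((j, v))
        total ^= cell
        for i in range(nbits):
            if j >> i & 1:
                groups[i] ^= cell
    return total, groups

def locate_single_change(data, fp):
    """If exactly one entry was altered, return its index; None if the
    global checksum still matches (no change detected)."""
    total, groups = fp
    new_total, new_groups = fingerprint(data)
    if new_total == total:
        return None
    j = 0
    for i, (g_old, g_new) in enumerate(zip(groups, new_groups)):
        if g_old != g_new:   # bit i of the changed index is set
            j |= 1 << i
    return j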
Samurai: Protecting Critical Data in Unsafe Languages
Abstract

Cited by 20 (5 self)
Programs written in type-unsafe languages such as C and C++ incur costly memory errors that result in corrupted data structures, program crashes, and incorrect results. We present a data-centric solution to memory corruption called critical memory, a memory model that allows programmers to identify and protect data that is critical for correct program execution. Critical memory defines operations to consistently read and update critical data, and ensures that other non-critical updates in the program will not corrupt it. We also present Samurai, a runtime system that implements critical memory in software. Samurai uses replication and forward error correction to provide probabilistic guarantees of critical memory semantics. Because Samurai does not modify memory operations on non-critical data, the majority of memory operations in programs run at full speed, and Samurai is compatible with third-party libraries. Using both applications, including a Web server, and libraries (an STL list class and a memory allocator), we evaluate the performance overhead and fault tolerance that Samurai provides. We find that Samurai is a useful and practical approach for the majority of the applications and libraries considered.
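A hypothetical high-level analogue of the critical-memory operations (the real Samurai works on raw C/C++ heap memory and adds forward error correction): critical stores write every value to several replicas, and critical loads take a majority vote, so a stray corruption of one copy is outvoted and repaired.

```python
from collections import Counter

class CriticalStore:
    """Toy critical memory: 3 replicas per address, majority-vote reads.
    Non-critical data would live outside this store at full speed."""
    REPLICAS = 3

    def __init__(self):
        self.copies = [dict() for _ in range(self.REPLICAS)]

    def critical_store(self, addr, value):
        for copy in self.copies:
            copy[addr] = value

    def critical_load(self, addr):
        votes = Counter(copy[addr] for copy in self.copies)
        value, count = votes.most_common(1)[0]
        if count < self.REPLICAS:             # one replica was corrupted:
            self.critical_store(addr, value)  # repair it from the majority
        return value
```

A wild write through a dangling pointer would hit at most one replica, so `critical_load` still returns the last consistently stored value.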
Efficient authenticated data structures for graph connectivity and geometric search problems
 ALGORITHMICA
, 2010
Abstract

Cited by 14 (6 self)
Authenticated data structures provide cryptographic proofs that their answers are as accurate as the author intended, even if the data structure is being controlled by a remote untrusted host. In this paper we present efficient techniques for authenticating data structures that represent graphs and collections of geometric objects. We use a data-querying model where a data structure maintained by a trusted source is mirrored at distributed untrusted servers, called responders, with the responders answering queries made by users: when a user queries a responder, along with the answer to the issued query, he receives a cryptographic proof that allows the verification of the answer trusting only a short statement (digest) signed by the source. We introduce the path hash accumulator, a new primitive based on cryptographic hashing for efficiently authenticating various properties of structured data represented as paths, including any decomposable query over sequences of elements. We show how to employ our primitive to authenticate queries about properties of paths in graphs and search queries on multi-catalogs. This allows the design of new, efficient authenticated data structures for fundamental problems on networks, such as path and connectivity queries over graphs, and complex queries on two-dimensional geometric objects, such as intersection and containment queries.
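The key property exploited by the path hash accumulator is decomposability: a certificate for a path combines from certificates of its sub-paths, just as a query like min does. The following sketch is only an illustration of that idea under assumed names, not the paper's actual primitive:

```python
import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

def leaf_cert(x: int):
    """Certificate for a one-element path: (digest, min-aggregate)."""
    return h(b"leaf", str(x).encode()), x

def concat_certs(c1, c2):
    """Combine certificates of adjacent sub-paths P1 and P2 into one
    for P1.P2: the digests hash together, and min is decomposable
    (min over the concatenation = min of the two sub-path minima).
    Binding the aggregate into the hash ties the answer to the digest."""
    (d1, m1), (d2, m2) = c1, c2
    m = min(m1, m2)
    return h(b"node", d1, d2, str(m).encode()), m
```

The source certifies the whole path bottom-up and signs only the final digest; a responder answering a min query supplies sub-path certificates, and the user recombines them and checks the result against the signed digest. Any tampering with an element or an aggregate changes the recomputed digest.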
From Algorithms to Working Programs: On the Use of Program Checking in LEDA
 IN PROC. INT. CONF. ON MATHEMATICAL FOUNDATIONS OF COMPUTER SCIENCE (MFCS 98)
, 1998
"... We report on the use of program checking in the LEDA library of efficient data types and algorithms. ..."
Abstract

Cited by 9 (3 self)
We report on the use of program checking in the LEDA library of efficient data types and algorithms.
Efficient data authentication
, 2007
Abstract

Cited by 1 (0 self)
We address the problem of authenticating data in untrusted, or adversarial, computing environments: when the distributor of the data is not the source of the data, and thus is not trusted by the end-user, how can data received be proven authentic? Data authentication constitutes a fundamental problem in information security and an interesting new dimension in data management and data structure design. At the same time, the problem captures the security needs of many computing applications that exchange and use sensitive information in hostile distributed environments, and its importance increases given the current trend in modern system design towards decentralized architectures with minimal trust assumptions. Solutions should not only be provably secure, but efficient and easily implementable. This dissertation presents an extensive study of data authentication. We examine the problem for both structured and unstructured data, provide formal definitions, and design new efficient techniques for authenticating general classes of query problems, such as graph and geometric search problems, and data streams. We also study the complexity of data authentication, deriving lower bounds for the important special case of authenticating set membership queries, and design new optimal constructions. Moreover, we provide a new general framework for authenticating any query over structured data, which decouples the answer-validation and answer-generation procedures. Finally, we design totally decentralized authentication structures that provide authentication for data distributed over any peer-to-peer overlay network.
Checking Value-Sensitive Data Structures in Sublinear Space
, 2007
Abstract
Checking value-sensitive data structures in sublinear space has been an open problem for over a decade. In this paper, we suggest a novel approach to solving it. We show that, in combination with other techniques, a previous method for checking value-insensitive data structures in log space can be extended for checking the more complicated value-sensitive data structures, using log space as well. We present the theoretical model of checking data structures and discuss the types of invasions a checker might bring to the data structure server. We also provide our idea of designing sublinear space checkers for value-sensitive data structures and give a concrete example – a log-space checker for the search data structures (SDS).
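The log-space method for value-insensitive structures that such work builds on is, in essence, the classic offline memory checker of Blum et al.: the checker stores only two incremental multiset hashes (triples written vs. triples read) plus a timestamp, reads back every location at the end, and compares the hashes. A hypothetical Python sketch (a real checker would use an ε-biased hash family rather than a plain XOR of hashes):

```python
import hashlib

def h(addr, value, t) -> int:
    s = f"{addr}|{value}|{t}".encode()
    return int.from_bytes(hashlib.sha256(s).digest()[:8], "big")

class OfflineChecker:
    """Log-space checker for an untrusted store: every read is followed
    by a timestamped write-back, so the multiset of triples written must
    equal the multiset of triples read once a final scan is done."""

    def __init__(self, store, addrs):
        self.store, self.t = store, 0
        self.W = self.R = 0            # incremental multiset hashes
        self.addrs = set(addrs)
        for a in self.addrs:           # initialize every address to 0
            self._write(a, 0)

    def _write(self, addr, value):
        self.t += 1
        self.store[addr] = (value, self.t)
        self.W ^= h(addr, value, self.t)

    def read(self, addr):
        value, t_w = self.store[addr]      # untrusted response
        self.R ^= h(addr, value, t_w)
        self._write(addr, value)           # write back, fresh timestamp
        return value

    def write(self, addr, value):
        old, t_w = self.store[addr]        # read-before-write
        self.R ^= h(addr, old, t_w)
        self._write(addr, value)

    def audit(self) -> bool:
        for a in sorted(self.addrs):       # final scan, no write-back
            value, t_w = self.store[a]
            if t_w > self.t:               # timestamps may never run ahead
                return False
            self.R ^= h(a, value, t_w)
        return self.W == self.R
```

If the store ever returns a value or timestamp other than what was last written, the two hashes disagree at audit time (except with negligible probability), while the checker itself keeps only O(log n) bits of state beyond the untrusted store.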