Results 1–10 of 11
Data Security
, 1979
"... The rising abuse of computers and increasing threat to personal privacy through data banks have stimulated much interest m the techmcal safeguards for data. There are four kinds of safeguards, each related to but distract from the others. Access controls regulate which users may enter the system and ..."
Abstract

Cited by 611 (3 self)
The rising abuse of computers and increasing threat to personal privacy through data banks have stimulated much interest in the technical safeguards for data. There are four kinds of safeguards, each related to but distinct from the others. Access controls regulate which users may enter the system and subsequently which data sets an active user may read or write. Flow controls regulate the dissemination of values among the data sets accessible to a user. Inference controls protect statistical databases by preventing questioners from deducing confidential information by posing carefully designed sequences of statistical queries and correlating the responses. Statistical data banks are much less secure than most people believe. Data encryption attempts to prevent unauthorized disclosure of confidential information in transit or in storage. This paper describes the general nature of controls of each type, the kinds of problems they can and cannot solve, and their inherent limitations and weaknesses. The paper is intended for a general audience with little background in the area.
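The inference-control threat the abstract describes can be made concrete with a toy example. The dataset and query function below are hypothetical illustrations, not from the paper: two individually harmless aggregate queries combine to reveal one person's confidential value.

```python
# Toy illustration of the inference threat: a questioner combines two
# "safe" statistical queries to deduce one individual's confidential
# salary. Dataset and query interface are hypothetical.

salaries = {"alice": 50_000, "bob": 62_000, "carol": 71_000, "dave": 58_000}

def query_sum(names):
    """Statistical query: sum of salaries over a named subset."""
    return sum(salaries[n] for n in names)

# Two aggregate queries that each look harmless...
everyone = query_sum(["alice", "bob", "carol", "dave"])
all_but_bob = query_sum(["alice", "carol", "dave"])

# ...but their difference reveals Bob's confidential salary exactly.
bobs_salary = everyone - all_but_bob
print(bobs_salary)  # 62000
```

Real inference controls counter exactly this kind of query correlation, e.g., by restricting overlapping query sets or perturbing responses.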
The NP-completeness column: an ongoing guide
 Journal of Algorithms
, 1987
"... This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NPcompleteness. The presentation is modeled on that used by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NPCompleteness," W. H. Freem ..."
Abstract

Cited by 242 (0 self)
This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NP-completeness. The presentation is modeled on that used by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., New York, 1979 (hereinafter referred to as "[G&J]"; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed, and, when appropriate, cross-references will be given to that book and the list of problems (NP-complete and harder) presented there. Readers who have results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time solvability, etc.) or open problems they would like publicized, should ...
The Shannon Cipher System with a Guessing Wiretapper
 IEEE Trans. Inform. Theory
, 1998
"... The Shannon theory of cipher systems is combined with recent work on guessing values of random variables. The security of encryption systems is measured in terms of moments of the number of guesses needed for the wiretapper to uncover the plaintext given the cryptogram. While the encrypter aims at m ..."
Abstract

Cited by 25 (1 self)
The Shannon theory of cipher systems is combined with recent work on guessing values of random variables. The security of encryption systems is measured in terms of moments of the number of guesses needed for the wiretapper to uncover the plaintext given the cryptogram. While the encrypter aims at maximizing the guessing effort, the wiretapper strives to minimize it, e.g., by ordering guesses in descending order of posterior probabilities of plaintexts given the cryptogram. For a memoryless plaintext source and a given key rate, a single-letter characterization is given for the highest achievable guessing exponent function, that is, the exponential rate of the ρth moment of the number of guesses as a function of the plaintext message length. Moreover, we demonstrate asymptotically optimal strategies for both encryption and guessing, which are universal in the sense of being independent of the statistics of the source. The guessing exponent is then investigated as a functi...
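The guessing measure used here can be sketched for a toy source. The distributions below are hypothetical stand-ins, and `guessing_moment` is an illustrative helper, not the paper's notation: it implements the optimal strategy of guessing candidates in descending order of probability and computes the resulting moment of the number of guesses.

```python
# Sketch of the guessing-moment security measure: order plaintext
# candidates by descending probability and compute E[G^rho], where G
# is the number of guesses until the true plaintext is found.

def guessing_moment(probs, rho=1.0):
    """E[G^rho] under the optimal strategy: guess in descending order."""
    ordered = sorted(probs, reverse=True)            # best guessing order
    return sum(p * (k + 1) ** rho for k, p in enumerate(ordered))

uniform = [0.25] * 4              # hardest to guess for a fixed support size
skewed = [0.7, 0.2, 0.06, 0.04]   # concentrated mass is guessed quickly

print(guessing_moment(uniform))   # 2.5
print(guessing_moment(skewed))    # 1.44
```

A good cipher drives the wiretapper's posterior toward uniform, maximizing this moment for the given key rate.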
Average-Case Complexity
 in Foundations and Trends in Theoretical Computer Science Volume 2, Issue 1
, 2006
"... We survey the averagecase complexity of problems in NP. We discuss various notions of goodonaverage algorithms, and present completeness results due to Impagliazzo and Levin. Such completeness results establish the fact that if a certain specific (but somewhat artificial) NP problem is easyonav ..."
Abstract

Cited by 25 (0 self)
We survey the average-case complexity of problems in NP. We discuss various notions of good-on-average algorithms, and present completeness results due to Impagliazzo and Levin. Such completeness results establish the fact that if a certain specific (but somewhat artificial) NP problem is easy-on-average with respect to the uniform distribution, then all problems in NP are easy-on-average with respect to all samplable distributions. Applying the theory to natural distributional problems remains an outstanding open question. We review some natural distributional problems whose average-case complexity is of particular interest and that do not yet fit into this theory. A major open question is whether the existence of hard-on-average problems in NP can be based on the P ≠ NP assumption or on related worst-case assumptions. We review negative results showing that certain proof techniques cannot prove such a result. While the relation between worst-case and average-case complexity for general NP problems remains open, there has been progress in understanding the relation between different “degrees” of average-case complexity. We discuss some of these “hardness amplification” results.
Modeling the Storage Architectures of Commercial Database Systems
 ACM Transactions on Database Systems
, 1985
"... Modeling the storage structures of a DBMS is a prerequisite to understanding and optimizing database performance. Previously, such modeling was very difficult because the fundamental role of conceptualtointernal mappings in DBMS implementations went unrecognized. In this paper we present a model o ..."
Abstract

Cited by 19 (5 self)
Modeling the storage structures of a DBMS is a prerequisite to understanding and optimizing database performance. Previously, such modeling was very difficult because the fundamental role of conceptual-to-internal mappings in DBMS implementations went unrecognized. In this paper we present a model of physical databases, called the transformation model, that makes conceptual-to-internal mappings explicit. By exposing such mappings, we show that it is possible to model the storage architectures (i.e., the storage structures and mappings) of many commercial DBMSs in a precise, systematic, and comprehensible way. Models of the INQUIRE, ADABAS, and SYSTEM 2000 storage architectures are presented as examples of the model’s utility. We believe the transformation model helps bridge the gap between physical database theory and practice. It also reveals the possibility of a technology to automate the development of physical database software.
Invariant-based Cryptosystems and Their Security Against Provable Worst-Case Break
"... Cryptography based on noncommutative algebra still suffers from lack of schemes and lack of interest. In this work, we show new constructions of cryptosystems based on group invariants and suggest methods to make such cryptosystems secure in practice. Cryptographers still cannot prove security in i ..."
Abstract

Cited by 3 (1 self)
Cryptography based on noncommutative algebra still suffers from a lack of schemes and a lack of interest. In this work, we show new constructions of cryptosystems based on group invariants and suggest methods to make such cryptosystems secure in practice. Cryptographers still cannot prove security in its cryptographic sense, or even reduce it to some statement about regular complexity classes. In this paper we introduce a new notion of cryptographic security, a provable break, and prove that cryptosystems based on matrix group invariants, and also a variation of the Anshel-Anshel-Goldfeld key agreement protocol for modular groups, are secure against provable worst-case break unless NP ⊆ RP.
Perfectly Secure Encryption of Individual Sequences
"... In analogy to the well–known notion of finite–state compressibility of individual sequences, due to Lempel and Ziv, we define a similar notion of “finite–state encryptability ” of an individual plaintext sequence, as the minimum asymptotic key rate that must be consumed by finite–state encrypters so ..."
Abstract

Cited by 1 (1 self)
In analogy to the well-known notion of finite-state compressibility of individual sequences, due to Lempel and Ziv, we define a similar notion of “finite-state encryptability” of an individual plaintext sequence, as the minimum asymptotic key rate that must be consumed by finite-state encrypters so as to guarantee perfect secrecy in a well-defined sense. Our main basic result is that the finite-state encryptability is equal to the finite-state compressibility for every individual sequence. This parallels Shannon’s classical probabilistic counterpart result, asserting that the minimum required key rate is equal to the entropy rate of the source. However, the redundancy, defined as the gap between the upper bound (direct part) and the lower bound (converse part) in the encryption problem, turns out to decay at a different rate (in fact, much more slowly) than the analogous redundancy associated with the compression problem. We also extend our main theorem in several directions, allowing: (i) availability of side information (SI) at the encrypter/decrypter/eavesdropper, (ii) lossy reconstruction at the decrypter, and (iii) the combination of both lossy reconstruction and SI, in the spirit of the Wyner-Ziv problem. Index Terms: Information-theoretic security, Shannon’s cipher system, secret key, perfect secrecy, individual sequences, finite-state machine, compressibility, incremental parsing, Lempel-
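The finite-state compressibility that this result parallels is defined via Lempel-Ziv incremental parsing: the sequence is split into distinct phrases, each a previously seen phrase extended by one symbol, and the phrase count gauges compressibility. A minimal sketch of that parsing (the helper `lz78_phrases` is illustrative, not from the paper):

```python
# Sketch of Lempel-Ziv (LZ78) incremental parsing: split the sequence
# into distinct phrases, each equal to a previously seen phrase
# extended by one new symbol.

def lz78_phrases(s):
    """Return the incremental (LZ78) parsing of s into phrases."""
    seen, phrases, current = set(), [], ""
    for ch in s:
        current += ch
        if current not in seen:        # new phrase: record it and restart
            seen.add(current)
            phrases.append(current)
            current = ""
    if current:                         # trailing (possibly repeated) phrase
        phrases.append(current)
    return phrases

print(lz78_phrases("aaabbabaabb"))  # ['a', 'aa', 'b', 'ba', 'baa', 'bb']
```

Fewer phrases relative to the sequence length means a more compressible sequence, and, by the paper's main result, a lower key rate needed for perfect secrecy.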
Algebraic cryptography: new constructions and their security against provable break
, 2008
"... Very few known cryptographic primitives are based on noncommutative algebra. Each new scheme is of substantial interest, because noncommutative constructions are secure against many standard cryptographic attacks. On the other hand, cryptography does not provide security proofs that would allow to b ..."
Abstract
Very few known cryptographic primitives are based on noncommutative algebra. Each new scheme is of substantial interest, because noncommutative constructions are secure against many standard cryptographic attacks. On the other hand, cryptography does not provide security proofs that would allow one to base the security of a cryptographic primitive on structural complexity assumptions. Thus, it is important to investigate weaker notions of security. In this paper we introduce new constructions of cryptographic primitives based on group invariants and offer new ways to strengthen them for practical use. Besides, we introduce the notion of a provable break, which is a weaker version of the regular cryptographic break: in this version, an adversary must have a proof that he has correctly deciphered the message. We prove that cryptosystems based on matrix group invariants and a version of the Anshel-Anshel-Goldfeld key agreement protocol for modular groups are secure against provable break unless NP = RP.
A New View on Worst-Case to Average-Case Reductions for NP Problems
, 2014
"... ar ..."
(Show Context)