Results 1 – 10 of 1,591,881
Fuzzy extractors: How to generate strong keys from biometrics and other noisy data
 Yevgeniy Dodis, Leonid Reyzin, and Adam Smith. Technical Report 2003/235, Cryptology ePrint Archive, http://eprint.iacr.org. A previous version appeared at EUROCRYPT 2004
, 2006
Abstract

Cited by 532 (38 self)
We provide formal definitions and efficient secure techniques for
• turning noisy information into keys usable for any cryptographic application, and, in particular,
• reliably and securely authenticating biometric data.
Our techniques apply not just to biometric information, but to any keying material that, unlike traditional cryptographic keys, is (1) not reproducible precisely and (2) not distributed uniformly. We propose two primitives: a fuzzy extractor reliably extracts nearly uniform randomness R from its input; the extraction is error-tolerant in the sense that R will be the same even if the input changes, as long as it remains reasonably close to the original. Thus, R can be used as a key in a cryptographic application. A secure sketch produces public information about its input w that does not reveal w, and yet allows exact recovery of w given another value that is close to w. Thus, it can be used to reliably reproduce error-prone biometric inputs without incurring the security risk inherent in storing them. We define the primitives to be both formally secure and versatile, generalizing much prior work. In addition, we provide nearly optimal constructions of both primitives for various measures of “closeness” of input data, such as Hamming distance, edit distance, and set difference.
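The secure-sketch idea for Hamming distance can be illustrated with the code-offset construction: publish the input XORed with a random codeword, then decode away the noise at recovery time. The sketch below is a toy instance using a 3× repetition code; the function names and the choice of code are illustrative, not the paper's near-optimal constructions.

```python
import secrets

R = 3  # repetition factor of the toy error-correcting code

def _xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def _encode(m):
    # repetition code: each message bit becomes R identical code bits
    return [bit for bit in m for _ in range(R)]

def _decode(c):
    # majority vote per R-bit block corrects up to (R-1)//2 errors per block
    return [1 if sum(c[i:i + R]) > R // 2 else 0 for i in range(0, len(c), R)]

def sketch(w):
    """Public helper data for w (len(w) must be a multiple of R)."""
    m = [secrets.randbelow(2) for _ in range(len(w) // R)]
    return _xor(w, _encode(m))

def recover(w_noisy, s):
    """Exactly recover the original w from a close, noisy reading."""
    m = _decode(_xor(w_noisy, s))   # strip the sketch, decode the noise away
    return _xor(s, _encode(m))      # re-apply the sketch to get w back

w = [1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0]  # "enrollment" reading
s = sketch(w)                              # safe to store publicly
w_noisy = list(w)
w_noisy[0] ^= 1                            # one bit error in block 0
w_noisy[7] ^= 1                            # one bit error in block 2
print(recover(w_noisy, s) == w)            # True
```

Because the sketch is the input offset by a uniformly random codeword, it reveals nothing about which codeword was used, yet any reading within the code's error-correction radius decodes back to the exact original.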
ON A STRONG VERSION OF THE KEPLER CONJECTURE
, 2013
Abstract
We raise and investigate the following problem, which one can regard as a very close relative of the densest sphere packing problem. If the Euclidean 3-space is partitioned into convex cells each containing a unit ball, how should the shapes of the cells be designed to minimize the average surface area of the cells? In particular, we prove that the average surface area in question is always at least 24/√3 = 13.8564....
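The radical in the stated bound was garbled by extraction; reading the constant as 24/√3, which is the value whose decimal expansion matches 13.8564..., a one-line numeric check:

```python
import math

# lower bound on the average cell surface area, read as 24 / sqrt(3)
bound = 24 / math.sqrt(3)
print(round(bound, 4))  # 13.8564
```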
Network Time Protocol (Version 3) Specification, Implementation and Analysis
, 1992
Abstract

Cited by 522 (18 self)
Note: This document consists of an approximate rendering in ASCII of the PostScript document of the same name. It is provided for convenience and for use in searches, etc. However, most tables, figures, equations and captions have not been rendered, and the pagination and section headings are not available. This document describes the Network Time Protocol (NTP), specifies its formal structure and summarizes information useful for its implementation. NTP provides the mechanisms to synchronize time and coordinate time distribution in a large, diverse internet operating at rates from mundane to lightwave. It uses a returnable-time design in which a distributed subnet of time servers operating in a self-organizing, hierarchical-master-slave configuration synchronizes local clocks within the subnet and to national time standards via wire or radio. The servers can also redistribute reference time via local routing algorithms and time daemons.
Status of this Memo: This RFC specifies an IAB standards track protocol for the Internet community and requests discussion and suggestions for improvements. Please refer to the current edition of the "IAB Official Protocol Standards" for the standardization state and status of this protocol. Distribution of this memo is unlimited.
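The client half of the exchange this specification describes is compact enough to sketch offline: a version-3 client request is a 48-byte packet whose first byte packs the leap indicator, version number, and mode fields, and the server's transmit timestamp sits at offset 40 as seconds since 1900. The helper names below are illustrative, not from the RFC.

```python
import struct

NTP_EPOCH_OFFSET = 2208988800  # seconds from 1900-01-01 (NTP era) to 1970-01-01 (Unix)

def build_client_request(version=3):
    # first byte: LI=0 (no leap warning), VN=version, Mode=3 (client)
    first = (0 << 6) | (version << 3) | 3
    return bytes([first]) + bytes(47)  # 48-byte NTP header, remaining fields zeroed

def transmit_timestamp(packet):
    """Extract the transmit timestamp (bytes 40-47) as Unix seconds."""
    secs, frac = struct.unpack("!II", packet[40:48])
    return secs - NTP_EPOCH_OFFSET + frac / 2**32

req = build_client_request()
print(len(req), hex(req[0]))  # 48 0x1b
```

Sending `req` over UDP to port 123 and passing the 48-byte reply to `transmit_timestamp` would yield the server's clock reading; the parsing logic itself needs no network access to verify.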
Leibniz Filters and the Strong Version of a Protoalgebraic Logic
Abstract

Cited by 7 (3 self)
A filter of a sentential logic S is Leibniz when it is the smallest one among all the S-filters on the same algebra having the same Leibniz congruence. This paper studies these filters and the sentential logic S⁺ defined by the class of all S-matrices whose filter is Leibniz, which is called the strong version of S, in the context of protoalgebraic logics with theorems. Topics studied include an enhanced Correspondence Theorem, characterizations of the weak algebraizability of S⁺ and of the explicit definability of Leibniz filters, and several transfer theorems.
A Theory of Focus Interpretation
Abstract

Cited by 462 (6 self)
More or less final version. To appear in Natural Language Semantics. According to the alternative semantics for focus, the semantic reflex of intonational focus is a second semantic value, which in the case of a sentence is a set of propositions. We examine a range of semantic and pragmatic applications of the theory, and extract a unitary principle specifying how the focus semantic value interacts with semantic and pragmatic processes. A strong version of the theory has the effect of making lexical or construction-specific stipulation of a focus-related effect in association with focus ...
Depth first search and linear graph algorithms
 SIAM JOURNAL ON COMPUTING
, 1972
Abstract

Cited by 1384 (19 self)
The value of depth-first search or "backtracking" as a technique for solving problems is illustrated by two examples. An improved version of an algorithm for finding the strongly connected components of a directed graph and an algorithm for finding the biconnected components of an undirected graph ...
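The strongly-connected-components algorithm this abstract refers to is Tarjan's single-pass depth-first search; a compact recursive sketch of it (an illustrative rendering, not the paper's original presentation):

```python
def strongly_connected_components(graph):
    """Tarjan's one-pass DFS. `graph` maps each node to its successors."""
    index = {}              # discovery order of each node
    lowlink = {}            # smallest index reachable from the node's DFS subtree
    stack, on_stack = [], set()
    sccs = []
    counter = [0]

    def dfs(v):
        index[v] = lowlink[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:              # tree edge: recurse
                dfs(w)
                lowlink[v] = min(lowlink[v], lowlink[w])
            elif w in on_stack:             # back/cross edge into current component
                lowlink[v] = min(lowlink[v], index[w])
        if lowlink[v] == index[v]:          # v is the root of a component
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in graph:
        if v not in index:
            dfs(v)
    return sccs

g = {1: [2], 2: [3], 3: [1], 4: [3, 5], 5: [4, 6], 6: []}
print(sorted(sorted(c) for c in strongly_connected_components(g)))
# [[1, 2, 3], [4, 5], [6]]
```

Each vertex and edge is visited once, giving the linear time bound that the paper's title advertises.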
Fairness and Retaliation: The Economics of Reciprocity
 JOURNAL OF ECONOMIC PERSPECTIVES
, 2000
Abstract

Cited by 553 (12 self)
This paper shows that reciprocity has powerful implications for many economic domains. It is an important determinant in the enforcement of contracts and social norms and greatly enhances the possibilities of collective action. Reciprocity may render the provision of explicit incentives inefficient because the incentives may crowd out voluntary cooperation. It strongly limits the effects of competition in markets with incomplete contracts and gives rise to noncompetitive wage differences. Finally, reciprocity is also a strong force contributing to the existence of incomplete contracts.
An evaluation of statistical approaches to text categorization
 Journal of Information Retrieval
, 1999
Abstract

Cited by 664 (23 self)
This paper focuses on a comparative evaluation of a wide range of text categorization methods, including previously published results on the Reuters corpus and new results of additional experiments. A controlled study using three classifiers, kNN, LLSF and WORD, was conducted to examine the impact of configuration variations in five versions of Reuters on the observed performance of classifiers. Analysis and empirical evidence suggest that the evaluation results on some versions of Reuters were significantly affected by the inclusion of a large portion of unlabelled documents, making those ...
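The kNN method compared in the study can be sketched in a few lines: represent each document as a bag of words, rank training documents by cosine similarity to the query, and take a majority vote over the k nearest. The tiny corpus, labels, and helper names below are illustrative stand-ins, not the Reuters data or the paper's exact configuration.

```python
import math
from collections import Counter

def cosine(a, b):
    # cosine similarity between two bag-of-words Counters
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_classify(train, text, k=3):
    """train: list of (text, label) pairs; returns the majority label
    among the k training documents most similar to `text`."""
    q = Counter(text.lower().split())
    ranked = sorted(train,
                    key=lambda d: cosine(Counter(d[0].lower().split()), q),
                    reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

train = [
    ("wheat prices rise on export news", "grain"),
    ("corn and wheat harvest estimates", "grain"),
    ("grain futures rally", "grain"),
    ("central bank raises interest rates", "money"),
    ("currency markets react to rate cut", "money"),
    ("dollar falls against yen", "money"),
]
print(knn_classify(train, "wheat harvest news"))  # grain
```

Real evaluations would add TF-IDF weighting and a held-out test split, but the voting logic is the same.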
SemiSupervised Learning Literature Survey
, 2006
Abstract

Cited by 757 (8 self)
We review the literature on semi-supervised learning, which is an area in machine learning and, more generally, artificial intelligence. There has been a whole spectrum of interesting ideas on how to learn from both labeled and unlabeled data, i.e. semi-supervised learning. This document is a chapter excerpt from the author’s doctoral thesis (Zhu, 2005). However, the author plans to update the online version frequently to incorporate the latest developments in the field. Please obtain the latest version at http://www.cs.wisc.edu/~jerryzhu/pub/ssl_survey.pdf
Principles and Practice in Second Language Acquisition
, 1982
Abstract

Cited by 717 (4 self)
This is the original version of Principles and Practice, as published in 1982, with only minor changes. It is gratifying to point out that many of the predictions made in this book were confirmed by subsequent research, for example, the superiority of comprehensible-input based methods and sheltered ...