Results 1–10 of 2,244,435
MAFFT version 5: improvement in accuracy of multiple sequence alignment
Nucleic Acids Research, 2005
The accuracy of the multiple sequence alignment program MAFFT has been improved. The new version (5.3) of MAFFT offers new iterative refinement options, H-INS-i, F-INS-i and G-INS-i, in which pairwise alignment information is incorporated into the objective function. These new options of MAFFT showed high ...
Cited by 801 (5 self)
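The new objective incorporates pairwise alignment information; the standard pairwise building block behind such objectives is Needleman-Wunsch global alignment, sketched below with textbook scoring (match +1, mismatch -1, gap -1). This is the generic algorithm, not MAFFT's implementation.

```python
# Needleman-Wunsch global alignment score via dynamic programming.
# Textbook scoring parameters; illustrative, not MAFFT's own scheme.

def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    n, m = len(a), len(b)
    # S[i][j] = best score aligning prefixes a[:i] and b[:j].
    S = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        S[i][0] = i * gap                  # align a[:i] against all gaps
    for j in range(1, m + 1):
        S[0][j] = j * gap                  # align b[:j] against all gaps
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            S[i][j] = max(S[i - 1][j - 1] + sub,   # align a[i-1] with b[j-1]
                          S[i - 1][j] + gap,       # gap in b
                          S[i][j - 1] + gap)       # gap in a
    return S[n][m]

print(needleman_wunsch("GATTACA", "GCATGCU"))   # -> 0
```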
Optimal Aggregation Algorithms for Middleware
In PODS, 2001
Assume that each object in a database has m grades, or scores, one for each of m attributes. For example, an object can have a color grade, that tells how red it is, and a shape grade, that tells how round it is. For each attribute, there is a sorted list, which lists each object and its grade under ...
Cited by 714 (4 self)
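The algorithm this abstract introduces (the threshold algorithm, TA) can be sketched compactly: do sorted access in parallel over the m lists, random-access each newly seen object's other grades, and stop once the k-th best aggregate score reaches the threshold formed by aggregating the last grade seen in each list. Everything below is an illustrative reconstruction: the dict-based inputs, the function name, and the min aggregate are assumptions, not the paper's notation.

```python
# Threshold algorithm (TA) sketch for top-k aggregation over m graded lists.

def threshold_algorithm(lists, k, agg=min):
    """lists: one dict {object: grade} per attribute; returns top-k (object, score)."""
    # Sorted-access views: each attribute in descending order of grade.
    sorted_views = [sorted(d.items(), key=lambda kv: -kv[1]) for d in lists]
    seen = {}                                   # object -> aggregate score
    depth = 0
    while True:
        last_grades = []
        for view in sorted_views:
            obj, grade = view[depth]            # sorted access
            last_grades.append(grade)
            if obj not in seen:
                # Random access: fetch this object's grade in every list.
                seen[obj] = agg(lst[obj] for lst in lists)
        top = sorted(seen.items(), key=lambda kv: -kv[1])[:k]
        threshold = agg(last_grades)            # best possible unseen score
        if len(top) == k and top[-1][1] >= threshold:
            return top
        depth += 1

# Example: two attributes (say, "color" and "shape" grades).
color = {"a": 0.9, "b": 0.8, "c": 0.1}
shape = {"a": 0.2, "b": 0.7, "c": 0.9}
print(threshold_algorithm([color, shape], k=1))   # -> [('b', 0.7)]
```

With the min aggregate the stopping rule is safe because no unseen object can beat the aggregate of the last grades seen under sorted access.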
Advances in Prospect Theory: Cumulative Representation of Uncertainty
Journal of Risk and Uncertainty, 5:297–323, 1992
We develop a new version of prospect theory that employs cumulative rather than separable decision weights and extends the theory in several respects. This version, called cumulative prospect theory, applies to uncertain as well as to risky prospects with any number of outcomes, and it allows different ...
Cited by 1710 (17 self)
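The cumulative weighting scheme can be made concrete for a gains-only prospect. Below is a minimal sketch using the functional forms proposed in this paper (a power value function and an inverse-S weighting function); the parameter values alpha = 0.88 and gamma = 0.61 are the median estimates the paper reports, and the function names are illustrative.

```python
# Cumulative prospect theory value of a gains-only prospect.

def w(p, gamma=0.61):
    """Inverse-S probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_value(prospect, alpha=0.88, gamma=0.61):
    """prospect: list of (gain, prob), gains >= 0, probs summing to 1.
    Each outcome's decision weight is a *difference* of weighted cumulative
    probabilities (the cumulative, not separable, weights of the title)."""
    # Rank gains best-to-worst so cumulative probs mean "at least this good".
    ranked = sorted(prospect, key=lambda op: -op[0])
    total, cum = 0.0, 0.0
    for gain, prob in ranked:
        pi = w(cum + prob, gamma) - w(cum, gamma)   # cumulative decision weight
        total += pi * gain**alpha
        cum += prob
    return total

# Certainty of 100 vs. a 50/50 gamble over 200 or 0:
sure = cpt_value([(100, 1.0)])
gamble = cpt_value([(200, 0.5), (0, 0.5)])
assert sure > gamble   # the sure thing is preferred (risk aversion for gains)
```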
Common Risk Factors in the Returns on Stocks and Bonds
Journal of Financial Economics, 1993
This paper identifies five common risk factors in the returns on stocks and bonds. There are three stock-market factors: an overall market factor and factors related to firm size and book-to-market equity. There are two bond-market factors, related to maturity and default risks. Stock returns have shared variation due to the stock-market factors, and they are linked to bond returns through shared variation in the bond-market factors. Except for low-grade corporates, the bond-market factors capture the common variation in bond returns. Most important, the five factors seem to explain average ...
Cited by 2214 (33 self)
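The paper's tests are time-series regressions of portfolio returns on the common factors. A single-factor sketch of that mechanic (market factor only, with made-up monthly numbers; the paper uses five factors and real return data):

```python
# Simple time-series factor regression: portfolio return = alpha + beta * factor.

def ols_slope_intercept(x, y):
    """Closed-form simple OLS fit of y = a + b*x + e; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical monthly excess returns (%): a market factor and a portfolio.
mkt = [1.0, -2.0, 3.0, 0.5, -1.5]
port = [1.2, -2.8, 3.7, 0.45, -2.05]
alpha, beta = ols_slope_intercept(mkt, port)
print(round(beta, 2))   # -> 1.29 (the portfolio's loading on the factor)
```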
Understanding Normal and Impaired Word Reading: Computational Principles in Quasi-Regular Domains
Psychological Review, 1996
We develop a connectionist approach to processing in quasi-regular domains, as exemplified by English word reading. A consideration of the shortcomings of a previous implementation (Seidenberg & McClelland, 1989, Psych. Rev.) in reading nonwords leads to the development of orthographic and phonological representations that capture better the relevant structure among the written and spoken forms of words. In a number of simulation experiments, networks using the new representations learn to read both regular and exception words, including low-frequency exception words, and yet are still able ...
Cited by 609 (94 self)
Eliciting self-explanations improves understanding
Cognitive Science, 1994
Learning involves the integration of new information into existing knowledge. Generating explanations to oneself (self-explaining) facilitates that integration process. Previously, self-explanation has been shown to improve the acquisition of problem-solving skills when studying worked-out examples. This study extends that finding, showing that self-explanation can also be facilitative when it is explicitly promoted, in the context of learning declarative knowledge from an expository text. Without any extensive training, 14 eighth-grade students were merely asked to self-explain after reading each line ...
Cited by 572 (22 self)
A fast learning algorithm for deep belief nets
Neural Computation, 2006
We show how to use "complementary priors" to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. Using complementary priors, we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory. The fast, greedy algorithm is used to initialize a slower learning procedure that fine-tunes the weights using a contrastive version of the wake-sleep algorithm. After fine-tuning, a network with three hidden layers forms a ...
Cited by 968 (49 self)
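The greedy procedure trains one layer at a time as a restricted Boltzmann machine; each trained layer's hidden activities then serve as data for the next. A toy, mean-field CD-1 sketch of one such layer follows; the sizes, single training pattern, and bias-free parameterization are simplifications for illustration, not the paper's setup.

```python
import math, random

# One RBM layer trained with a mean-field version of contrastive divergence.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def up(W, v):
    """Hidden-unit probabilities given visible activities."""
    n_v, n_h = len(W), len(W[0])
    return [sigmoid(sum(v[i] * W[i][j] for i in range(n_v))) for j in range(n_h)]

def down(W, h):
    """Visible reconstruction probabilities given hidden activities."""
    n_v, n_h = len(W), len(W[0])
    return [sigmoid(sum(h[j] * W[i][j] for j in range(n_h))) for i in range(n_v)]

def cd1_step(W, v0, lr=0.1):
    """One CD-1 weight update: data-driven minus reconstruction-driven stats."""
    h0 = up(W, v0)          # positive phase: driven by the data
    v1 = down(W, h0)        # one reconstruction step
    h1 = up(W, v1)          # negative phase: driven by the reconstruction
    for i in range(len(W)):
        for j in range(len(W[0])):
            W[i][j] += lr * (v0[i] * h0[j] - v1[i] * h1[j])

def recon_error(W, v):
    return sum((vi - ri) ** 2 for vi, ri in zip(v, down(W, up(W, v))))

random.seed(0)
pattern = [1, 0, 1, 0]
W = [[random.uniform(-0.1, 0.1) for _ in range(2)] for _ in range(4)]
before = recon_error(W, pattern)
for _ in range(200):
    cd1_step(W, pattern)
after = recon_error(W, pattern)
assert after < before    # the layer learns to reconstruct its input
```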
Fully homomorphic encryption using ideal lattices
In Proc. STOC, 2009
We propose a fully homomorphic encryption scheme, i.e., a scheme that allows one to evaluate circuits over encrypted data without being able to decrypt. Our solution comes in three steps. First, we provide a general result: to construct an encryption scheme that permits evaluation of arbitrary circuits, it suffices to construct an encryption scheme that can evaluate (slightly augmented versions of) its own decryption circuit; we call a scheme that can evaluate its (augmented) decryption circuit bootstrappable. Next, we describe a public key encryption scheme using ideal lattices ...
Cited by 664 (17 self)
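The ideal-lattice construction itself is too involved to sketch here, but the "somewhat homomorphic" starting point can be illustrated with a toy symmetric scheme over the integers, in the style of later simplifications of this line of work rather than the paper's own scheme: ciphertexts carry noise, and adding or multiplying ciphertexts computes XOR or AND of the underlying bits until the noise outgrows the key.

```python
import random

# Toy noise-based homomorphic encryption of single bits (illustrative only;
# symmetric, toy-sized key, and NOT the paper's ideal-lattice construction).

random.seed(1)
p = 1000003                        # secret odd modulus (the key)

def enc(m):                        # m is a bit
    q = random.randrange(1, 10**6)
    r = random.randrange(0, 50)    # small noise: 2*r + m is far below p
    return p * q + 2 * r + m

def dec(c):
    return (c % p) % 2             # strip the key multiple, then the noise

a, b = enc(1), enc(0)
assert dec(a + b) == 1             # ciphertext addition = XOR of the bits
assert dec(a * b) == 0             # ciphertext multiplication = AND
assert dec(enc(1) * enc(1)) == 1
```

Multiplication roughly squares the noise term, so only circuits of limited depth decrypt correctly; bootstrapping, as the abstract explains, is what removes that depth limit.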
Segmentation of brain MR images through a hidden Markov random field model and the expectation-maximization algorithm
IEEE Transactions on Medical Imaging, 2001
The finite mixture (FM) model is the most commonly used model for statistical segmentation of brain magnetic resonance (MR) images because of its simple mathematical form and the piecewise constant nature of ideal brain MR images. However, being a histogram-based model, the FM has an intrinsic limitation: no spatial information is taken into account. This causes the FM model to work only on well-defined images with low levels of noise; unfortunately, this is often not the case due to artifacts such as partial volume effect and bias field distortion. Under these conditions, the FM model ...
Cited by 637 (14 self)
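The FM model the abstract criticizes is, in its simplest form, an intensity-only Gaussian mixture fitted by EM. A two-class, 1-D sketch follows (illustrative intensities and parameters, with equal class weights and a shared variance for brevity); note that no pixel coordinates appear anywhere, which is exactly the spatial blindness the paper's hidden-MRF model addresses.

```python
import math

# EM for a two-class 1-D Gaussian mixture over pixel intensities (an "FM" model).

def em_gmm2(xs, mu, sigma=10.0, n_iter=30):
    """mu: initial [mu0, mu1]; returns the fitted class means."""
    mu = list(mu)
    for _ in range(n_iter):
        # E-step: responsibility of class 1 for each intensity.
        r = []
        for x in xs:
            p0 = math.exp(-((x - mu[0]) ** 2) / (2 * sigma ** 2))
            p1 = math.exp(-((x - mu[1]) ** 2) / (2 * sigma ** 2))
            r.append(p1 / (p0 + p1))
        # M-step: update class means from the soft assignments.
        mu[0] = sum((1 - ri) * x for ri, x in zip(r, xs)) / sum(1 - ri for ri in r)
        mu[1] = sum(ri * x for ri, x in zip(r, xs)) / sum(r)
    return mu

# Two intensity clusters (say, two tissue classes).
intensities = [30, 32, 28, 31, 70, 72, 68, 71]
mu = em_gmm2(intensities, mu=[20.0, 80.0])
print(mu)   # the means converge near 30 and 70
```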
Loopy belief propagation for approximate inference: An empirical study
In Proceedings of Uncertainty in AI, 1999
Recently, researchers have demonstrated that "loopy belief propagation", the use of Pearl's polytree algorithm in a Bayesian network with loops, can perform well in the context of error-correcting codes. The most dramatic instance of this is the near-Shannon-limit performance ... the "belief revision" version (Weiss) ... the case of networks with multiple loops (Richardson) ... To summarize, what is currently known about loopy propagation is that (1) it works very well in an error-correcting code setting and (2) there are conditions for a single-loop network for which it can be guaranteed ...
Cited by 674 (15 self)
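The object of study can be sketched on the smallest loopy graph: sum-product message passing on a 3-node binary cycle, compared against brute-force marginals. The graph, potentials, and synchronous schedule below are illustrative, not the study's networks.

```python
from itertools import product

# "Loopy" sum-product on a 3-node binary cycle with agreement potentials.
phi = {0: [2.0, 1.0], 1: [1.0, 1.0], 2: [1.0, 1.0]}   # unary potentials
psi = [[2.0, 1.0], [1.0, 2.0]]                         # pairwise (agreement)
edges = [(0, 1), (1, 2), (2, 0)]
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

# msgs[(i, j)][x_j]: message from i to j, initialised uniform.
msgs = {(i, j): [1.0, 1.0] for i in nbrs for j in nbrs[i]}

for _ in range(50):                                    # iterate to a fixed point
    new = {}
    for (i, j) in msgs:
        m = []
        for xj in (0, 1):
            s = 0.0
            for xi in (0, 1):
                prod_in = 1.0
                for k in nbrs[i]:
                    if k != j:                         # exclude the recipient
                        prod_in *= msgs[(k, i)][xi]
                s += phi[i][xi] * psi[xi][xj] * prod_in
            m.append(s)
        z = sum(m)
        new[(i, j)] = [v / z for v in m]
    msgs = new

def belief(i):
    b = [phi[i][x] for x in (0, 1)]
    for k in nbrs[i]:
        b = [b[x] * msgs[(k, i)][x] for x in (0, 1)]
    z = sum(b)
    return [v / z for v in b]

def exact(i):
    """True marginal of node i by brute-force enumeration."""
    tot = [0.0, 0.0]
    for assign in product((0, 1), repeat=3):
        w = phi[0][assign[0]] * phi[1][assign[1]] * phi[2][assign[2]]
        for (a, b) in edges:
            w *= psi[assign[a]][assign[b]]
        tot[assign[i]] += w
    z = sum(tot)
    return [v / z for v in tot]

print(belief(0), exact(0))   # loopy estimate vs. true marginal
```

On this weakly coupled single loop the messages converge and the beliefs land close to the exact marginals, consistent with the guaranteed single-loop conditions the abstract mentions.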