Results 11–20 of 1,395,384
Decreasing Loss Probabilities by Redundancy and Interleaving: An Analytical Study
In Proceedings of the 18th International Teletraffic Congress
"... This paper studies a forward error correction (FEC) scheme that reduces loss probabilities of messages, based on adding redundant packets and interleaving, as proposed in [17]. We first show that when standard redundancy schemes are used, losses of messages occur due to the following locality phenom ..."
Cited by 1 (0 self)
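The interleaving idea can be sketched in a few lines: append an XOR parity packet to each block of data packets, then transmit blocks column-wise so a burst loss touches each block at most once. This is a toy illustration with hypothetical two-byte packets, not the scheme of [17]:

```python
# Toy FEC sketch: XOR parity per block, plus block interleaving.
from functools import reduce

def add_parity(block):
    """Append one XOR parity packet to a block of equal-length byte packets."""
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), block)
    return block + [parity]

def interleave(blocks):
    """Transmit column-wise so a burst loss hits at most one packet per block."""
    return [pkt for col in zip(*blocks) for pkt in col]

blocks = [add_parity([bytes([i, j]) for j in range(3)]) for i in range(2)]
wire = interleave(blocks)
# Consecutive packets on the wire come from different blocks, so a short
# burst loss costs each block at most one packet, which the parity recovers.
```

The point of the interleaving step is visible in `wire`: adjacent transmissions belong to different blocks.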
A New Statistical Parser Based on Bigram Lexical Dependencies
, 1996
"... This paper describes a new statistical parser which is based on probabilities of dependencies between headwords in the parse tree. Standard bigram probability estimation techniques are extended to calculate probabilities of dependencies between pairs of words. Tests using Wall Street Journal ..."
Cited by 491 (4 self)
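As a rough illustration (smoothing and the parser's actual decomposition are omitted, and the numbers are invented), dependency probabilities can be estimated by relative frequency over (head, modifier) pairs:

```python
# MLE estimate of P(modifier | head) from observed dependency pairs.
from collections import Counter

pairs = [("bought", "IBM"), ("bought", "shares"), ("bought", "IBM")]
pair_counts = Counter(pairs)
head_counts = Counter(h for h, _ in pairs)

def p_dep(head, modifier):
    """Relative-frequency estimate of P(modifier | head)."""
    return pair_counts[(head, modifier)] / head_counts[head]

# p_dep("bought", "IBM") is 2/3 on this toy corpus.
```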
The Capacity of Low-Density Parity-Check Codes Under Message-Passing Decoding
, 2001
"... In this paper, we present a general method for determining the capacity of low-density parity-check (LDPC) codes under message-passing decoding when used over any binary-input memoryless channel with discrete or continuous output alphabets. Transmitting at rates below this capacity, a randomly chos ..."
Cited by 569 (9 self)
... exponentially fast in the length of the code with arbitrarily small loss in rate.) Conversely, transmitting at rates above this capacity, the probability of error is bounded away from zero by a strictly positive constant which is independent of the length of the code and of the number of iterations performed ...
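The paper's density-evolution analysis is well beyond a snippet, but the flavor of iterative decoding on a parity-check matrix can be shown with a toy hard-decision bit-flipping decoder (our substitute for the general message-passing decoders the paper analyzes; the matrix is hand-made):

```python
# Toy bit-flipping decoder over a small hand-made parity-check matrix H.
H = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]

def bit_flip_decode(word, max_iters=10):
    word = list(word)
    for _ in range(max_iters):
        # For each bit, count how many parity checks involving it fail.
        fails = [sum(row[i] and sum(r * w for r, w in zip(row, word)) % 2
                     for row in H) for i in range(len(word))]
        if max(fails) == 0:
            break                      # all checks satisfied: a codeword
        word[fails.index(max(fails))] ^= 1   # flip the most-suspect bit
    return word

# [1,1,0,0,1,1] satisfies H; flipping its first bit is corrected:
decoded = bit_flip_decode([0, 1, 0, 0, 1, 1])
```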
Loss Probability Calculations and Asymptotic Analysis for Finite Buffer Multiplexers
, 2001
"... In this paper, we propose an approximation for the loss probability in a finite buffer system with a given buffer size. Our study is motivated by the case of a high-speed network where a large number of sources are expected to be multiplexed. Hence, by appealing to Central Limit Theorem type of argum ..."
Cited by 47 (4 self)
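The Central Limit Theorem argument suggests a Gaussian tail estimate; the following back-of-the-envelope sketch (with invented numbers, and far cruder than the paper's approximation) shows the shape of such a calculation:

```python
# Gaussian tail estimate of overflow probability for N aggregated sources.
import math

def gaussian_overflow_prob(n_sources, mu, var, capacity):
    """P(aggregate demand > capacity) under a normal approximation."""
    mean = n_sources * mu
    std = math.sqrt(n_sources * var)
    z = (capacity - mean) / std
    return 0.5 * math.erfc(z / math.sqrt(2))   # upper normal tail

# e.g. 1000 sources of mean 1.0 and variance 0.25, capacity 1050:
p = gaussian_overflow_prob(1000, 1.0, 0.25, 1050)
```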
The Macroscopic Behavior of the TCP Congestion Avoidance Algorithm
, 1997
"... In this paper, we analyze a performance model for the TCP Congestion Avoidance algorithm. The model predicts the bandwidth of a sustained TCP connection subjected to light to moderate packet losses, such as loss caused by network congestion. It assumes that TCP avoids retransmission timeouts and alw ..."
Cited by 648 (18 self)
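The model's headline result is commonly quoted as a square-root law: sustained throughput is roughly (MSS/RTT) * C / sqrt(p) for loss rate p, with C about sqrt(3/2) under periodic-loss assumptions. A direct transcription (the example numbers are ours):

```python
# Square-root throughput law for TCP congestion avoidance.
import math

def tcp_bandwidth(mss_bytes, rtt_s, loss_rate, c=math.sqrt(1.5)):
    """Approximate sustained TCP throughput in bytes per second."""
    return (mss_bytes / rtt_s) * c / math.sqrt(loss_rate)

# 1460-byte segments, 100 ms RTT, 1% loss: roughly 180 KB/s.
bw = tcp_bandwidth(1460, 0.1, 0.01)
```

Note how halving the loss rate only raises throughput by a factor of sqrt(2), which is the macroscopic behavior the title refers to.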
A High-Throughput Path Metric for Multi-Hop Wireless Routing
, 2003
"... This paper presents the expected transmission count metric (ETX), which finds high-throughput paths on multi-hop wireless networks. ETX minimizes the expected total number of packet transmissions (including retransmissions) required to successfully deliver a packet to the ultimate destination. The E ..."
Cited by 1078 (5 self)
... The ETX metric incorporates the effects of link loss ratios, asymmetry in the loss ratios between the two directions of each link, and interference among the successive links of a path. In contrast, the minimum hop-count metric chooses arbitrarily among the different paths of the same minimum length ...
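The metric itself is compact to state: a link's ETX is 1 / (d_f * d_r) for forward and reverse delivery ratios d_f and d_r, and a path's ETX is the sum over its links (the example ratios below are ours):

```python
# ETX: expected transmission count per link and per path.
def link_etx(d_f, d_r):
    """1 / (forward delivery ratio * reverse delivery ratio)."""
    return 1.0 / (d_f * d_r)

def path_etx(links):
    """links: list of (forward_ratio, reverse_ratio) pairs."""
    return sum(link_etx(df, dr) for df, dr in links)

# A mildly lossy 2-hop path can beat a lossier single hop,
# which minimum hop-count would have preferred:
two_hop = path_etx([(0.9, 0.9), (0.9, 0.9)])   # ~2.47
one_hop = path_etx([(0.6, 0.6)])               # ~2.78
```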
Bayes Factors
, 1995
"... In a 1935 paper, and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null ..."
Cited by 1766 (74 self)
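The mechanics are simple to state: the Bayes factor multiplies the prior odds of the null hypothesis to give its posterior odds (the numbers below are illustrative, not from the paper):

```python
# Posterior probability of H0 from a Bayes factor and a prior.
def posterior_prob_h0(bayes_factor, prior_h0=0.5):
    """Posterior odds = Bayes factor * prior odds; convert back to a probability."""
    prior_odds = prior_h0 / (1 - prior_h0)
    post_odds = bayes_factor * prior_odds
    return post_odds / (1 + post_odds)

# With even prior odds and a Bayes factor of 1/10 against H0:
p = posterior_prob_h0(0.1)   # ~0.091
```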
Graphical models, exponential families, and variational inference
, 2008
"... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fiel ..."
Cited by 800 (26 self)
... of probability distributions — are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing ...
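One concrete instance of the cumulant-function machinery the abstract alludes to (our example, not the paper's): for a Bernoulli distribution in natural parameters, the cumulant function is A(theta) = log(1 + exp(theta)), and differentiating it yields the mean parameter, sigmoid(theta):

```python
# Bernoulli cumulant (log-partition) function and its mean-parameter map.
import math

def A(theta):
    """Log-partition function of a Bernoulli in natural parameters."""
    return math.log(1.0 + math.exp(theta))

def mean_param(theta, h=1e-6):
    """Numerical A'(theta), which equals the Bernoulli mean sigmoid(theta)."""
    return (A(theta + h) - A(theta - h)) / (2 * h)

# mean_param(0.0) is ~0.5, matching sigmoid(0).
```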
Exploiting Generative Models in Discriminative Classifiers
 In Advances in Neural Information Processing Systems 11
, 1998
"... Generative probability models such as hidden Markov models provide a principled way of treating missing information and dealing with variable length sequences. On the other hand, discriminative methods such as support vector machines enable us to construct flexible decision boundaries and often resu ..."
Cited by 538 (11 self)
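The combination described maps each example to the gradient of the generative model's log-likelihood (the Fisher score) and hands that vector to a discriminative classifier. A toy single-parameter Bernoulli version, purely as our illustration:

```python
# Fisher scores of a Bernoulli model and the induced kernel between sequences.
def fisher_score(x, p=0.5):
    """d/dp log P(x; p) for a single Bernoulli observation x in {0, 1}."""
    return x / p - (1 - x) / (1 - p)

def fisher_kernel(xs, ys, p=0.5):
    """Inner product of per-position Fisher scores of two bit sequences."""
    return sum(fisher_score(a, p) * fisher_score(b, p) for a, b in zip(xs, ys))

# Agreeing bits contribute positively, disagreeing bits negatively:
k = fisher_kernel([1, 0, 1], [1, 1, 1])   # 4 - 4 + 4 = 4
```

Any kernel classifier (an SVM, say) can then operate on these scores, which is the bridge between the two model families the abstract describes.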
TCP Vegas: New techniques for congestion detection and avoidance
 In SIGCOMM
, 1994
"... Vegas is a new implementation of TCP that achieves between 40 and 70% better throughput, with one-fifth to one-half the losses, as compared to the implementation of TCP in the Reno distribution of BSD Unix. This paper motivates and describes the three key techniques employed by Vegas, and presents th ..."
Cited by 592 (3 self)
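Vegas' congestion-avoidance rule is often summarized as comparing the expected rate (window / base RTT) with the actual rate and keeping the estimated backlog inside an [alpha, beta] band. A sketch, using the commonly cited default thresholds rather than anything stated in this listing:

```python
# Vegas-style window adjustment from expected vs. actual sending rate.
def vegas_adjust(cwnd, base_rtt, rtt, alpha=1, beta=3):
    expected = cwnd / base_rtt
    actual = cwnd / rtt
    diff = (expected - actual) * base_rtt   # estimated packets queued in network
    if diff < alpha:
        return cwnd + 1    # under-using the path: grow linearly
    if diff > beta:
        return cwnd - 1    # queue building: back off before a loss occurs
    return cwnd            # inside the band: hold steady

# cwnd=10, base RTT 100 ms, current RTT 150 ms -> backlog ~3.3 -> shrink.
new_cwnd = vegas_adjust(10, 0.100, 0.150)
```

The proactive decrease before any loss is the contrast with Reno, which only reacts after packets are dropped.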