Results 1 - 10 of 3,511
XORs in the air: practical wireless network coding
- In Proc. ACM SIGCOMM, 2006
"... This paper proposes COPE, a new architecture for wireless mesh networks. In addition to forwarding packets, routers mix (i.e., code) packets from different sources to increase the information content of each transmission. We show that intelligently mixing packets increases network throughput. Our de ..."
Cited by 548 (20 self)
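The mechanism behind the snippet is easy to sketch: a relay XORs packets bound for different next hops and broadcasts the result once, and each next hop recovers its own packet by XORing the broadcast with a packet it already knows. The Python fragment below is only a toy illustration of that decode step (the packet contents, the two-endpoint Alice/Bob setup, and the helper name xor_packets are made up, not COPE's code):

# Illustrative sketch of the XOR mixing idea (not COPE's implementation).
def xor_packets(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

p_a = b"hello from Alice"          # Alice's packet, destined for Bob
p_b = b"hi back from Bob"          # Bob's packet, destined for Alice

coded = xor_packets(p_a, p_b)      # the relay broadcasts this single coded packet

# Each endpoint XORs the broadcast with the packet it already knows (its own).
assert xor_packets(coded, p_a) == p_b   # Alice recovers Bob's packet
assert xor_packets(coded, p_b) == p_a   # Bob recovers Alice's packet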
Complete discrete 2-D Gabor transforms by neural networks for image analysis and compression
1988
"... A three-layered neural network is described for transforming two-dimensional discrete signals into generalized nonorthogonal 2-D “Gabor” representations for image analysis, segmentation, and compression. These transforms are conjoint spatial/spectral representations [10], [15], which provide a complete image description in terms of locally windowed 2-D spectral coordinates embedded within global 2-D spatial coordinates. Because intrinsic redundancies within images are extracted, the resulting image codes can be very compact. However, these conjoint transforms are inherently difficult to compute ..."
Cited by 478 (8 self)
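For context, a 2-D Gabor elementary function is a Gaussian envelope multiplying a complex sinusoid, and each conjoint coefficient is the projection of an image patch onto one such function. The sketch below only builds a single kernel with NumPy; it does not reproduce the paper's three-layered network for finding the nonorthogonal expansion coefficients, and the size, sigma, freq, and theta values are arbitrary assumptions:

import numpy as np

def gabor_kernel(size=31, sigma=6.0, freq=0.15, theta=0.0):
    """One 2-D Gabor elementary function: Gaussian envelope times a complex sinusoid."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    rotated = x * np.cos(theta) + y * np.sin(theta)     # axis of the carrier wave
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.exp(2j * np.pi * freq * rotated)

# One conjoint "coefficient": project an image patch onto the kernel.
patch = np.random.rand(31, 31)
coefficient = np.vdot(gabor_kernel(), patch)
print(abs(coefficient))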
Factor Graphs and the Sum-Product Algorithm
- IEEE Transactions on Information Theory, 1998
"... A factor graph is a bipartite graph that expresses how a "global" function of many variables factors into a product of "local" functions. Factor graphs subsume many other graphical models including Bayesian networks, Markov random fields, and Tanner graphs. Following one simple c ..."
Cited by 1791 (69 self)
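A toy factorization makes the idea concrete: if the global function factors as g(x1, x2, x3) = fA(x1, x2) * fB(x2, x3), the marginal of x1 can be obtained by pushing the sums inside the product, which is the local message-passing step the sum-product algorithm performs on the factor graph. The sketch below (binary variables, arbitrary factor values, not code from the paper) checks this against brute-force summation:

import numpy as np

# Toy factorization g(x1, x2, x3) = fA(x1, x2) * fB(x2, x3) over binary variables;
# the factor values are arbitrary.
fA = np.array([[1.0, 2.0],
               [0.5, 1.5]])          # fA[x1, x2]
fB = np.array([[2.0, 0.2],
               [1.0, 3.0]])          # fB[x2, x3]

# Brute force: sum the global function over x2 and x3.
brute = np.einsum('ab,bc->a', fA, fB)

# Sum-product view: fB sends a message to x2 (sum over x3),
# then fA combines it and sums over x2 to give the marginal of x1.
msg_fB_to_x2 = fB.sum(axis=1)
marginal_x1 = fA @ msg_fB_to_x2

assert np.allclose(brute, marginal_x1)
print(marginal_x1)                   # unnormalized marginal of x1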
Probabilistic Inference Using Markov Chain Monte Carlo Methods
1993
"... Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. R ..."
Cited by 736 (24 self)
... from data, and Bayesian learning for neural networks.
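As a minimal illustration of the Monte Carlo machinery this review surveys (not code from the review), a random-walk Metropolis sampler draws from a target density known only up to a normalizing constant; the toy two-bump target, step size, and sample count below are arbitrary assumptions:

import math, random

def unnormalized_target(x):
    """Toy target density: two Gaussian bumps, known only up to a constant."""
    return math.exp(-0.5 * (x - 2.0) ** 2) + 0.5 * math.exp(-0.5 * (x + 2.0) ** 2)

def metropolis(n_samples=10000, step=1.0, x=0.0, seed=0):
    """Random-walk Metropolis: propose, then accept with probability min(1, ratio)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        ratio = unnormalized_target(proposal) / unnormalized_target(x)
        if rng.random() < min(1.0, ratio):
            x = proposal                 # accept; otherwise stay at the current point
        samples.append(x)
    return samples

draws = metropolis()
print(sum(draws) / len(draws))           # crude Monte Carlo estimate of the mean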
Loopy belief propagation for approximate inference: An empirical study
- In Proceedings of Uncertainty in AI, 1999
"... Recently, researchers have demonstrated that "loopy belief propagation" (the use of Pearl's polytree algorithm in a Bayesian network with loops) can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performanc ..."
Cited by 676 (15 self)
Data Mining: Concepts and Techniques
2000
"... Our capabilities of both generating and collecting data have been increasing rapidly in the last several decades. Contributing factors include the widespread use of bar codes for most commercial products, the computerization of many business, scientific and government transactions and managements, a ..."
Cited by 3142 (23 self)
... warehouses, and other massive information repositories. Data mining is a multidisciplinary field, drawing work from areas including database technology, artificial intelligence, machine learning, neural networks, statistics, pattern recognition, knowledge based systems, knowledge acquisition, information ...
Turbo decoding as an instance of Pearl’s belief propagation algorithm
- IEEE Journal on Selected Areas in Communications, 1998
"... In this paper, we will describe the close connection between the now celebrated iterative turbo decoding algorithm of Berrou et al. and an algorithm that has been well known in the artificial intelligence community for a decade, but which is relatively unknown to information theorists: Pearl’s belief propagation algorithm. We shall see that if Pearl’s algorithm is applied to the “belief network” of a parallel concatenation of two or more codes, the turbo decoding algorithm immediately results. Unfortunately, however, this belief diagram has loops, and Pearl only proved that his ..."
Cited by 404 (16 self)
Learning Bayesian Networks is NP-Complete
1996
"... Algorithms for learning Bayesian networks from data have two components: a scoring metric and a search procedure. The scoring metric computes a score reflecting the goodness-of-fit of the structure to the data. The search procedure tries to identify network structures with high scores. Heckerman e ..."
Cited by 228 (8 self)
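The two components named in the abstract can be sketched in a few lines: a penalized maximum-likelihood score as the scoring metric and exhaustive search over node orderings as the (exponential-time) search procedure. This is only an illustration under made-up data; it is not the paper's reduction, and the score is a BIC-style stand-in rather than the Bayesian metric Heckerman et al. use:

import itertools, math
from collections import Counter

# Toy binary dataset over three variables; the values are arbitrary, for illustration.
data = [(0, 0, 0), (1, 1, 0), (1, 1, 1), (0, 0, 1), (1, 0, 1), (0, 1, 0)]

def family_score(child, parents, rows, penalty=0.5):
    """Scoring metric for one node: maximum log-likelihood minus a size penalty."""
    joint = Counter((tuple(r[p] for p in parents), r[child]) for r in rows)
    parent_counts = Counter(tuple(r[p] for p in parents) for r in rows)
    loglik = sum(n * math.log(n / parent_counts[pa]) for (pa, _), n in joint.items())
    return loglik - penalty * (2 ** len(parents)) * math.log(len(rows))

def search(rows, n_vars=3):
    """Search procedure: try every node ordering, pick the best parent sets under each."""
    best_structure, best_score = None, float("-inf")
    for order in itertools.permutations(range(n_vars)):
        structure, total = {}, 0.0
        for i, child in enumerate(order):
            candidates = [ps for k in range(i + 1)
                          for ps in itertools.combinations(order[:i], k)]
            parents = max(candidates, key=lambda ps: family_score(child, ps, rows))
            structure[child] = parents
            total += family_score(child, parents, rows)
        if total > best_score:
            best_structure, best_score = structure, total
    return best_structure

print(search(data))   # maps each variable index to its chosen parent set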
Hierarchical Bayesian Inference in the Visual Cortex
2002
"... this paper, we propose a Bayesian theory of hierarchical cortical computation based both on (a) the mathematical and computational ideas of computer vision and pattern theory and on (b) recent neurophysiological experimental evidence. We have proposed that Grenander's pattern theory coul ..."
Cited by 300 (2 self)
Tractable inference for complex stochastic processes
- In Proc. UAI, 1998
"... The monitoring and control of any dynamic system depends crucially on the ability to reason about its current status and its future trajectory. In the case of a stochastic system, these tasks typically involve the use of a belief state—a probability distribution over the state of the process at a given point in time. Unfortunately, the state spaces of complex processes are very large, making an explicit representation of a belief state intractable. Even in dynamic Bayesian networks (DBNs), where the process itself can be represented compactly, the representation of the belief state ..."
Cited by 302 (14 self)
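For a single, small-state process the belief state can be maintained exactly; the sketch below runs the standard predict/update filter for a two-state chain with noisy observations (the probabilities and observation sequence are made up, and this is exact filtering, not the paper's factored approximation):

import numpy as np

# Toy two-state process; transition/observation probabilities are arbitrary.
T = np.array([[0.9, 0.1],      # T[i, j] = P(next state j | current state i)
              [0.2, 0.8]])
O = np.array([[0.8, 0.2],      # O[i, k] = P(observation k | state i)
              [0.3, 0.7]])

belief = np.array([0.5, 0.5])  # belief state: P(current state | observations so far)

for obs in [0, 0, 1, 1, 1]:    # a made-up observation sequence
    predicted = belief @ T               # push the belief through the dynamics
    weighted = predicted * O[:, obs]     # weight by the likelihood of the new observation
    belief = weighted / weighted.sum()   # renormalize back to a distribution
    print(belief)

# With n binary state variables the exact belief state has 2**n entries,
# which is the blow-up motivating approximate, factored belief states.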