Results 1–10 of 9,801
Routing in a delay tolerant network
Proceedings of ACM SIGCOMM
, 2004
Cited by 612 (8 self)
Abstract
We formulate the delay-tolerant networking routing problem, where messages are to be moved end-to-end across a connectivity graph that is time-varying but whose dynamics may be known in advance. The problem has the added constraints of finite buffers at each node and the general property that no contemporaneous end-to-end path may ever exist. This situation limits the applicability of traditional routing approaches that tend to treat outages as failures and seek to find an existing end-to-end path. We propose a framework for evaluating routing algorithms in such environments. We then develop several algorithms and use simulations to compare their performance with respect to the amount of knowledge they require about network topology. We find that, as expected, the algorithms using the least knowledge tend to perform poorly. We also find that with limited additional knowledge, far less than complete global knowledge, efficient algorithms can be constructed for routing in such environments. To the best of our knowledge this is the first such investigation of routing issues in DTNs.
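The "dynamics known in advance" setting admits a simple time-respecting shortest-path formulation. The sketch below is a generic earliest-arrival search over a contact schedule, assuming instantaneous transmission and unlimited buffers; the names and data layout are illustrative, not the paper's actual framework code.

```python
import heapq

def earliest_arrival(contacts, src, dst, t0):
    """Earliest time a message sent from src at t0 can reach dst.

    contacts: list of (u, v, start, end) intervals during which link
    u -> v is up, assumed fully known in advance (a "contacts oracle").
    Returns None if no time-respecting path exists -- note that a path
    can exist even when no contemporaneous end-to-end path ever does.
    """
    best = {src: t0}
    pq = [(t0, src)]
    while pq:
        t, u = heapq.heappop(pq)
        if u == dst:
            return t
        if t > best.get(u, float("inf")):
            continue  # stale queue entry
        for a, b, start, end in contacts:
            if a != u:
                continue
            depart = max(t, start)      # wait for the contact to open
            if depart <= end and depart < best.get(b, float("inf")):
                best[b] = depart        # earliest known arrival at b
                heapq.heappush(pq, (depart, b))
    return None
```

A message can reach its destination by waiting at intermediate nodes for future contacts, which is exactly what defeats traditional outage-as-failure routing.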
Fusion, Propagation, and Structuring in Belief Networks
 ARTIFICIAL INTELLIGENCE
, 1986
Cited by 482 (8 self)
Abstract
Belief networks are directed acyclic graphs in which the nodes represent propositions (or variables), the arcs signify direct dependencies between the linked propositions, and the strengths of these dependencies are quantified by conditional probabilities. A network of this sort can be used to represent the generic knowledge of a domain expert, and it turns into a computational architecture if the links are used not merely for storing factual knowledge but also for directing and activating the data flow in the computations which manipulate this knowledge. The first part of the paper deals with the task of fusing and propagating the impacts of new information through the networks in such a way that, when equilibrium is reached, each proposition will be assigned a measure of belief consistent with the axioms of probability theory. It is shown that if the network is singly connected (e.g. tree-structured), then probabilities can be updated by local propagation in an isomorphic network of parallel and autonomous processors and that the impact of new information can be imparted to all propositions in time proportional to the longest path in the network. The second part of the paper deals with the problem of finding a tree-structured representation for a collection of probabilistically coupled propositions using auxiliary (dummy) variables, colloquially called "hidden causes." It is shown that if such a tree-structured representation exists, then it is possible to uniquely uncover the topology of the tree by observing pairwise dependencies among the available propositions (i.e., the leaves of the tree). The entire tree structure, including the strengths of all internal relationships, can be reconstructed in time proportional to n log n, where n is the number of leaves.
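The local propagation scheme the abstract describes can be illustrated on the smallest possible singly connected network. This is a generic sketch of predictive (pi) and diagnostic (lambda) message passing on a two-node chain A -> B with binary variables; the numbers are made up for illustration and this is not the paper's architecture.

```python
def normalize(v):
    """Scale a vector of unnormalized beliefs to sum to 1."""
    s = sum(v)
    return [x / s for x in v]

prior_A = [0.6, 0.4]                 # P(A): the pi message entering A
cpt_B_given_A = [[0.9, 0.1],         # P(B | A=0)
                 [0.2, 0.8]]         # P(B | A=1)

# Predictive support flowing A -> B (pi propagation): the belief in B
# before any evidence is the prior marginalized through the link.
belief_B = normalize([
    sum(prior_A[a] * cpt_B_given_A[a][b] for a in range(2))
    for b in range(2)
])

# Diagnostic support flowing B -> A (lambda propagation) after
# observing B = 1: lambda(a) = P(B=1 | A=a), combined locally with
# the prior to give a posterior consistent with probability axioms.
lam = [cpt_B_given_A[a][1] for a in range(2)]
posterior_A = normalize([prior_A[a] * lam[a] for a in range(2)])
```

On a tree, each node repeats exactly this local combination with messages from its neighbors, which is why updating finishes in time proportional to the longest path.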
The Landscape of Parallel Computing Research: A View from Berkeley
 TECHNICAL REPORT, UC BERKELEY
, 2006
Data Assimilation Using an Ensemble Kalman Filter Technique
, 1998
Cited by 411 (5 self)
Abstract
The possibility of performing data assimilation using the flow-dependent statistics calculated from an ensemble of short-range forecasts (a technique referred to as ensemble Kalman filtering) is examined in an idealized environment. Using a three-level, quasi-geostrophic, T21 model and simulated observations, experiments are performed in a perfect-model context. By using forward interpolation operators from the model state to the observations, the ensemble Kalman filter is able to utilize nonconventional observations. In order to
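The update step of an ensemble Kalman filter can be sketched in one dimension: the background error variance comes from the ensemble spread itself (the flow-dependent statistics the abstract refers to), and each member is nudged toward its own perturbed copy of the observation. This is a generic perturbed-observation sketch with illustrative names, assuming a direct scalar observation of the state, not the paper's three-level T21 configuration.

```python
import random

def enkf_update(ensemble, y_obs, obs_var, rng):
    """One perturbed-observation EnKF analysis step for a scalar state.

    ensemble: list of background states; y_obs: observed value;
    obs_var: observation error variance; rng: source of Gaussian noise.
    Returns the analysis ensemble.
    """
    n = len(ensemble)
    mean = sum(ensemble) / n
    # Flow-dependent background variance estimated from the ensemble:
    var_b = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var_b / (var_b + obs_var)   # scalar Kalman gain
    # Each member assimilates its own perturbed observation so the
    # analysis ensemble keeps the right spread on average.
    return [x + gain * (y_obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(0)
analysis = enkf_update([1.0, 2.0, 3.0], y_obs=2.5, obs_var=0.5, rng=rng)
```

A real implementation replaces the scalar state with a model state vector and maps it into observation space with a forward interpolation operator, which is what lets the filter use nonconventional observations.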
Portable High Performance Programming via ArchitectureCognizant DivideandConquer Algorithms
, 2000
Cited by 3 (0 self)
Abstract
(Table of contents excerpt) 1 Introduction; 1. Divide-and-Conquer and the Memory Hierarchy; 2. Overview of Architecture-Cognizant Divide-and-Conquer; 3. Overview of Napoleon
Stabilization Wedges: Solving the Climate Problem for the Next 50 Years with Current Technologies
Science, 2004
Cited by 280 (3 self)
Abstract
Humanity already possesses the fundamental scientific, technical, and industrial know-how to solve the carbon and climate problem for the next half-century. A portfolio of technologies now exists to meet the world’s energy needs over the next 50 years and limit atmospheric CO2 to a trajectory that avoids a doubling of the pre-industrial concentration. Every element in this portfolio has passed beyond the laboratory bench and demonstration project; many are already implemented somewhere at full industrial scale. Although no element is a credible candidate for doing the entire job (or even half the job) by itself, the portfolio as a whole is large enough that not every element has to be used. The debate in the current literature about stabilizing atmospheric CO2 at less than a doubling of the pre-industrial concentration has led to needless confusion about current options for mitigation. On one side, the Intergovernmental Panel on Climate Change (IPCC) has claimed that “technologies that exist in operation or pilot stage today” are sufficient to follow a less-than-doubling trajectory “over the next hundred years or more” [(1), p. 8]. On the other side, a recent review in Science asserts that the IPCC claim demonstrates “misperceptions of technological readiness” and calls for “revolutionary changes” in mitigation technology, such as fusion, space-based solar electricity, and artificial photosynthesis (2). We agree that fundamental research is vital to develop the revolutionary mitigation strategies needed in the second half of this century and beyond. But it is important not to become beguiled by the possibility of revolutionary technology. Humanity can solve the carbon and climate problem in the first half of this century simply by scaling up what we already know how to do.
Examples of Early Nonconventional
"... Chemists have a long-standing appreciation for the value of recorded information. Many of the early efforts to improve information-processing techniques were centered on chemical information problems. The preeminence of Chemical Abstracts as a secondary publication service was well established, but ..."
Abstract
chemical information. In the late 1950s the Office of Science Information Service of the National Science Foundation initiated a series of reports (Nonconventional technical information systems in current use, 1958–1966) describing some of these innovative information systems. As principal compiler
The Dilemma of Nonconventional Monetary Policy
, 2012
Abstract
The financial crisis of 2007 and its subsequent adverse effect on GDP deeply challenge the classical understanding of recessions provided by general equilibrium and RBC models. Indeed, while no recession preceded the large number of mortgage
Nonconventional ergodic averages
Abstract
We study the L2-convergence of two types of ergodic averages. The first is the average of a product of functions evaluated at return times along arithmetic progressions, such as the expressions appearing in Furstenberg’s proof of Szemerédi’s theorem. The second average is taken along cubes whose sizes tend to +∞. For each average, we show that it is sufficient to prove the convergence for special systems, the characteristic factors. We build these factors in a general way, independent of the type of the average. To each of these factors we associate a natural group of transformations and give them the structure of a nilmanifold. From the second convergence result we derive a combinatorial interpretation for the arithmetic structure inside a set of integers of positive upper density.
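In standard notation these two types of averages are usually written as follows (a sketch assuming a measure-preserving system $(X,\mathcal{X},\mu,T)$ and bounded measurable functions; convergence is in $L^2(\mu)$ as $N \to \infty$):

```latex
% Averages along arithmetic progressions (Furstenberg-type):
\frac{1}{N}\sum_{n=1}^{N} \prod_{j=1}^{k} f_j\!\left(T^{jn}x\right)

% Averages along k-dimensional cubes of side N:
\frac{1}{N^{k}}\sum_{n_1,\dots,n_k=1}^{N}\
\prod_{\substack{\epsilon\in\{0,1\}^{k}\\ \epsilon\neq(0,\dots,0)}}
f_{\epsilon}\!\left(T^{\epsilon_1 n_1+\cdots+\epsilon_k n_k}x\right)
```

The characteristic-factor strategy replaces the original system by a factor on which the limit behavior of each average is fully determined.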