Results 1–10 of 15
Adaptive multipath routing for dynamic traffic engineering
In Proceedings of IEEE GLOBECOM, 2003
Cited by 34 (5 self)
Abstract: This paper proposes Adaptive MultiPath routing (AMP) as a simple algorithm for dynamic traffic engineering within autonomous systems. In contrast to related multipath routing proposals, AMP does not employ a global perspective of the network in each node. It restricts the available information to a local scope, which opens the potential of reducing signaling overhead and memory consumption in routers. Having implemented AMP in ns-2, we compare the algorithm to standard routing strategies in a realistic simulation scenario. The results demonstrate the stability of AMP as well as the significant performance gains achieved.

I. INTRODUCTION AND RELATED WORK
Efficient routing algorithms have always been among the core building blocks of any packet-switching network. Whereas existing routing protocols are usually designed for
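The abstract states the local-scope idea without specifying the mechanism. As a rough illustration only (a hypothetical simplification, not the AMP algorithm itself), a node could split traffic over loop-free next hops and shift weight away from hops it locally observes to be congested:

```python
import random

class AmpNode:
    """Hypothetical sketch of local-scope multipath forwarding. This
    illustrates the idea in the abstract, not the actual AMP algorithm
    (which the abstract does not specify)."""

    def __init__(self, next_hops):
        # One weight per loop-free next hop toward some destination.
        self.weights = {nh: 1.0 for nh in next_hops}

    def pick_next_hop(self):
        # Split traffic over next hops in proportion to their weights.
        hops, w = zip(*self.weights.items())
        return random.choices(hops, weights=w, k=1)[0]

    def report_load(self, next_hop, load):
        # React only to locally observed congestion (load in [0, 1]):
        # no global signaling, no network-wide state in the router.
        self.weights[next_hop] = max(0.1, self.weights[next_hop] * (1.0 - 0.5 * load))
        total = sum(self.weights.values())
        self.weights = {nh: w / total for nh, w in self.weights.items()}
```

Keeping only per-next-hop weights is what bounds the memory and signaling cost to the node's local neighborhood, the property the abstract emphasizes.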
Learning in real-time search: A unifying framework
Journal of Artificial Intelligence Research, 2006
Cited by 14 (0 self)
Real-time search methods are suited for tasks in which the agent interacts with an initially unknown environment in real time. In such simultaneous planning and learning problems, the agent has to select its actions in a limited amount of time, while sensing only a local part of the environment centered at the agent’s current location. Real-time heuristic search agents select actions using a limited lookahead search, evaluating the frontier states with a heuristic function. Over repeated experiences, they refine the heuristic values of states to avoid infinite loops and to converge to better solutions. The wide spread of such settings in autonomous software and hardware agents has led to an explosion of real-time search algorithms over the last two decades. Not only is a potential user confronted with a hodgepodge of algorithms, but he also faces a choice among the control parameters they use. In this paper we address both problems. The first contribution is the introduction of a simple three-parameter framework (named LRTS) which extracts the core ideas behind many existing algorithms. We then prove that the LRTA*, ε-LRTA*, SLA*, and γ-Trap algorithms are special cases of our framework. Thus, they are unified and extended with additional features. Second, we prove completeness and convergence of any algorithm covered by the LRTS framework. Third, we prove several upper bounds relating the control parameters to solution quality. Finally, we analyze the influence of the three control parameters empirically in realistic scalable domains: real-time navigation on initially unknown maps from a commercial role-playing game, and routing in ad hoc sensor networks.
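LRTA*, the best-known special case subsumed by LRTS, fits in a few lines. This is the classic lookahead-1 version (Korf's update rule), sketched here for orientation; it is not the LRTS framework itself:

```python
def lrta_star_step(state, neighbors, cost, h):
    """One move of LRTA* with lookahead 1. `neighbors(s)` yields
    successor states, `cost(s, s2)` is the edge cost, and `h` is a
    dict of heuristic values refined in place across trials."""
    succs = list(neighbors(state))
    # Evaluate the frontier with the heuristic and pick the best move.
    best = min(succs, key=lambda s2: cost(state, s2) + h[s2])
    # Learning update: raising h(state) to the best lookahead value is
    # what prevents infinite loops and improves solutions over trials.
    h[state] = max(h[state], cost(state, best) + h[best])
    return best
```

Repeatedly calling `lrta_star_step` until the goal is reached, across many trials on the same map, makes the stored `h` values converge; LRTS generalizes exactly this loop with its three control parameters.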
The Complexity of Ranking Hypotheses in Optimality Theory
COMPUTATIONAL LINGUISTICS, 2007
Cited by 4 (3 self)
Given a constraint set with k constraints in the framework of Optimality Theory (OT), what is its capacity as a classification scheme for linguistic data? One useful measure of this capacity is the size of the largest data set of which each subset is consistent with a different grammar hypothesis. This measure is known as the Vapnik-Chervonenkis dimension (VCD) and is a standard complexity measure for concept classes in computational learnability theory. In this work, I use the three-valued logic of Elementary Ranking Conditions to show that the VCD of Optimality Theory with k constraints is k − 1. Analysis of OT in terms of the VCD establishes that the complexity of OT is a well-behaved function of k and that the ‘hardness’ of learning in OT is linear in k for a variety of frameworks that employ probabilistic definitions of learnability.
Geometric Shortest Path Containers
2004
Cited by 2 (1 self)
In this paper, we consider Dijkstra's algorithm for the single-source, single-target shortest path problem in large sparse graphs. The goal is to reduce the response time for online queries by using precomputed information. Due to the size of the graph, preprocessing space requirements can only be linear in the number of nodes. We assume that a layout of the graph is given. In the preprocessing, we determine from this layout a geometric object for each edge containing all nodes that can be reached by a shortest path starting with that edge.
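The preprocessing described above can be sketched as follows, using axis-aligned bounding boxes as the geometric object (the paper considers several container shapes; a bounding box is one simple choice, and the helper names here are mine):

```python
import heapq
from collections import defaultdict

def dijkstra(graph, src):
    """Plain Dijkstra on graph[u] = [(v, w), ...]. Besides distances,
    records for every reached node t the first edge on a shortest
    path from src to t."""
    dist = {src: 0.0}
    first = {}                      # t -> first edge used to reach t
    pq = [(0.0, src, None)]
    while pq:
        d, u, fe = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                # stale queue entry
        if fe is not None:
            first[u] = fe
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v, fe if fe is not None else (u, v)))
    return dist, first

def edge_containers(graph, pos):
    """Preprocessing: for each edge, the bounding box
    [min_x, min_y, max_x, max_y] of all nodes whose shortest path from
    the edge's tail starts with that edge. At query time, edges whose
    box excludes the target can be skipped during the Dijkstra search."""
    boxes = defaultdict(lambda: [float("inf"), float("inf"),
                                 float("-inf"), float("-inf")])
    for s in graph:
        _, first = dijkstra(graph, s)
        for t, e in first.items():
            x, y = pos[t]
            b = boxes[e]
            b[0], b[1] = min(b[0], x), min(b[1], y)
            b[2], b[3] = max(b[2], x), max(b[3], y)
    return dict(boxes)
```

Note the space trade-off the abstract mentions: one box per edge is linear in the graph size, whereas storing full shortest-path trees per node would not be.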
A Genetic Algorithms Based Approach for Group Multicast Routing
Cited by 1 (0 self)
Abstract: Whereas multicast transmission in one-to-many communications allows the operator to drastically save network resources, it also makes the routing of the traffic flows more complex than in unicast transmissions. A huge number of possible trees have to be considered and analyzed to find the appropriate routing paths. To address this problem, we propose the use of genetic algorithms (GA), which considerably reduce the number of solutions to be evaluated. A heuristic procedure is first used to discern a set of possible trees for each multicast session in isolation. Then, the GA is applied to find the appropriate combination of trees to comply with the bandwidth needs of the group of multicast sessions simultaneously. The goodness of each solution is assessed by means of an expression that weights both network bandwidth allocation and one-way delay. The resulting cost function is guided by a few parameters that can be easily tuned during traffic engineering operations; an appropriate setting of these parameters allows the operator to configure the desired balance between network resource utilization and provided quality of service. Simulations have been performed to compare the proposed algorithm with alternative solutions in terms of bandwidth utilization and transmission delay.

Index Terms: Group multicast routing; Multicast services; Genetic Algorithms.
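The weighted cost function described in the abstract could take a form like the following. The parameter names, the tree representation, and the exact expression are hypothetical, for illustration only; the paper's formula may differ:

```python
def fitness(trees, alpha=0.5, beta=0.5):
    """Hypothetical cost for one GA individual (a combination of one
    tree per multicast session). `alpha` and `beta` stand in for the
    tunable traffic-engineering parameters mentioned in the abstract.
    Each tree is a dict with:
      'bandwidth' - total link bandwidth the tree allocates
      'delay'     - worst one-way delay from source to any member
    Lower cost = fitter individual."""
    total_bw = sum(t["bandwidth"] for t in trees)
    worst_delay = max(t["delay"] for t in trees)
    return alpha * total_bw + beta * worst_delay
```

Raising `alpha` biases the GA toward frugal bandwidth allocation; raising `beta` biases it toward low-delay trees, which is the resource-versus-QoS balance the abstract describes.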
Measuring Information Propagation and Retention in Boolean Networks and its …
Daniel A. Charlebois, Andre S. Ribeiro, Antti Lehmussola, Jason Lloyd-Price, Olli Yli-Harja, Stuart A. Kauffman
 TRANSACTIONS on BIOLOGY and BIOMEDICINE, ISSN 1109-9518, Issue 2, Volume 4, February 2007
Cited by 1 (0 self)
Abstract: A system structure, i.e., how elements of a system are connected, is a key factor for information retention and transmission through its elements. From the system dynamics, i.e., the states of the elements over time, we measure the system’s ability to propagate information through its elements as the pairwise mutual information (pMI) between the elements at moments t and t + L, where L is the minimum path length between the two elements. Information retention is measured with Lempel-Ziv (LZ) complexity, a measure of the complexity of the transmitted information, from the same time series of states. We propose a combined measure of information propagation and of the ability to retain information efficiently, to determine optimal structures for information propagation and retention. We present results on information propagation and retention as a function of topology (random and small-world structures), connectivity, noise, and clustering coefficient. The conclusions are applicable in any context where these networks are used to model the system. Here, we apply our findings to a model of human organizations and then propose a generalization of the model to capture more realistic features, such as more complex internal states for elements and simulated information exchange with the environment outside of the system. As more features are incorporated, this model will capture many important features of human organizations and other complex systems.
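The pMI quantity described above can be estimated from two aligned state sequences. A minimal sketch, assuming binary states, base-2 logs, and a plain plug-in probability estimate (the estimator details are my assumptions, not the paper's):

```python
import math
from collections import Counter

def pairwise_mi(series_i, series_j, lag):
    """Plug-in estimate (in bits) of the mutual information between
    element i's state at time t and element j's state at time t + lag.
    In the setting above, `lag` would be the minimum path length L
    between the two elements; here it is any positive integer."""
    pairs = list(zip(series_i[:-lag], series_j[lag:]))
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # marginal counts for i
    py = Counter(y for _, y in pairs)     # marginal counts for j
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), written in count form
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi
```

A perfectly lag-correlated pair of sequences gives a pMI equal to the entropy of the source sequence, while an unrelated constant sequence gives zero, which matches the intuition of the propagation measure.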
A multi-agent urban traffic simulation, Part II: dealing with the extraordinary
Cited by 1 (1 self)
characterized by two quantities: the magnitude (or severity) of the adverse consequences that can potentially result from the given activity or action, and the likelihood of occurrence of those adverse consequences. But a risk seldom exists in isolation: chains of consequences must be examined, as the outcome of one risk can increase the likelihood of other risks. Systemic theory must complement classic PRM. Indeed, these chains are composed of many different elements, all of which may have a critical importance at many different levels. Furthermore, when urban catastrophes are envisioned, space and time constraints are key determinants of the workings and dynamics of these chains of catastrophes: models must include a correct spatial topology of the studied risk. Finally, the literature insists on the importance that small events can have on risk at a greater scale: urban risk management models belong to self-organized criticality theory. We chose multi-agent systems to incorporate this property into our model: the behavior of a single agent can transform the dynamics of large groups of agents.

Index Terms: Risk management; self-organized criticality; multi-agent systems; modeling; simulation.
Violation Semirings in Optimality Theory
Cited by 1 (1 self)
This paper provides a brief algebraic characterization of constraint violations in Optimality Theory (OT). I show that if violations are taken to be multisets over a fixed basis set Con, then the merge operator on multisets and a ‘min’ operation expressed in terms of harmonic inequality provide a semiring over violation profiles. This semiring allows standard optimization algorithms to be used for OT grammars with weighted finite-state constraints in which the weights are violation multisets. Most usefully, because multisets are unordered, the merge operation is commutative, and thus it is possible to give a single graph representation of the entire class of grammars (i.e., rankings) for a given constraint set. This allows a neat factorization of the optimization problem that isolates the main source of complexity into a single constant γ denoting the size of the graph representation of the whole constraint set. I show that the computational cost of optimization is linear in the length of the underlying form with multiplicative constant γ. This perspective thus makes it straightforward to evaluate the complexity of optimization for different constraint sets.
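A minimal sketch of the two operations, with Python Counters standing in for violation multisets. The function names and the comparison loop are my reading of the abstract, not the paper's definitions:

```python
from collections import Counter

# A violation profile is a multiset over constraint names, here a Counter.
# Sketch of the two semiring operations as described in the abstract:
#   multiplication = multiset merge (accumulate violations along a path)
#   addition       = 'min' by harmonic inequality under a constraint ranking

def merge(p, q):
    # Counter addition is exactly multiset merge, and it is commutative,
    # which is the property the abstract exploits.
    return p + q

def harmonic_min(p, q, ranking):
    """Return the more harmonic of two profiles under `ranking`
    (highest-ranked constraint first): the one with fewer violations
    on the highest-ranked constraint where they differ."""
    for con in ranking:
        if p[con] != q[con]:
            return p if p[con] < q[con] else q
    return p  # identical profiles
```

Because `merge` is ranking-independent and commutative, the merged profiles of all candidates can be computed once on a single graph, with `harmonic_min` applied only when a particular ranking is fixed, which is the factorization the abstract describes.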
The Evolution of File Carving
 Anandabrata Pal and Nasir Memon
"... [The benefits and problems of forensics recovery] ..."