Results 11–20 of 63
Conjunctive search for one and two identical targets
 JOURNAL OF EXPERIMENTAL PSYCHOLOGY: HUMAN PERCEPTION AND PERFORMANCE
, 1989
Abstract

Cited by 13 (0 self)
The assumptions of feature integration theory as a blind, serial, self-terminating search (SSTS) mechanism are extended to displays containing 2 identical targets. The SSTS predicts no differences in negative-response displays, which require an exhaustive search of the display. Quantitative predictions are confirmed for the positive responses, but not for the negatives, suggesting that the SSTS model is incorrect. Two possible explanations for the results in the negative conditions, differential search rates and early quitting in the negatives, are rejected. It is suggested that using any self-terminating search mechanism will lead to difficulty in interpreting the results, including accounts for which the search is parallel over small groups of items. A resource-limited parallel model, which is based on the diffusion model of Ratcliff (1978), appears to fit the data well.
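As a rough illustration (our own sketch, not code from the paper), the SSTS predictions are easy to reproduce by Monte Carlo: a blind, serial, self-terminating search checks display items in random order and stops at the first target, so the expected number of comparisons is about (N+1)/2 with one target, about (N+1)/3 with two identical targets, and exactly N on target-absent displays.

```python
import random

def ssts_comparisons(n_items, n_targets, trials=20000, rng=random):
    """Monte Carlo estimate of the number of comparisons made by a
    blind, serial, self-terminating search over a display of n_items
    containing n_targets identical targets."""
    total = 0
    for _ in range(trials):
        display = [True] * n_targets + [False] * (n_items - n_targets)
        rng.shuffle(display)
        checked = 0
        for item in display:
            checked += 1
            if item:            # self-terminating: stop at first target
                break
        total += checked        # absent displays scan every item
    return total / trials

# SSTS predictions for an 11-item display:
#   one target  -> about (11+1)/2 = 6 comparisons
#   two targets -> about (11+1)/3 = 4 comparisons
#   no target   -> exactly 11 comparisons (exhaustive)
```

The exhaustive negative search is why SSTS predicts no effect of target count on negative-response displays, which is the prediction the paper reports as disconfirmed.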
Measures of Distinctness for Random Partitions and Compositions of an Integer
, 1997
Abstract

Cited by 12 (3 self)
This paper is concerned with problems of the following type: given a random (under a suitable probability model) partition or composition, study quantitatively the measures of the degree of distinctness of its parts. (Accepted for publication in Advances in Applied Mathematics.)
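As a minimal illustration of "distinctness of parts" (a toy example of ours, not the paper's analysis): a uniformly random composition of n can be sampled by cutting each of the n−1 gaps between units independently with probability 1/2, after which the number of distinct part sizes is easy to tally.

```python
import random

def random_composition(n, rng=random):
    """Sample a uniform random composition of n by cutting each of the
    n-1 gaps between consecutive units independently with prob. 1/2."""
    parts, run = [], 1
    for _ in range(n - 1):
        if rng.random() < 0.5:
            parts.append(run)
            run = 1
        else:
            run += 1
    parts.append(run)
    return parts

def mean_distinct_parts(n, trials=5000, rng=random):
    """Average number of distinct part sizes over random compositions."""
    return sum(len(set(random_composition(n, rng)))
               for _ in range(trials)) / trials
```

One such "measure of distinctness" is simply the count of distinct part sizes, which for compositions grows roughly logarithmically in n.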
Improving Parallel-Disk Buffer Management using Randomized Writeback
 Proc. Int’l Conf. Parallel Processing
, 1998
Abstract

Cited by 11 (3 self)
We address the problems of I/O scheduling and buffer management for general reference strings in a parallel I/O system. Using the standard parallel disk model with D disks and a shared I/O buffer of size M, we study the performance of online algorithms that use bounded global M-block lookahead. We introduce the concept of writeback, whereby blocks are dynamically relocated between disks during the course of the computation. Writeback allows the layout to be altered to suit different access patterns in different parts of the reference string. We show that any bounded-lookahead online algorithm that uses purely deterministic policies must have a competitive ratio of Ω(D). We show how to improve the performance by using randomization, and present a novel algorithm, RAND-WB, using a randomized writeback scheme. RAND-WB has a competitive ratio of Θ(√D), which is the best achievable by any online algorithm with only global M-block lookahead. If the initial layout of data on the disks is uniformly random, RAND-WB has a competitive ratio of Θ(log D).
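A toy sketch of why randomized relocation helps (our own simplification, not the paper's RAND-WB algorithm): with a fixed block-to-disk mapping, an adversarial reference string can serialize all I/O on a single disk, while writing the same blocks back to uniformly random disks spreads the load nearly evenly.

```python
import random

def flush_round(blocks, n_disks, randomized, rng=random):
    """Toy model of one write-back round: each dirty block is sent
    either to its fixed home disk (block id mod n_disks) or, under
    randomized writeback, to a uniformly random disk. Returns per-disk
    queue lengths; the round costs max(queue) parallel I/O steps."""
    queues = [0] * n_disks
    for b in blocks:
        disk = rng.randrange(n_disks) if randomized else b % n_disks
        queues[disk] += 1
    return queues

# Adversarial layout: every block's home disk is disk 0.
D = 8
blocks = [i * D for i in range(64)]                     # all 0 (mod D)
fixed = max(flush_round(blocks, D, randomized=False))   # 64: fully serial
rand_ = max(flush_round(blocks, D, randomized=True))    # close to 64 / D
```

The deterministic scheme pays 64 sequential I/Os on one disk here, while the randomized round finishes in roughly 64/D steps, mirroring the gap between the Ω(D) lower bound and the randomized upper bounds in the abstract.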
Generalizations of Polya’s urn problem
 Annals of Combinatorics
, 2003
Abstract

Cited by 11 (1 self)
Abstract. We consider generalizations of the classical Polya urn problem: Given finitely many bins each containing one ball, suppose that additional balls arrive one at a time. For each new ball, with probability p, create a new bin and place the ball in that bin; with probability 1 − p, place the ball in an existing bin, such that the probability the ball is placed in a bin is proportional to m^γ, where m is the number of balls in that bin. For p = 0, the number of bins is fixed and finite, and the behavior of the process depends on whether γ is greater than, equal to, or less than 1. We survey the known results and give new proofs for all three cases. We then consider the case p > 0. When γ = 1, this is equivalent to the so-called preferential attachment scheme, which leads to a power-law distribution for bin sizes. When γ > 1, we prove that a single bin dominates, i.e., as the number of balls goes to infinity, the probability converges to 1 that any new ball either goes into that bin or creates a new bin. When p > 0 and γ < 1, we show that, under the assumption that certain limits exist, the fraction of bins having m balls shrinks exponentially as a function of m. We then discuss further generalizations and pose several open problems.
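The process as stated is easy to simulate directly; a minimal sketch (the parameter values below are our own choices for illustration):

```python
import random

def generalized_urn(n_balls, p, gamma, rng=random):
    """Simulate the generalized Polya urn from the abstract: each new
    ball opens a new bin with probability p; otherwise it joins bin i
    with probability proportional to m_i ** gamma, where m_i is the
    current number of balls in bin i."""
    bins = [1]                                   # one bin, one ball
    for _ in range(n_balls - 1):
        if rng.random() < p:
            bins.append(1)                       # new bin for this ball
        else:
            i = rng.choices(range(len(bins)),
                            weights=[m ** gamma for m in bins])[0]
            bins[i] += 1
    return bins

# For gamma > 1 with p > 0, a single dominant bin tends to emerge,
# consistent with the gamma > 1 result stated in the abstract.
sizes = sorted(generalized_urn(2000, p=0.05, gamma=1.5), reverse=True)
```

Setting p = 0 and γ = 1 recovers the classical Polya urn on a fixed set of bins, and p > 0 with γ = 1 gives the preferential attachment scheme the abstract mentions.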
Marriage, honesty, and stability
 In Proceedings of the Sixteenth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)
, 2005
Abstract

Cited by 11 (3 self)
Many centralized two-sided markets form a matching between participants by running a stable marriage algorithm. It is a well-known fact that no matching mechanism based on a stable marriage algorithm can guarantee truthfulness as a dominant strategy for participants. However, as we will show in this paper, in a probabilistic setting where the preference lists of one side of the market are composed of only a constant (independent of the size of the market) number of entries, each drawn from an arbitrary distribution, the number of participants that have more than one stable partner is vanishingly small. This proves (and generalizes) a conjecture of Roth and Peranson [23]. As a corollary of this result, we show that, with high probability, the truthful strategy is the best response for a given player when the other players are truthful. We also analyze equilibria of the deferred acceptance stable marriage game. We show that the game with complete information has an equilibrium in which a fraction of the strategies are truthful in expectation. In the more realistic setting of a game of incomplete information, we will show that the set of truthful strategies form a
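For reference, the deferred acceptance (Gale-Shapley) mechanism that this line of work builds on can be written in a few lines; the men-proposing version below is a standard textbook implementation, not code from the paper.

```python
from collections import deque

def deferred_acceptance(men_prefs, women_prefs):
    """Men-proposing Gale-Shapley. Preference dicts map each person to
    an ordered list of partners; returns a stable matching woman -> man."""
    rank = {w: {m: r for r, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    next_choice = {m: 0 for m in men_prefs}   # next list index to propose to
    engaged = {}                              # woman -> currently held man
    free = deque(men_prefs)
    while free:
        m = free.popleft()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m                    # w tentatively accepts
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])           # w trades up; old partner freed
            engaged[w] = m
        else:
            free.append(m)                    # w rejects m; he proposes again
    return engaged

men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["a", "b"], "y": ["b", "a"]}
# Here every participant obtains a first choice: {"x": "a", "y": "b"}
```

The non-truthfulness discussed in the abstract arises because the receiving side can sometimes profit by misreporting its preference lists to this mechanism.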
Decentralized algorithms using both local and random probes for p2p load balancing
 In Seventeenth ACM Symposium on Parallelism in Algorithms and Architectures (SPAA)
, 2005
Abstract

Cited by 11 (0 self)
We study randomized algorithms for placing a sequence of n nodes on a circle with unit perimeter. Nodes divide the circle into disjoint arcs. We desire that a newly arrived node (which is oblivious of its index in the sequence) choose its position on the circle by learning the positions of as few existing nodes as possible. At the same time, we desire that the variation in arc-lengths be small. To this end, we propose a new algorithm that works as follows: The k-th node chooses r random points on the circle, inspects the sizes of v arcs in the vicinity of each random point, and places itself at the midpoint of the largest arc encountered. We show that for any combination of r and v satisfying rv ≥ c log k, where c is a small constant, the ratio of the largest to the smallest arc-length is at most eight w.h.p., for an arbitrarily long
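A sketch of the placement step as we read it, with "v arcs in the vicinity" interpreted as the v consecutive arcs starting from each probed point (the paper may define the neighborhood differently):

```python
import bisect
import random

def place_node(nodes, r, v, rng=random):
    """One placement step of the probe-based algorithm: sample r random
    points on the unit circle; from the arc containing each point,
    inspect v consecutive arcs; return the midpoint of the largest arc
    seen. `nodes` is a sorted list of positions on a unit-perimeter
    circle."""
    n = len(nodes)
    best_len, best_mid = -1.0, None
    for _ in range(r):
        start = (bisect.bisect_left(nodes, rng.random()) - 1) % n
        for k in range(v):
            j = (start + k) % n
            a, b = nodes[j], nodes[(j + 1) % n]
            length = (b - a) % 1.0 or 1.0   # single node: whole circle
            if length > best_len:
                best_len = length
                best_mid = (a + length / 2) % 1.0
    return best_mid

# Grow a ring of 199 nodes and recompute the arcs it induces.
rng = random.Random(0)
nodes = [0.0]
for _ in range(198):
    nodes.append(place_node(nodes, r=2, v=4, rng=rng))
    nodes.sort()
arcs = [(nodes[(i + 1) % len(nodes)] - nodes[i]) % 1.0
        for i in range(len(nodes))]
```

Each arrival probes only r·v = 8 arcs here; the abstract's guarantee concerns the regime rv ≥ c log k, where the max-to-min arc-length ratio stays bounded w.h.p.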
A stochastic evolutionary model exhibiting power-law behaviour with an exponential cutoff
 in the Condensed Matter Archive, cond-mat/0209463
, 2005
Abstract

Cited by 8 (3 self)
Recently several authors have proposed stochastic evolutionary models for the growth of complex networks that give rise to power-law distributions. These models are based on the notion of preferential attachment leading to the “rich get richer” phenomenon. Despite the generality of the proposed stochastic models, there are still some unexplained phenomena, which may arise due to the limited size of networks such as protein and email networks. Such networks may in fact exhibit an exponential cutoff in the power-law scaling, although this cutoff may only be observable in the tail of the distribution for extremely large networks. We propose a modification of the basic stochastic evolutionary model, so that after, for example, a node is chosen preferentially, say according to the number of its in-links, there is a small probability that this node will be discarded. We show that as a result of this modification, by viewing the stochastic process in terms of an urn transfer model, we obtain a power-law distribution with an exponential cutoff. Unlike many other models, the current model can capture instances where the exponent of the distribution is less than or equal to two. As a proof of concept, we demonstrate the consistency of our model by analysing the protein yeast network, whose distribution is known to follow a power law with an exponential cutoff.
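A minimal toy version of the modified model (our own simplification of the urn-transfer formulation; the discard rule and parameter names below are assumptions, not the paper's exact process):

```python
import random

def grow(steps, q=0.02, rng=random):
    """Preferential attachment with discard: at each step a node is
    chosen with probability proportional to its in-degree; with small
    probability q the chosen node is discarded (its in-links removed),
    otherwise it gains one in-link. A new node with one in-link also
    arrives each step. Returns the surviving in-degree sequence."""
    indeg = [1]                       # node 0 starts with one in-link
    for _ in range(steps):
        i = rng.choices(range(len(indeg)), weights=indeg)[0]
        if rng.random() < q:
            indeg[i] = 0              # discard: node drops out
        else:
            indeg[i] += 1             # ordinary preferential step
        indeg.append(1)               # newly arriving node
    return [d for d in indeg if d > 0]

degrees = grow(3000, q=0.02)
```

The occasional discard prunes high-degree nodes, which is the mechanism the abstract credits for bending the pure power law into a power law with an exponential cutoff.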
Probabilistic Pursuits on the Grid
Abstract

Cited by 7 (0 self)
The results of this paper continue to hold when the lag Δ is not held constant, but is allowed to vary from one ant to the next. We could also allow for the chasing ant to be guided by an ant other than the one immediately ahead. To achieve the asymptotic results, we need only ensure that eventually the current ant is many generations removed from the first one. Also we need to have Δ ≥ 2 infinitely often at each stage of the walk. The results discussed in this paper can be generalized to three (or more) dimensional space. The probability of A_{n+1} moving along each axis will, in this case, be proportional to the projection of the vector A_n − A_{n+1} along this axis. Ants obeying the probabilistic pursuit model have the property of moving, on the average, in the same direction as a continuous pursuit. However, their speed is not constant, since it depends on the location of the chaser relative to the target. To overcome this problem, for purposes of approximating continuous pursuit, one might consider the following Euclidean probabilistic rule of pursuit:
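On the grid, the basic probabilistic pursuit rule described above can be written directly; a minimal sketch (the coordinate and step conventions are ours):

```python
import random

def pursue_step(chaser, target, rng=random):
    """One move of the probabilistic pursuit rule: step one grid unit
    along the x- or y-axis, picking each axis with probability
    proportional to the magnitude of that component of the vector
    from chaser to target."""
    dx = target[0] - chaser[0]
    dy = target[1] - chaser[1]
    if dx == 0 and dy == 0:
        return chaser                    # already caught up
    if rng.random() * (abs(dx) + abs(dy)) < abs(dx):
        return (chaser[0] + (1 if dx > 0 else -1), chaser[1])
    return (chaser[0], chaser[1] + (1 if dy > 0 else -1))

# Every step shrinks the Manhattan distance to a fixed target by one,
# so a distance-11 target is reached in exactly 11 steps.
pos, target = (0, 0), (7, -4)
for _ in range(11):
    pos = pursue_step(pos, target)
```

The expected displacement per step points from the chaser toward the target, which is the sense in which these ants move, on average, like a continuous pursuit.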
The evolution of connectionist networks
 in Artificial Intelligence and
, 1994
Abstract

Cited by 5 (0 self)
In this paper we present a study of the mixing time of a random walk on the largest component of a supercritical random graph, also known as the giant component. We identify local obstructions that slow down the random walk when the average degree d is at most √(ln n ln ln n), proving that the mixing time in this case is O((ln n/d)^2) asymptotically almost surely. As the average degree grows these become negligible, and it is the diameter of the largest component that takes over, yielding mixing time O(ln n / ln d). We proved these results during the 2003-04 academic year. Similar results, but for constant d, were later proved independently by I. Benjamini, G. Kozma and N. Wormald in [3].