Results 1–10 of 4,702,057
A scheduling model for reduced CPU energy
Annual Symposium on Foundations of Computer Science
, 1995
"... The energy usage of computer systems is becoming an important consideration, especially for battery-operated systems. Various methods for reducing energy consumption have been investigated, both at the circuit level and at the operating systems level. In this paper, we propose a simple model of job scheduling aimed at capturing some key aspects of energy minimization. In this model, each job is to be executed between its arrival time and deadline by a single processor with variable speed, under the assumption that energy usage per unit time, P, is a convex function of the processor speed s. We give ..."
Cited by 545 (3 self)
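The convexity assumption above is what makes speed scaling interesting: with P a convex function of s, running at a constant speed is never worse than varying the speed over the same interval. A minimal numeric sketch (with an illustrative P(s) = s², not a value from the paper):

```python
# Sketch: with convex power P(s) = s**alpha, executing a fixed amount of
# work before a deadline costs least energy at constant speed (Jensen's
# inequality). All parameters here are illustrative.

def energy(speeds, dt=1.0, alpha=2):
    """Energy of a schedule given per-slot speeds (work per slot = speed * dt)."""
    return sum(s ** alpha * dt for s in speeds)

work = 12.0    # total cycles to execute
slots = 4      # unit-length time slots before the deadline

constant = [work / slots] * slots    # speed 3 in every slot
variable = [6.0, 4.0, 1.0, 1.0]      # same total work, uneven speeds

assert sum(constant) == sum(variable) == work
print(energy(constant))   # 4 * 3^2 = 36.0
print(energy(variable))   # 36 + 16 + 1 + 1 = 54.0 (strictly more)
```

Any redistribution of the same work across the interval only raises the convex sum, which is why optimal schedules in such models run each job as slowly and evenly as deadlines permit.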
Modeling and performance analysis of BitTorrent-like peer-to-peer networks
In SIGCOMM
, 2004
"... In this paper, we develop simple models to study the performance of BitTorrent, a second generation peer-to-peer (P2P) application. We first present a simple fluid model and study the scalability, performance and efficiency of such a file-sharing mechanism. We then consider the built-in incentive mechanism ..."
Cited by 566 (2 self)
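A fluid model of the kind described tracks downloaders and seeds as continuous quantities. The sketch below integrates one commonly presented form of such a model with Euler steps; the exact equations and every parameter value are illustrative assumptions, not taken verbatim from the paper:

```python
# Sketch of a BitTorrent-style fluid model (illustrative form and
# parameters). x(t): number of downloaders, y(t): number of seeds.

lam   = 1.0   # peer arrival rate
theta = 0.02  # downloader abort rate
mu    = 0.25  # upload bandwidth per peer
c     = 0.5   # download bandwidth per peer
eta   = 0.8   # effectiveness of downloader-to-downloader uploads
gamma = 0.1   # seed departure rate

x, y, dt = 0.0, 0.0, 0.01
for _ in range(200_000):  # integrate to t = 2000 with Euler steps
    # aggregate completion rate: limited by download or upload capacity
    finish = min(c * x, mu * (eta * x + y))
    x += dt * (lam - theta * x - finish)
    y += dt * (finish - gamma * y)

print(round(x, 2), round(y, 2))  # approximate steady state
```

In the steady state, arrivals balance aborts plus completions, and seeds balance completions against departures, which is the kind of fixed point such models use to study scalability.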
A Critical Point For Random Graphs With A Given Degree Sequence
, 2000
"... Given a sequence of non-negative real numbers λ0, λ1, … which sum to 1, we consider random graphs having approximately λi·n vertices of degree i. Essentially, we show that if ∑i i(i − 2)λi > 0 then such graphs almost surely have a giant component, while if ∑i i(i − 2)λi < 0 then almost surely all components in such graphs are small. We can apply these results to G(n, p), G(n, M), and other well-known models of random graphs. There are also applications related to the chromatic number of sparse random graphs."
Cited by 497 (8 self)
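The criterion ∑i i(i − 2)λi above is a one-line computation over a degree distribution. A small sketch with illustrative distributions (the sign of Q is what determines sub- vs supercriticality):

```python
# Sketch: evaluating the criterion Q = sum_i i*(i-2)*lambda_i for a
# degree distribution, where lambda_i is the fraction of vertices of
# degree i. Example distributions are illustrative.

def criterion(lambdas):
    """Q > 0 suggests a giant component; Q < 0 suggests all small components."""
    assert abs(sum(lambdas) - 1.0) < 1e-9
    return sum(i * (i - 2) * l for i, l in enumerate(lambdas))

# Mostly degree-1 and degree-2 vertices: subcritical (Q < 0)
sparse = [0.0, 0.6, 0.4]
# A sizeable fraction of degree-3 vertices: supercritical (Q > 0)
dense = [0.0, 0.3, 0.3, 0.4]

print(criterion(sparse))  # 1*(-1)*0.6 + 2*0*0.4 = -0.6
print(criterion(dense))   # -0.3 + 0 + 3*1*0.4 = 0.9
```

Note that degree-2 vertices contribute nothing to Q (the factor i − 2 vanishes), so they neither create nor destroy a giant component in this criterion.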
Analysis, Modeling and Generation of Self-Similar VBR Video Traffic
, 1994
"... We present a detailed statistical analysis of a 2-hour-long empirical sample of VBR video. The sample was obtained by applying a simple intra-frame video compression code to an action movie. The main findings of our analysis are (1) the tail behavior of the marginal bandwidth distribution can be accurately described using "heavy-tailed" distributions (e.g., Pareto); (2) the autocorrelation of the VBR video sequence decays hyperbolically (equivalent to long-range dependence) and can be modeled using self-similar processes. We combine our findings in a new (non-Markovian) source model ..."
Cited by 542 (6 self)
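The Pareto distribution mentioned in finding (1) has a power-law tail whose index can be estimated from data. A sketch, with illustrative parameters, that samples a Pareto via the inverse CDF and recovers its tail index with the Hill estimator:

```python
# Sketch: sampling a Pareto ("heavy-tailed") distribution by inverse-CDF
# and estimating its tail index with the Hill estimator. The tail index
# alpha and sample sizes are illustrative, not fitted to video data.
import math
import random

random.seed(0)
alpha, xm = 1.5, 1.0  # Pareto(alpha, xm): P(X > x) = (xm / x) ** alpha
samples = [xm / random.random() ** (1.0 / alpha) for _ in range(100_000)]

# Hill estimator over the k largest order statistics
k = 5_000
tail = sorted(samples)[-k:]
hill = k / sum(math.log(x / tail[0]) for x in tail)
print(round(hill, 2))  # estimate should land near alpha = 1.5
```

The heavier the tail (smaller alpha), the more extreme bandwidth bursts the marginal distribution predicts, which is what makes such traffic hard to model with short-memory Markovian sources.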
A gentle tutorial on the EM algorithm and its application to parameter estimation for Gaussian mixture and hidden Markov models
, 1997
"... We describe the maximum-likelihood parameter estimation problem and how the Expectation-Maximization (EM) algorithm, in the abstract form in which it is often given in the literature, can be used for its solution. We then develop the EM parameter estimation procedure for two applications: 1) finding the parameters of a mixture of Gaussian densities, and 2) finding the parameters of a hidden Markov model ..."
Cited by 679 (4 self)
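Application 1, fitting a mixture of Gaussians, alternates an E-step (posterior responsibilities) with an M-step (weighted re-estimation). A self-contained sketch for a two-component 1-D mixture, with illustrative synthetic data and initialization:

```python
# Sketch: EM for a two-component 1-D Gaussian mixture. Data, initial
# guesses, and the iteration count are illustrative.
import math
import random

random.seed(1)
# Synthetic data from two well-separated Gaussians
data = [random.gauss(0.0, 1.0) for _ in range(500)] + \
       [random.gauss(5.0, 1.0) for _ in range(500)]

def pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

pi1, mu, var = 0.5, [-1.0, 6.0], [1.0, 1.0]  # initial guesses

for _ in range(50):
    # E-step: posterior responsibility of component 1 for each point
    r = []
    for x in data:
        a = pi1 * pdf(x, mu[0], var[0])
        b = (1 - pi1) * pdf(x, mu[1], var[1])
        r.append(a / (a + b))
    # M-step: re-estimate mixing weight, means, and variances
    n1 = sum(r)
    n2 = len(data) - n1
    pi1 = n1 / len(data)
    mu = [sum(ri * x for ri, x in zip(r, data)) / n1,
          sum((1 - ri) * x for ri, x in zip(r, data)) / n2]
    var = [sum(ri * (x - mu[0]) ** 2 for ri, x in zip(r, data)) / n1,
           sum((1 - ri) * (x - mu[1]) ** 2 for ri, x in zip(r, data)) / n2]

print(mu)  # means should recover roughly [0.0, 5.0]
```

Each iteration is guaranteed not to decrease the data likelihood, which is the central property the tutorial derives for the abstract EM algorithm.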
Measurement, Modeling, and Analysis of a Peer-to-Peer File-Sharing Workload
, 2003
"... Peer-to-peer (P2P) file sharing accounts for an astonishing volume of current Internet traffic. This paper probes deeply into modern P2P file sharing systems and the forces that drive them. By doing so, we seek to increase our understanding of P2P file sharing workloads and their implications for future multimedia workloads. Our research uses a three-tiered approach. First, we analyze a 200-day trace of over 20 terabytes of Kazaa P2P traffic collected at the University of Washington. Second, we develop a model of multimedia workloads that lets us isolate, vary, and explore the impact of key system ..."
Cited by 482 (7 self)
The Dantzig selector: statistical estimation when p is much larger than n
, 2005
"... In many important statistical applications, the number of variables or parameters p is much larger than the number of observations n. Suppose then that we have observations y = Ax + z, where x ∈ R^p is a parameter vector of interest, A is a data matrix with possibly far fewer rows than columns, n ≪ p, and the z_i's are i.i.d. N(0, σ²). Is it possible to estimate x reliably based on the noisy data y? To estimate x, we introduce a new estimator, the Dantzig selector, which is the solution to the ℓ1-regularization problem min over x̃ ∈ R^p of ‖x̃‖ℓ1 subject to ‖Aᵀr‖ℓ∞ ≤ (1 + t⁻¹) √(2 log p) · σ ..."
Cited by 854 (14 self)
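The constraint bounds the correlation of every column of A with the residual r = y − Ax̃, so the whole problem is a linear program. A sketch casting it as one and solving with SciPy; the dimensions, matrix, and threshold δ are toy values, not the paper's (1 + t⁻¹)√(2 log p)·σ:

```python
# Sketch: the Dantzig selector min ||x||_1 s.t. ||A.T (y - A x)||_inf <= delta
# as an LP, solved with scipy.optimize.linprog. Toy problem data.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 10, 30
A = rng.standard_normal((n, p))
x_true = np.zeros(p)
x_true[3] = 2.0            # a 1-sparse signal
y = A @ x_true             # noiseless observations, for simplicity
delta = 0.1                # illustrative residual-correlation threshold

# Variables z = (x, u); minimize sum(u) subject to -u <= x <= u and
# -delta <= A.T (y - A x) <= delta.
G = A.T @ A
I = np.eye(p)
c = np.concatenate([np.zeros(p), np.ones(p)])
A_ub = np.block([
    [ I, -I],                 #  x - u <= 0
    [-I, -I],                 # -x - u <= 0
    [-G, np.zeros((p, p))],   #  A.T y - G x <= delta
    [ G, np.zeros((p, p))],   #  G x - A.T y <= delta
])
b_ub = np.concatenate([np.zeros(p), np.zeros(p),
                       delta - A.T @ y, delta + A.T @ y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (2 * p))
x_hat = res.x[:p]
print(res.success, round(float(np.abs(x_hat).sum()), 3))
```

The explicit `bounds=(None, None)` matters: `linprog` defaults to non-negative variables, which would silently forbid negative coefficients in x.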
A Framework for Uplink Power Control in Cellular Radio Systems
IEEE Journal on Selected Areas in Communications
, 1996
"... In cellular wireless communication systems, transmitted power is regulated to provide each user an acceptable connection by limiting the interference caused by other users. Several models have been considered, including: (1) fixed base station assignment, where the assignment of users to base stations is fixed; (2) minimum power assignment, where a user is iteratively assigned to the base station at which its signal-to-interference ratio is highest; and (3) diversity reception, where a user's signal is combined from several or perhaps all base stations. For the above models, the uplink ..."
Cited by 631 (18 self)
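For model (1), fixed assignment, the natural iteration has each user scale its power to just meet a target signal-to-interference ratio given everyone else's current power. A sketch with illustrative gains, noise, and targets (when the targets are feasible, such iterations converge to the minimal feasible power vector):

```python
# Sketch: fixed-assignment power control iteration p <- I(p), where each
# user sets the power that exactly meets its target SIR gamma against
# current interference. Gains, noise, and targets are illustrative.
G = [[1.0, 0.1, 0.1],    # G[i][j]: gain from user j to user i's base
     [0.2, 1.0, 0.1],
     [0.1, 0.2, 1.0]]
gamma = [1.0, 1.0, 1.0]  # target SIRs
noise = [0.1, 0.1, 0.1]

p = [1.0, 1.0, 1.0]
for _ in range(200):
    p = [gamma[i] * (sum(G[i][j] * p[j] for j in range(3) if j != i)
                     + noise[i]) / G[i][i]
         for i in range(3)]

sir = [G[i][i] * p[i] / (sum(G[i][j] * p[j] for j in range(3) if j != i)
                         + noise[i]) for i in range(3)]
print([round(s, 3) for s in sir])  # -> [1.0, 1.0, 1.0]
```

With these cross-gains the update is a contraction, so every user's SIR settles exactly on its target while total transmitted power stays as low as feasibility allows.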
Bayesian Data Analysis
, 1995
"... I actually own a copy of Harold Jeffreys's Theory of Probability but have only read small bits of it, most recently over a decade ago to confirm that, indeed, Jeffreys was not too proud to use a classical chi-squared p-value when he wanted to check the misfit of a model to data (Gelman, Meng and Ste ..."
Cited by 2127 (60 self)
The hierarchy problem and new dimensions at a millimeter
, 2008
"... We propose a new framework for solving the hierarchy problem which does not rely on either supersymmetry or technicolor. In this framework, the gravitational and gauge interactions become united at the weak scale, which we take as the only fundamental short distance scale in nature. The observed weakness of gravity on distances ≳ 1 mm is due to the existence of n ≥ 2 new compact spatial dimensions large compared to the weak scale. The Planck scale M_Pl ∼ G_N^(−1/2) is not a fundamental scale; its enormity is simply a consequence of the large size of the new dimensions. While gravitons can freely ..."
Cited by 660 (5 self)
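The size of the new dimensions follows from matching the 4-dimensional Planck scale to the higher-dimensional one: M_Pl² ∼ M_*^(n+2) R^n, so R ∼ (M_Pl² / M_*^(n+2))^(1/n). A quick numeric check, taking M_* ∼ 1 TeV as an illustrative fundamental scale:

```python
# Sketch: size R of n extra dimensions from M_Pl^2 ~ M_*^(n+2) * R^n,
# converted from natural units with hbar*c. M_* = 1 TeV is illustrative.
hbar_c = 1.9733e-16   # GeV * m (conversion factor)
M_pl = 1.22e19        # GeV, four-dimensional Planck scale
M_star = 1.0e3        # GeV, assumed fundamental (weak) scale

for n in (1, 2, 3):
    R = hbar_c * (M_pl ** 2 / M_star ** (n + 2)) ** (1.0 / n)  # meters
    print(n, f"{R:.2e} m")
# n = 1 gives an absurd astronomical-scale R (already excluded), while
# n = 2 lands near a millimeter, matching the title's scale.
```

This is why the abstract restricts attention to n ≥ 2: only then is R small enough to have escaped direct tests of Newtonian gravity at the time.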