Results 1 – 10 of 222
and Resource Economics, U.C. Berkeley
"... necessarily represent the views of the Energy Commission, Cal/EPA, their employees, or the State of California. The Energy Commission, Cal/EPA, the State of California, their employees, contractors, and subcontractors make no warranty, express or implied, and assume no legal liability for the informa ..."
Abstract
Municipal Utility District (SMUD), and one anonymous reviewer for their useful comments and review. Funding for this project came from the California Climate Change Center at U.C. Berkeley. Preface: The Public Interest Energy Research (PIER) Program supports public interest energy research
at College Math: U. C. Berkeley’s Professional
, 2006
Abstract
Note: We thank the various members of the U.C. Berkeley Committee on Student Diversity and Academic Development, and particularly Caroline Kane, for advice and helpful comments
U.C. Berkeley Web Design Patterns Library
, 2006
Abstract
We thank Owen Otto for his contributions to this project. We particularly thank our
U.C. Berkeley Handout N21
, 2006
Abstract
The paging problem is the following. We have a slow memory with N distinct pages and a fast memory, also referred to as a cache, that can contain at most k pages, where k < N. An unknown sequence σ1, σ2, ... of page requests is received. If a requested page σ is in the fast memory at the time of the request, no cost is incurred. If the requested page is not in the cache, a page must be evicted from the cache in order to bring in page σ. In the last lecture, we saw several deterministic schemes for paging. We also introduced a family of deterministic algorithms, the marking algorithms, which achieve a competitive ratio of k. Finally, we showed a lower bound of k on the competitive ratio of any deterministic algorithm, thus establishing that the marking algorithms are optimal deterministic algorithms. In this lecture we consider the performance of randomized paging algorithms, in which the page evictions are not deterministic. With such an algorithm in hand, there are two types of adversaries against which we could test its performance:
• Oblivious adversaries, which know the code of the algorithm and, based on this information only, must commit to a request sequence.
• Adaptive adversaries, which not only know the code of the algorithm but, moreover, can see
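The deterministic marking algorithms above have a natural randomized variant: every requested page is marked, and on a miss a uniformly random *unmarked* page is evicted, with a new phase (all marks cleared) starting once every cached page is marked. The following is a minimal sketch of that scheme; the function name and the seeded `random.Random` are illustrative choices, not part of the lecture notes.

```python
import random

def randomized_marking(requests, k, seed=0):
    """Serve a page-request sequence with a cache of k slots using the
    randomized marking algorithm: every requested page is marked; on a
    miss with a full cache, a uniformly random UNMARKED page is evicted,
    and when all cached pages are marked a new phase begins (marks are
    cleared).  Returns the number of misses, i.e. the cost incurred."""
    rng = random.Random(seed)
    cache, marked = set(), set()
    misses = 0
    for page in requests:
        if page not in cache:
            misses += 1
            if len(cache) == k:
                if not cache - marked:      # every cached page is marked:
                    marked.clear()          # start a new phase
                victim = rng.choice(sorted(cache - marked))
                cache.remove(victim)
            cache.add(page)
        marked.add(page)                    # a requested page gets marked
    return misses
```

Against the classic bad input for deterministic paging, cycling through k + 1 pages, the random evictions make the adversary's expected gain per phase logarithmic rather than linear in k, which is the intuition behind the known O(log k) competitive ratio of randomized marking against oblivious adversaries.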
Global Metropolitan Studies, U.C. Berkeley
, 2009
Abstract
Millard-Ball & Schipper (4,746 words plus 11 figures and tables). Projections of energy use and greenhouse gas emissions for industrialized countries typically show continued growth in vehicle ownership, vehicle use and overall travel demand. This represents a continuation of trends from the 1970s through the early 2000s. This paper presents a descriptive analysis of cross-national passenger transport trends in six industrialized countries, providing evidence to suggest that these trends may have halted. By decomposing passenger transport energy use into activity, mode structure and energy intensity, we show that increases in total activity (passenger travel) have been the driving force behind increased energy use, offset somewhat by declining energy intensity. We show that total activity growth has halted relative to GDP in recent years in the six countries examined. If these trends continue, it is possible that an accelerated decline in the energy intensity of car travel, stagnation in total travel per capita, some shifts back to rail and bus modes, and at least somewhat less carbon per unit of energy could leave the absolute levels of emissions in 2020 or 2030 lower than today.
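The decomposition the abstract refers to can be written as an identity: total energy use equals total activity times mode share times modal energy intensity, summed over modes. A toy illustration with entirely made-up numbers (the activity figure, shares, and intensities below are hypothetical, not the paper's data):

```python
# Hypothetical illustration of the activity / mode structure / energy
# intensity decomposition.  All figures are invented for the example.
activity = 800e9  # total travel, passenger-km per year (hypothetical)

modes = {
    # mode: (share of passenger-km, energy intensity in MJ per passenger-km)
    "car":  (0.75, 2.0),
    "bus":  (0.10, 0.9),
    "rail": (0.10, 0.5),
    "air":  (0.05, 2.5),
}

# Total energy = sum over modes of activity * share * intensity.
total_energy_MJ = sum(activity * share * intensity
                      for share, intensity in modes.values())
print(f"{total_energy_MJ / 1e9:.0f} PJ")  # 1 PJ = 1e9 MJ
```

In this framework, the scenarios in the abstract correspond to slowing the growth of `activity` (stagnating travel), shifting `share` from car toward rail and bus, and lowering the `intensity` entries.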
U.C. Berkeley — CS278: Computational Complexity Handout N5
"... In this lecture we prove the Karp-Lipton theorem: if all NP problems have polynomial-size circuits, then the polynomial hierarchy collapses. The next result we wish to prove is that all approximate combinatorial counting problems can be solved within the polynomial hierarchy. Before introducing co ..."
Abstract
the circuits C^1_n, ..., C^n_n as follows:
• C^1_n, on input a formula ϕ over n variables, outputs 1 if and only if there is a satisfying assignment for ϕ where x1 = 1;
• C^i_n, on input a formula ϕ over n variables and bits b1, ..., b(i−1), outputs 1 if and only if there is a satisfying assignment for ϕ where x1 = b
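The circuits described above implement the standard search-to-decision self-reduction for SAT: querying a decision procedure on longer and longer fixed prefixes of the assignment recovers a satisfying assignment bit by bit. A minimal sketch, with a brute-force decider standing in for the hypothesized circuits (the function names and CNF encoding are mine, not the handout's):

```python
from itertools import product

def sat_with_prefix(clauses, n, prefix=()):
    """Decision oracle (stand-in for the circuits C^i_n in the text): does
    the CNF formula have a satisfying assignment extending `prefix`?
    Clauses are lists of nonzero ints; literal v means x_v = 1, -v means
    x_v = 0.  Brute force, so only suitable for tiny formulas."""
    for tail in product([0, 1], repeat=n - len(prefix)):
        a = prefix + tail
        if all(any((lit > 0) == bool(a[abs(lit) - 1]) for lit in clause)
               for clause in clauses):
            return True
    return False

def find_assignment(clauses, n):
    """Recover a satisfying assignment bit by bit using only the decision
    oracle, as the circuits in the text do: fix x_i = 1 exactly when some
    satisfying assignment extends the current prefix with x_i = 1."""
    if not sat_with_prefix(clauses, n):
        return None                      # formula is unsatisfiable
    prefix = ()
    for _ in range(n):
        bit = 1 if sat_with_prefix(clauses, n, prefix + (1,)) else 0
        prefix += (bit,)
    return prefix
```

For example, on (x1 ∨ x2) ∧ (¬x1 ∨ x3), encoded as `[[1, 2], [-1, 3]]`, the search fixes x1 = 1, then x2, then x3, returning a satisfying assignment using only yes/no queries.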
U.C. Berkeley — CS278: Computational Complexity Handout N1
, 2004
Abstract
This course assumes CS170, or equivalent, as a prerequisite. We will assume that the reader is familiar with the notions of algorithm and running time, as well as with basic notions of discrete math and probability. We will occasionally refer to Turing machines, especially in this lecture.

A main objective of theoretical computer science is to understand the amount of resources (time, memory, communication, randomness, ...) needed to solve computational problems that we care about. While the design and analysis of algorithms puts upper bounds on such amounts, computational complexity theory is mostly concerned with lower bounds; that is, we look for negative results showing that certain problems require a lot of time, memory, etc., to be solved. In particular, we are interested in infeasible problems, that is, computational problems that require impossibly large resources to be solved, even on instances of moderate size. It is very hard to show that a particular problem is infeasible, and in fact for a lot of interesting problems the question of their feasibility is still open.

Another major line of work in complexity is understanding the relations between different computational problems and between different “modes” of computation. For example: what is the relative power of randomized and deterministic algorithms, what is the relation between worst-case and average-case complexity, and how much easier does an optimization problem become if we only look for approximate solutions? It is in this direction that we find the most beautiful, and often surprising, known results in complexity theory.

Before going any further, let us be more precise about what a computational problem is, and let us define some important classes of computational problems. Then we will see a particular incarnation of the notion of “reduction,” the main tool in complexity theory, and we will introduce NP-completeness, one of the great success stories of complexity theory.
We conclude by demonstrating the use of diagonalization to show some separations between complexity classes. It is unlikely that such techniques will help solve the P versus NP problem.
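As a foretaste of the notion of reduction mentioned above, here is one of the simplest Karp reductions, from INDEPENDENT-SET to CLIQUE: a vertex set is independent in G exactly when it is a clique in the complement of G, so mapping the instance (G, k) to (complement of G, k) preserves yes/no answers. A small sketch on tiny graphs (the helper names are mine, not the handout's):

```python
from itertools import combinations

def reduce_is_to_clique(n, edges):
    """Karp reduction from INDEPENDENT-SET to CLIQUE: a set S is
    independent in G iff S is a clique in the complement of G, so the
    instance (G, k) maps to (complement(G), k) with k unchanged.
    Vertices are 0..n-1; edges are unordered pairs."""
    edge_set = {frozenset(e) for e in edges}
    return [(u, v) for u, v in combinations(range(n), 2)
            if frozenset((u, v)) not in edge_set]

def has_clique(n, edges, k):
    """Brute-force CLIQUE decider, for checking the reduction on tiny graphs."""
    edge_set = {frozenset(e) for e in edges}
    return any(all(frozenset(p) in edge_set for p in combinations(s, 2))
               for s in combinations(range(n), k))

# Example: G is the path 0-1-2-3.  {0, 2} is independent in G, so the
# complement has a clique of size 2; G has no independent set of size 3,
# so the complement has no clique of size 3.
```

The point of the reduction is its direction: if CLIQUE were solvable in polynomial time, composing the (polynomial-time) complement map with that solver would solve INDEPENDENT-SET in polynomial time as well.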
U.C. Berkeley — CS278: Computational Complexity Handout N1
, 2008
Abstract
This course assumes CS170, or equivalent, as a prerequisite. We will assume that the reader is familiar with the notions of algorithm and running time, as well as with basic notions of discrete math and probability. We will occasionally refer to Turing machines, especially in this lecture.

A main objective of theoretical computer science is to understand the amount of resources (time, memory, communication, randomness, ...) needed to solve computational problems that we care about. While the design and analysis of algorithms puts upper bounds on such amounts, computational complexity theory is mostly concerned with lower bounds; that is, we look for negative results showing that certain problems require a lot of time, memory, etc., to be solved. In particular, we are interested in infeasible problems, that is, computational problems that require impossibly large resources to be solved, even on instances of moderate size. It is very hard to show that a particular problem is infeasible, and in fact for a lot of interesting problems the question of their feasibility is still open.

Another major line of work in complexity is understanding the relations between different computational problems and between different “modes” of computation. For example: what is the relative power of randomized and deterministic algorithms, what is the relation between worst-case and average-case complexity, and how much easier does an optimization problem become if we only look for approximate solutions? It is in this direction that we find the most beautiful, and often surprising, known results in complexity theory.

Before going any further, let us be more precise about what a computational problem is, and let us define some important classes of computational problems. Then we will see a particular incarnation of the notion of “reduction,” the main tool in complexity theory, and we will introduce NP-completeness, one of the great success stories of complexity theory.
We conclude by demonstrating the use of diagonalization to show some separations between complexity classes. It is unlikely that such techniques will help solve the P versus NP problem.
The OASIS Group at U.C. Berkeley: Research Summary and Future Directions Document Scope
, 2003
Abstract

Cited by 1 (0 self)
This document is a forward-looking summary of the research and activities of the U.C. Berkeley OASIS group. Here we present a draft architecture for implementing application-specific "in the network" functionality within the coming generation of programmable network elements (PNEs).
The Parallel Computing Laboratory at U.C. Berkeley: A Research Agenda Based on the Berkeley View
, 2008
Abstract

Cited by 18 (4 self)
Copyright © 2008, by the author(s). All rights reserved. Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission.