Results 1–10 of 11
A large-scale study on predicting and contextualizing building energy usage
2011
Cited by 17 (0 self)
Abstract: In this paper we present a data-driven approach to modeling end-user energy consumption in residential and commercial buildings. Our model is based upon a data set of monthly electricity and gas bills, collected by a utility over the course of several years, for approximately 6,500 buildings in Cambridge, MA. In addition, we use publicly available tax assessor records and geographical survey information to determine corresponding features for the buildings. Using both parametric and nonparametric learning methods, we learn models that predict distributions over energy usage based upon these features, and use these models to develop two end-user systems. For utilities or authorized institutions (those who may obtain access to the full data) we provide a system that visualizes energy consumption for each building in the city; this allows companies to quickly identify outliers (buildings which use much more energy than expected, even after conditioning on the relevant predictors), for instance allowing them to target homes for potential retrofits or tiered pricing schemes. For other end users, we provide an interface for entering their own electricity and gas usage, along with basic information about their home, to determine how their consumption compares to that of similar buildings as predicted by our model. Merely allowing users to contextualize their consumption in this way, relating it to the consumption in similar buildings, can itself produce behavior changes that significantly reduce consumption.
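The outlier-flagging idea in this abstract (buildings using much more energy than a conditional model predicts) can be sketched with quantile regression. This is an illustrative reconstruction, not the paper's implementation: the features, data, and the use of scikit-learn's `GradientBoostingRegressor` are all assumptions for the sketch.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# Hypothetical building features (e.g. floor area, age) and annual usage.
X = rng.uniform(0, 1, size=(500, 2))
usage = 100 * X[:, 0] + 20 * X[:, 1] + rng.normal(0, 5, size=500)
usage[:10] += 60  # inject a few anomalously heavy consumers

# Model the conditional 95th percentile of usage given features.
q95 = GradientBoostingRegressor(loss="quantile", alpha=0.95, random_state=0)
q95.fit(X, usage)

# Flag buildings whose observed usage exceeds the predicted 95th percentile,
# i.e. outliers even after conditioning on the predictors.
outliers = usage > q95.predict(X)
```

Any building with `outliers[i] == True` would be a candidate for a retrofit or tiered-pricing follow-up in the paper's utility-facing system.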
Predictive entropy search for efficient global optimization of black-box functions
In Advances in Neural Information Processing Systems, 2014
Cited by 8 (4 self)
Abstract: We propose a novel information-theoretic approach to Bayesian optimization called Predictive Entropy Search (PES). At each iteration, PES selects the next evaluation point that maximizes the expected information gained with respect to the global maximum. PES codifies this intractable acquisition function in terms of the expected reduction in the differential entropy of the predictive distribution. This reformulation allows PES to obtain approximations that are both more accurate and more efficient than alternatives such as Entropy Search (ES). Furthermore, PES can easily perform a fully Bayesian treatment of the model hyperparameters, while ES cannot. We evaluate PES in both synthetic and real-world applications, including optimization problems in machine learning, finance, biotechnology, and robotics. We show that the increased accuracy of PES leads to significant gains in optimization performance.
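The reformulation this abstract describes can be written out explicitly. The notation below (D_n for the first n observations, x_* for the global maximizer) is mine, following standard Bayesian-optimization usage; both lines express the same mutual information between y and x_*:

```latex
% Information gain about the maximizer x_* (intractable form):
\alpha_n(x) = H\!\left[p(x_* \mid \mathcal{D}_n)\right]
  - \mathbb{E}_{p(y \mid \mathcal{D}_n, x)}\!\left[
      H\!\left[p(x_* \mid \mathcal{D}_n \cup \{(x, y)\})\right]\right]
% By symmetry of mutual information, PES rewrites this as the expected
% reduction in differential entropy of the predictive distribution:
\alpha_n(x) = H\!\left[p(y \mid \mathcal{D}_n, x)\right]
  - \mathbb{E}_{p(x_* \mid \mathcal{D}_n)}\!\left[
      H\!\left[p(y \mid \mathcal{D}_n, x, x_*)\right]\right]
```

The second form only requires entropies of one-dimensional predictive distributions, which is what makes the PES approximations more accurate and efficient than working with entropies over x_* directly.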
MLSP 2014 Schizophrenia Classification Challenge: Winning Model Documentation
Abstract: This technical note presents the idea and methods behind the winning solution for the MLSP 2014 Schizophrenia Classification Challenge organized on Kaggle. The challenge took place between June 5 and July 20, 2014, and 341 teams submitted solutions. The winning model, ‘Solution Draft’, was based on a Bayesian machine learning paradigm known as Gaussian process (GP) classification.

Summary: The goal of the competition [1] was to automatically diagnose subjects with schizophrenia based on multimodal features derived from their magnetic resonance imaging (MRI) brain scans. The winning solution was based on a Gaussian process (GP, [2]) classifier in which the observations are treated as draws from a Bernoulli distribution whose probability is related to the latent function via a sigmoid that maps it to the unit interval. A GP prior with a covariance function given by the sum of a constant, a linear, and a Matérn kernel was placed over the latent functions. The model was trained by sampling using the GPSTUFF toolbox [3].
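The model family described here (GP classifier, Bernoulli likelihood through a sigmoid link, constant + linear + Matérn sum kernel) can be sketched in scikit-learn. Note the assumptions: the winners used the GPSTUFF toolbox (MATLAB) with sampling-based inference, whereas `GaussianProcessClassifier` uses a Laplace approximation, and the data below is synthetic rather than MRI-derived.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import ConstantKernel, DotProduct, Matern

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))                    # stand-in for MRI-derived features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic binary diagnoses

# Sum kernel as described in the abstract: constant + linear + Matern.
kernel = ConstantKernel(1.0) + DotProduct(sigma_0=1.0) + Matern(length_scale=1.0, nu=2.5)

clf = GaussianProcessClassifier(kernel=kernel, random_state=0).fit(X, y)
proba = clf.predict_proba(X[:5])  # Bernoulli probabilities via the sigmoid link
```

The sum kernel lets the latent function combine a global offset, a linear trend in the features, and smooth nonlinear variation, which is often a sensible default for high-dimensional biomedical features.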
A Gaussian processes model for survival analysis with time-dependent covariates and interval censoring
Approximating Cross-validatory Predictive Evaluation in Bayesian Latent Variables Models with Integrated IS and WAIC
2014
Abstract: A natural method for approximating out-of-sample predictive evaluation is leave-one-out cross-validation (LOOCV): we alternately hold out each case from the full data set, train a Bayesian model using Markov chain Monte Carlo (MCMC) without the held-out case, and finally evaluate the posterior predictive distribution of all cases at their actual observations. However, actual LOOCV is time-consuming. This paper introduces two methods, iIS and iWAIC, for approximating LOOCV using only Markov chain samples simulated from a posterior based on the full data set. iIS and iWAIC aim to improve the approximations given by importance sampling (IS) and WAIC in Bayesian models with possibly correlated latent variables. In iIS and iWAIC, we first integrate the predictive density over the distribution of the latent variables associated with the held-out case, without reference to its observation, and then apply the IS and WAIC approximations to the integrated predictive density. We compare iIS and iWAIC with other approximation methods in three real-data examples that respectively use mixture models, models with correlated spatial effects, and a random-effects logistic model. Our empirical results show that iIS and iWAIC give substantially better approximations than non-integrated IS and WAIC and other methods.
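The baseline this paper improves on, plain importance sampling for LOOCV from full-data posterior draws, can be sketched for a toy conjugate model. The iIS refinement (integrating out the held-out case's latent variables first) needs a latent-variable model and is not shown; the model and all names below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy model: y_i ~ N(mu, 1) with a N(0, 10^2) prior on mu.
y = rng.normal(0.3, 1.0, size=40)
n = y.size
# Conjugate posterior for mu (known unit variance):
post_var = 1.0 / (n + 1.0 / 100.0)
post_mean = post_var * y.sum()
mu_s = rng.normal(post_mean, np.sqrt(post_var), size=4000)  # S full-data draws

def norm_pdf(x, m, s=1.0):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# IS approximation to each LOO predictive density using only full-data samples:
#   p(y_i | y_{-i}) ~= 1 / mean_s[ 1 / p(y_i | theta_s) ]
lik = norm_pdf(y[:, None], mu_s[None, :])   # per-case likelihoods, shape (n, S)
loo_is = 1.0 / np.mean(1.0 / lik, axis=1)   # IS-LOO predictive densities
elpd_is = np.log(loo_is).sum()              # summed log predictive density
```

The weights 1/p(y_i | theta_s) can have high variance when a case strongly influences its own latent variables, which is exactly the situation iIS and iWAIC target by integrating those latent variables out before applying IS or WAIC.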
Steps towards comprehensive Bayesian decision analysis in fisheries and environmental management
Faculty of Biological and Environmental Sciences
The effectiveness of physical activity monitoring and distance counseling in an occupational setting – Results from a randomized controlled trial (CoAct)
Research article, Open Access