Results 1–10 of 241
Simultaneous Tracking & Activity Recognition (STAR) Using Many Anonymous, Binary Sensors
, 2004
Abstract
Cited by 61 (1 self)
Automatic health monitoring helps enable independent living for the elderly by providing specific information to caregivers. This goal, called aging in place, is increasingly important as an unprecedented portion of the population enters old age. I introduce the simultaneous tracking and activity recognition (STAR) problem, whose solution provides this key information. I propose using data from many minimally invasive sensors commonly found in home security systems to provide simultaneous room-level tracking and recognition of many of the activities of daily living (ADLs). ADLs have been chosen by physicians to gauge the severity of cognitive and physical ailments. I describe a Rao-Blackwellised particle filter for room-level tracking, rudimentary activity recognition, and data association, as well as a Monte Carlo EM approach for online parameter learning. I demonstrate results from experiments in an instrumented home and on simulated data. Proposed extensions improve the approach and add more complex activity recognition. We discuss how to integrate a growing vocabulary of activities into the tracker.
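The tracking machinery this abstract sketches can be illustrated with a plain bootstrap particle filter over a discrete room graph. This is a simplification (the paper's Rao-Blackwellised filter marginalizes part of the state analytically), and the rooms, adjacency, and sensor probabilities below are invented:

```python
import random

# Hypothetical 4-room home; adjacency says which rooms connect.
ROOMS = ["kitchen", "living", "hall", "bedroom"]
ADJ = {"kitchen": ["living"], "living": ["kitchen", "hall"],
       "hall": ["living", "bedroom"], "bedroom": ["hall"]}

P_STAY = 0.7     # occupant stays in the same room each time step
P_DETECT = 0.95  # a room's motion sensor fires when the room is occupied
P_FALSE = 0.05   # a sensor fires spuriously

def propagate(room, rng):
    # Sample the motion model: stay put or move to an adjacent room.
    if rng.random() < P_STAY:
        return room
    return rng.choice(ADJ[room])

def likelihood(room, firing):
    # firing: set of room names whose binary sensors fired this step.
    w = 1.0
    for r in ROOMS:
        p = P_DETECT if r == room else P_FALSE
        w *= p if r in firing else (1.0 - p)
    return w

def track(observations, n=500, seed=0):
    rng = random.Random(seed)
    particles = [rng.choice(ROOMS) for _ in range(n)]
    for firing in observations:
        particles = [propagate(p, rng) for p in particles]
        weights = [likelihood(p, firing) for p in particles]
        particles = rng.choices(particles, weights=weights, k=n)  # resample
    return max(ROOMS, key=particles.count)  # posterior mode over rooms

# Sensor firings consistent with a kitchen -> living -> hall walk
obs = [{"kitchen"}, {"kitchen"}, {"living"}, {"hall"}, {"hall"}]
print(track(obs))  # prints "hall"
```

The real STAR problem additionally handles multiple occupants (data association) and learns the sensor and motion parameters online, which this sketch ignores.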
Comparative Studies Of Metamodeling Techniques Under Multiple Modeling Criteria
 Structural and Multidisciplinary Optimization
, 2000
Abstract
Cited by 52 (3 self)
Despite the advances in computer capacity, the enormous computational cost of complex engineering simulations makes it impractical to rely exclusively on simulation for the purpose of design optimization. To cut down the cost, surrogate models, also known as metamodels, are constructed from and then used in lieu of the actual simulation models. In the paper, we systematically compare four popular metamodeling techniques (Polynomial Regression, Multivariate Adaptive Regression Splines, Radial Basis Functions, and Kriging) based on multiple performance criteria using fourteen test problems representing different classes of problems. Our objective in this study is to investigate the advantages and disadvantages of these four metamodeling techniques using multiple modeling criteria and multiple test problems rather than a single measure of merit and a single test problem.
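A minimal sketch of this kind of comparison, assuming a cheap 1-D test function in place of an expensive simulation and covering just two of the four techniques (polynomial regression and radial basis functions):

```python
import numpy as np

def simulation(x):
    # Cheap 1-D stand-in for an expensive engineering simulation
    return np.sin(3.0 * x) + 0.5 * x

x_train = np.linspace(0.0, 4.0, 10)   # the "expensive" runs
y_train = simulation(x_train)
x_test = np.linspace(0.0, 4.0, 200)   # dense grid for error measurement
y_true = simulation(x_test)

# Metamodel 1: quadratic polynomial regression (least squares)
y_poly = np.polyval(np.polyfit(x_train, y_train, 2), x_test)

# Metamodel 2: Gaussian radial basis function interpolation
def rbf(a, b, gamma=2.0):
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

w = np.linalg.solve(rbf(x_train, x_train), y_train)
y_rbf = rbf(x_test, x_train) @ w

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

print("poly RMSE:", rmse(y_true, y_poly))
print("RBF  RMSE:", rmse(y_true, y_rbf))
```

On a multimodal function like this one the interpolating RBF model tracks the response far better than the quadratic; the paper's point is that such rankings depend on the problem class and the error criterion, which a single test like this cannot show.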
A Comparison Of Approximation Modeling Techniques: Polynomial Versus Interpolating Models
, 1998
Abstract
Cited by 44 (10 self)
Two methods of creating approximation models are compared through the calculation of the modeling accuracy on test problems involving one, five, and ten independent variables. Here, the test problems are representative of the modeling challenges typically encountered in realistic engineering optimization problems. The first approximation model is a quadratic polynomial created using the method of least squares. This type of polynomial model has seen considerable use in recent engineering optimization studies due to its computational simplicity and ease of use. However, quadratic polynomial models may be of limited accuracy when the response data to be modeled have multiple local extrema. The second approximation model employs an interpolation scheme known as kriging developed in the fields of spatial statistics and geostatistics. This class of interpolating model has the flexibility to model response data with multiple local extrema. However, this flexibility is obtained at an increase...
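The contrast this abstract draws can be seen in a few lines: a least-squares quadratic smooths data with multiple local extrema, while a simple-kriging predictor with an (arbitrarily chosen) Gaussian covariance interpolates it exactly. Real kriging also estimates the correlation parameters and a trend term, which is omitted here:

```python
import numpy as np

# Toy response data with multiple local extrema
x = np.linspace(0.0, 10.0, 12)
y = np.sin(x) + 0.1 * x

# Quadratic least-squares polynomial: smooths, cannot track the extrema
resid_poly = y - np.polyval(np.polyfit(x, y, 2), x)

# Simple kriging with a Gaussian covariance: the predictor
# m(x*) = k(x*)^T K^{-1} y passes through every training point
def cov(a, b, theta=2.0):
    return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

alpha = np.linalg.solve(cov(x, x) + 1e-8 * np.eye(x.size), y)

def krig(xq):
    return cov(np.atleast_1d(xq), x) @ alpha

resid_krig = y - krig(x)
print(np.abs(resid_poly).max(), np.abs(resid_krig).max())
```

The kriging residuals at the training points are numerically zero, while the quadratic leaves large residuals; the abstract's caveat is that this interpolation flexibility comes at extra computational cost.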
Flexibility and Efficiency Enhancements for Constrained Global Design Optimization with Kriging Approximations
, 2002
Numerical optimization using computer experiments
 Institute for Computer
, 1997
Abstract
Cited by 27 (9 self)
Engineering design optimization often gives rise to problems in which expensive objective functions are minimized by derivative-free methods. We propose a method for solving such problems that synthesizes ideas from the numerical optimization and computer experiment literatures. Our approach relies on kriging known function values to construct a sequence of surrogate models of the objective function that are used to guide a grid search for a minimizer. Results from numerical experiments on a standard test problem are presented.
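A toy version of such a surrogate-guided grid search, with a made-up 1-D objective and a hand-picked correlation parameter; the paper's actual procedure is more careful, and this greedy loop uses no infill criterion:

```python
import numpy as np

def expensive_f(x):
    # Stand-in objective; its global minimum is near x ~ 0.40
    return (x - 0.55) ** 2 + 0.1 * np.sin(12.0 * x)

THETA = 50.0  # Gaussian correlation parameter (chosen by hand)

def fit(xs, ys):
    # Interpolating surrogate: solve K alpha = y (small jitter for stability)
    K = np.exp(-THETA * (xs[:, None] - xs[None, :]) ** 2)
    return np.linalg.solve(K + 1e-6 * np.eye(xs.size), ys)

def predict(xq, xs, alpha):
    return np.exp(-THETA * (xq[:, None] - xs[None, :]) ** 2) @ alpha

grid = np.linspace(0.0, 1.0, 101)
xs = np.array([0.0, 0.5, 1.0])          # initial design
ys = expensive_f(xs)

for _ in range(10):
    pred = predict(grid, xs, fit(xs, ys))
    pred[np.isin(grid, xs)] = np.inf    # do not revisit evaluated points
    x_new = grid[np.argmin(pred)]       # grid point the surrogate likes best
    xs = np.append(xs, x_new)
    ys = np.append(ys, expensive_f(x_new))

best = xs[np.argmin(ys)]
print(best, ys.min())
```

Only the surrogate is evaluated over the whole grid; the "expensive" objective is called once per iteration, which is the point of the method.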
Statistical strategies for avoiding false discoveries in metabolomics and related experiments
, 2006
Abstract
Cited by 20 (5 self)
Many metabolomics, and other high-content or high-throughput, experiments are set up such that the primary aim is the discovery of biomarker metabolites that can discriminate, with a certain level of certainty, between nominally matched ‘case’ and ‘control’ samples. However, it is unfortunately very easy to find markers that are apparently persuasive but that are in fact entirely spurious, and there are well-known examples in the proteomics literature. The main types of danger are not entirely independent of each other, but include bias, inadequate sample size (especially relative to the number of metabolite variables and to the required statistical power to prove that a biomarker is discriminant), excessive false discovery rate due to multiple hypothesis testing, inappropriate choice of particular numerical methods, and overfitting (generally caused by the failure to perform adequate validation and cross-validation). Many studies fail to take these into account, and thereby fail to discover anything of true significance (despite their claims). We summarise these problems, and provide pointers to a substantial existing literature that should assist in the improved design and evaluation of metabolomics experiments, thereby allowing robust scientific conclusions to be drawn from the available data. We provide a list of some of the simpler checks that might improve one’s confidence that a candidate biomarker is not simply a statistical artefact, and suggest a series of preferred tests and visualisation tools that can assist readers and authors in assessing papers. These tools can be applied to individual metabolites by using multiple univariate tests performed in parallel across all metabolite peaks. They may also be applied to the validation of multivariate models. We stress in ...
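The multiple-testing danger mentioned here is commonly addressed with a false-discovery-rate procedure. A sketch of Benjamini-Hochberg (one standard choice, not one this abstract specifically prescribes) on simulated p-values:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of p-values declared significant at FDR alpha."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # largest k with p_(k) <= (k/m) * alpha; reject hypotheses 1..k
    below = ranked <= (np.arange(1, m + 1) / m) * alpha
    mask = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        mask[order[: k + 1]] = True
    return mask

# 100 "metabolite" tests: 95 nulls (uniform p) plus 5 genuine effects
rng = np.random.default_rng(0)
pvals = np.concatenate([rng.uniform(size=95), np.full(5, 1e-5)])

naive = pvals < 0.05           # uncorrected: expect ~5 false positives
bh = benjamini_hochberg(pvals)
print(naive.sum(), bh.sum())
```

Uncorrected thresholding flags roughly five spurious metabolites in addition to the real ones; the BH mask keeps the five planted effects while cutting most of the false discoveries.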
Sampling Strategies for Computer Experiments: Design and Analysis
, 2001
Abstract
Cited by 20 (2 self)
Computer-based simulation and analysis is used extensively in engineering for a variety of tasks. Despite the steady and continuing growth of computing power and speed, the computational cost of complex high-fidelity engineering analyses and simulations limits their use in important areas like design optimization and reliability analysis. Statistical approximation techniques such as design of experiments and response surface methodology are becoming widely used in engineering to minimize the computational expense of running such computer analyses and circumvent many of these limitations. In this paper, we compare and contrast five experimental design types and four approximation model types in terms of their capability to generate accurate approximations for two engineering applications with typical engineering behaviors and a wide range of nonlinearity. The first example involves the analysis of a two-member frame that has three input variables and three responses of interest. The second example simulates the rollover potential of a semi-tractor-trailer for different combinations of input variables and braking and steering levels. Detailed error analysis reveals that uniform designs provide good sampling for generating accurate approximations using different sample sizes, while kriging models provide accurate approximations that are robust for use with a variety of experimental designs and sample sizes.
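One widely used space-filling design for such computer experiments is Latin hypercube sampling; the abstract compares several design types, and this sketch only illustrates the stratification property of that one:

```python
import numpy as np

def latin_hypercube(n, dims, rng):
    """n samples in [0,1)^dims with one sample per axis-aligned stratum."""
    # Row i starts in stratum [i/n, (i+1)/n) along every axis...
    u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
    # ...then each column is shuffled independently to decouple the axes.
    for d in range(dims):
        rng.shuffle(u[:, d])
    return u

rng = np.random.default_rng(1)
pts = latin_hypercube(10, 3, rng)

# Each of the 10 equal-width bins along every axis holds exactly one point
bins = (pts * 10).astype(int)
print([sorted(bins[:, d].tolist()) for d in range(3)])
```

Unlike plain random sampling, no axis-aligned stratum is left empty, which is why such designs tend to give accurate approximations at small sample sizes.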
Robust analog/RF circuit design with projection-based posynomial modeling
 IEEE/ACM ICCAD
, 2004
Abstract
Cited by 19 (9 self)
In this paper we propose a RObust Analog Design tool (ROAD) for post-tuning analog/RF circuits. Starting from an initial design derived from hand analysis or analog circuit synthesis based on simplified models, ROAD extracts accurate posynomial performance models via transistor-level simulation and optimizes the circuit by geometric programming. Importantly, ROAD sets up all design constraints to include large-scale process variations to facilitate the trade-off between yield and performance. A novel convex formulation of the robust design problem is utilized to improve the optimization efficiency and to produce a solution that is superior to other local tuning methods. In addition, a novel projection-based approach for posynomial fitting is used to facilitate scaling to large problem sizes. A new implicit power iteration algorithm is proposed to find the optimal projection space and extract the posynomial coefficients with robust convergence. The efficacy of ROAD is demonstrated on several circuit examples.
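Posynomial fitting in the log domain reduces to linear least squares in the single-term (monomial) special case. The sketch below uses that simplification on invented device data; ROAD's projection-based fitting of full posynomials (sums of monomials) is considerably more involved:

```python
import numpy as np

# Hypothetical performance data: gain ~ c * W^a * L^b (a monomial)
rng = np.random.default_rng(0)
W = rng.uniform(1.0, 10.0, size=50)    # made-up transistor widths
L = rng.uniform(0.1, 1.0, size=50)     # made-up channel lengths
gain = 3.0 * W**0.5 * L**-1.0 * rng.lognormal(sigma=0.01, size=50)

# log g = log c + a log W + b log L  ->  ordinary least squares
A = np.column_stack([np.ones_like(W), np.log(W), np.log(L)])
coef, *_ = np.linalg.lstsq(A, np.log(gain), rcond=None)
logc, a, b = coef
print(np.exp(logc), a, b)   # recovers roughly c=3, a=0.5, b=-1
```

Monomial and posynomial models matter here because they keep the downstream sizing problem a geometric program, which is convex after the same log-domain change of variables.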
AIRCRAFT MULTIDISCIPLINARY DESIGN OPTIMIZATION USING DESIGN OF EXPERIMENTS THEORY AND RESPONSE SURFACE MODELING METHODS
, 1997
Abstract
Cited by 18 (2 self)
Design engineers often employ numerical optimization techniques to assist in the evaluation and comparison of new aircraft configurations. While the use of numerical optimization methods is largely successful, the presence of numerical noise in realistic engineering optimization problems often inhibits the use of many gradient-based optimization techniques. Numerical noise causes inaccurate gradient calculations which in turn slows or prevents convergence during optimization. The problems created by numerical noise are particularly acute in aircraft design applications where a single aerodynamic or structural analysis of a realistic aircraft configuration may require tens of CPU hours on a supercomputer. The computational expense of the analyses coupled with the convergence difficulties created by numerical noise are significant obstacles to performing aircraft multidisciplinary design optimization. To address these issues, a procedure has been developed to create two types of noise-free mathematical models for use in aircraft optimization studies. These two methods use elements of statistical analysis, and the overall procedure for using the methods is made computationally affordable by the application of parallel computing techniques. The first ...
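The effect of numerical noise on gradients, and the response-surface remedy, can be shown on a 1-D toy: finite differences amplify a small high-frequency term, while differentiating a fitted quadratic recovers the smooth trend (the function and noise model here are invented, not from the paper):

```python
import numpy as np

def noisy_drag(x):
    # Smooth trend plus a small high-frequency term standing in for
    # numerical noise in an aerodynamic analysis
    return (x - 2.0) ** 2 + 0.01 * np.sin(500.0 * x)

x0 = 1.0
true_grad = 2.0 * (x0 - 2.0)   # gradient of the smooth trend: -2.0

# Central finite differences see mostly the noise, not the trend
h = 1e-4
fd = (noisy_drag(x0 + h) - noisy_drag(x0 - h)) / (2.0 * h)

# Fit a quadratic response surface to samples and differentiate it instead
xs = np.linspace(0.0, 4.0, 25)
coef = np.polyfit(xs, noisy_drag(xs), 2)
rs = np.polyval(np.polyder(coef), x0)

print(fd, rs)  # fd is far from -2.0; rs is close
```

The noise amplitude is only 0.01, yet the finite-difference step amplifies its derivative by a factor of 500, which is the mechanism by which noise "slows or prevents convergence" of gradient-based optimizers.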
Projection-based performance modeling for inter-/intra-die variations
in Proc. IEEE/ACM Int. Conf. Comput.-Aided Des., 2005
Abstract
Cited by 17 (10 self)
Large-scale process fluctuations in nanoscale IC technologies suggest applying high-order (e.g., quadratic) response surface models to capture the circuit performance variations. Fitting such models requires significantly more simulation samples and solving much larger linear equations. In this paper, we propose a novel projection-based extraction approach, PROBE, to efficiently create quadratic response surface models and capture both inter-die and intra-die variations with affordable computation cost. PROBE applies a novel projection scheme to reduce the response surface modeling cost (i.e., both the required number of samples and the linear equation size) and make the modeling problem tractable even for large problem sizes. In addition, a new implicit power iteration algorithm is developed to find the optimal projection space and solve for the unknown model coefficients. Several circuit examples from both digital and analog circuit modeling applications demonstrate that PROBE can generate accurate response surface models while achieving up to 12x speedup compared with the traditional methods.
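The rank-one core of such a projection-based quadratic model can be sketched with an explicit power iteration on the (symmetrized) quadratic coefficient matrix. PROBE's implicit iteration avoids ever forming this matrix, so this is only the idea, on made-up data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Quadratic-model coefficient matrix with a dominant rank-one component
v_true = rng.standard_normal(8)
v_true /= np.linalg.norm(v_true)
A = 5.0 * np.outer(v_true, v_true) + 0.05 * rng.standard_normal((8, 8))
A = 0.5 * (A + A.T)                    # symmetrize

# Power iteration for the dominant eigenpair
v = rng.standard_normal(8)
for _ in range(100):
    v = A @ v
    v /= np.linalg.norm(v)
lam = float(v @ A @ v)                 # Rayleigh quotient estimate

A1 = lam * np.outer(v, v)              # rank-one approximation of A
err = float(np.linalg.norm(A - A1) / np.linalg.norm(A))
print(lam, err)
```

Replacing the full coefficient matrix with a low-rank projection like `A1` is what cuts both the number of simulation samples and the size of the linear system to be solved.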