Results 11–20 of 267
An Experimental Study of Temperature Effect on Modal Parameters of the Alamosa Canyon Bridge
 Earthquake Eng. and Structural Dynamics
, 1999
Abstract

Cited by 18 (4 self)
This paper examines a linear adaptive model to discriminate the changes of modal parameters due to temperature changes from those caused by structural damage or other environmental effects. Data from the Alamosa Canyon Bridge in the state of New Mexico were used to demonstrate the effectiveness of the adaptive filter for this problem. Results indicate that a linear four-input (two time and two spatial dimensions) filter of temperature can reproduce the natural variability of the frequencies with respect to time of day. Using this simple model, we attempt to establish a confidence interval of the frequencies for a new temperature profile in order to discriminate the natural variation due to temperature
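The adaptive filter described above amounts to a linear regression of modal frequency on temperature inputs, with a residual-based confidence band. A minimal sketch on synthetic data (the baseline frequency, sensitivities, and noise level are invented for illustration, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the bridge data: four temperature inputs per
# observation and one modal frequency.  All constants are assumptions.
n = 200
T = rng.uniform(10.0, 35.0, size=(n, 4))            # temperatures (deg C)
true_w = np.array([-0.004, -0.002, 0.003, -0.001])  # Hz per deg C (assumed)
f0 = 7.3                                            # baseline frequency (Hz)
freq = f0 + T @ true_w + rng.normal(0.0, 0.005, n)

# Fit the linear four-input filter by ordinary least squares.
X = np.column_stack([np.ones(n), T])
beta, *_ = np.linalg.lstsq(X, freq, rcond=None)

# Residual-based ~95% band for a new temperature profile, used to flag
# frequency shifts that temperature alone cannot explain.
resid = freq - X @ beta
sigma = resid.std(ddof=X.shape[1])
t_new = np.array([1.0, 20.0, 22.0, 18.0, 21.0])     # [intercept, four temps]
pred = float(t_new @ beta)
lo, hi = pred - 2 * sigma, pred + 2 * sigma
print(round(pred, 3), round(lo, 3), round(hi, 3))
```

A measured frequency outside the `[lo, hi]` band for its temperature profile would then be a candidate damage indicator rather than natural thermal variation.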
Interpreting canonical correlation analysis through biplots of structural correlations and weights
 Psychometrika
, 1990
Abstract

Cited by 16 (1 self)
This paper extends the biplot technique to canonical correlation analysis and redundancy analysis. The plot of structure correlations is shown to be optimal for displaying the pairwise correlations between the variables of the one set and those of the second. The link between multivariate regression and canonical correlation analysis/redundancy analysis is exploited for producing an optimal biplot that displays a matrix of regression coefficients. This plot can be made from the canonical weights of the predictors and the structure correlations of the criterion variables. An example is used to show how the proposed biplots may be interpreted. Key words: biplot, canonical correlation analysis, canonical weight, interbattery factor analysis, partial analysis, redundancy analysis, regression coefficient, reduced rank regression, structure correlations.
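The construction the abstract describes can be sketched numerically: canonical weights from an SVD of the scaled cross-correlation matrix, and structure correlations (each variable's correlation with its set's canonical variates) as the biplot coordinates. The two-variable toy data below are an illustrative assumption, not the paper's example:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two toy variable sets sharing one latent dimension z.
n = 300
z = rng.normal(size=n)
X = np.column_stack([z + rng.normal(0, 1, n), rng.normal(size=n)])
Y = np.column_stack([z + rng.normal(0, 1, n), rng.normal(size=n)])

def standardize(M):
    return (M - M.mean(0)) / M.std(0)

Xs, Ys = standardize(X), standardize(Y)
Rxx, Ryy, Rxy = Xs.T @ Xs / n, Ys.T @ Ys / n, Xs.T @ Ys / n

def inv_sqrt(R):
    vals, vecs = np.linalg.eigh(R)
    return vecs @ np.diag(vals ** -0.5) @ vecs.T

# Canonical weights from the SVD of Rxx^{-1/2} Rxy Ryy^{-1/2};
# the singular values are the canonical correlations.
U, s, Vt = np.linalg.svd(inv_sqrt(Rxx) @ Rxy @ inv_sqrt(Ryy))
wx = inv_sqrt(Rxx) @ U        # canonical weights, X side
wy = inv_sqrt(Ryy) @ Vt.T     # canonical weights, Y side

# Structure correlations: biplot coordinates for the variables.
struct_x = Rxx @ wx
struct_y = Ryy @ wy
print(s.round(2))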
Charting presence in virtual environments and its effects on performance
 DEPARTMENT OF INDUSTRIAL & SYSTEMS ENGINEERING. PH.D. DISSERTATION. VIRGINIA TECH
, 1996
Abstract

Cited by 14 (0 self)
Virtual reality (VR) involves an attempt to create an illusion that the user of the VR system is actually present in a synthetic (usually computer-generated) environment. Little is known about how various system parameters affect the illusion of presence in a virtual environment (VE). In particular, there seems to be very little quantitative data on which to base VR system design decisions. Also, while presence (or immersion) in VEs is a primary goal of VR, not much is known about how this variable affects task performance. The goal of this research was to provide a ratio-scale measure of perceived presence in a VE, to explore the effects of a number of environmental parameters on this measure and construct empirical models of these effects, and to relate perceived presence to user performance. This was done by manipulating eleven independent variables in a series of three experiments. The independent variables manipulated were scene update rate, visual display resolution, field of view, sound, textures, head-tracking, stereopsis, virtual personal risk, number of possible interactions, presence of a second user, and environmental detail. Participants performed a set of five tasks in the VE and rated perceived presence at the end of each set using the technique of free-modulus magnitude estimation. The amount of time spent in the VE was also recorded. The results
Inferring pH from diatoms: a comparison of old and new calibration methods
 Hydrobiologia
, 1989
Abstract

Cited by 14 (0 self)
Two new methods for inferring pH from diatoms are presented. Both are based on the observation that the relationships between diatom taxa and pH are often unimodal. The first method is maximum likelihood calibration based on Gaussian logit response curves of taxa against pH. The second is weighted averaging. In a lake with a particular pH, taxa with an optimum close to the lake pH will be most abundant, so an intuitively reasonable estimate of the lake pH is to take a weighted average of the pH optima of the species present. Optima and tolerances of diatom taxa were estimated from contemporary pH and proportional diatom counts in littoral zone samples from 97 pristine soft water lakes and pools in Western Europe. The optima showed a strong relation with Hustedt's pH preference groups. The two new methods were then compared with existing calibration methods on the basis of differences between inferred and observed pH in a test set of 62 additional samples taken between 1918 and 1983. The methods were ranked in order of performance as follows (standard error of inferred pH, in pH units, in brackets): maximum likelihood (0.63) > weighted averaging (0.71) = multiple regression using pH groups (0.71) = the Gasse & Tekaia method (0.71) > Renberg & Hellberg's Index B (0.83) ≈ multiple regression
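The weighted-averaging step described above is simple enough to sketch directly; the counts and pH values below are invented toy numbers, not the 97-lake training set:

```python
import numpy as np

# Toy training set: proportional diatom counts (rows = lakes,
# columns = taxa) and observed lake pH.  Illustrative values only.
counts = np.array([
    [0.70, 0.20, 0.10],
    [0.30, 0.50, 0.20],
    [0.05, 0.35, 0.60],
    [0.10, 0.30, 0.60],
])
ph = np.array([4.8, 5.9, 7.1, 6.8])

# Step 1: each taxon's pH optimum is the abundance-weighted mean of
# the pH of the lakes in which it occurs.
optima = (counts * ph[:, None]).sum(axis=0) / counts.sum(axis=0)

# Step 2: the inferred pH of a new sample is the abundance-weighted
# mean of the optima of the taxa present.
sample = np.array([0.15, 0.45, 0.40])
ph_inferred = float((sample * optima).sum() / sample.sum())
print(optima.round(2), round(ph_inferred, 2))
```

Note the symmetry: the same averaging is applied twice, once over lakes to get optima and once over taxa to get the inference.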
Kernel-based regression and objective nonlinear measures to assess brain functioning
, 2001
Abstract

Cited by 13 (1 self)
Two different problems of reflecting brain functioning are addressed. This involves human performance monitoring during the signal detection task and depth of anaesthesia monitoring. The common aspect of both problems is to monitor brain activity through the electroencephalogram recordings on the scalp. Although these two problems create only a fractional part of the tasks associated with physiological data analysis, the results and the methodology proposed have wider applicability. A theoretical and practical investigation of the different forms of kernel-based nonlinear regression models and efficient kernel-based algorithms for appropriate features extraction is undertaken. The main focus is on solving the problem of providing reduced variance estimates of the regression coefficients when a linear regression in some kernel function defined feature space is assumed. To that end Kernel Principal Component Regression and Kernel Partial Least Squares Regression techniques are proposed. These kernel-based techniques were found to be very efficient when observed data are mapped to a high dimensional feature space where usually algorithms as simple as their
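A compact sketch of kernel principal component regression in the spirit described above: project onto the leading components of a centred kernel matrix, then regress on that truncated basis to reduce coefficient variance. The RBF kernel, toy data, and component count are assumptions; the thesis's own algorithms may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression: y is a smooth nonlinear function of a scalar x.
x = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(x[:, 0]) + rng.normal(0, 0.05, 80)

def rbf_kernel(a, b, gamma=0.5):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Kernel PCR: centre the kernel matrix, keep the leading eigenpairs,
# regress y on the resulting component scores (variance reduction by
# discarding small-eigenvalue directions).
K = rbf_kernel(x, x)
n = len(x)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J
vals, vecs = np.linalg.eigh(Kc)
order = np.argsort(vals)[::-1][:10]          # keep 10 components (assumed)
Z = vecs[:, order] * np.sqrt(np.maximum(vals[order], 1e-12))
coef, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
fit = Z @ coef + y.mean()
mse = float(np.mean((fit - y) ** 2))
print(round(mse, 4))
```

Kernel PLS follows the same template but chooses components for covariance with `y` rather than variance of the features alone.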
A delay damage model selection algorithm for NARX neural networks
 IEEE TRANSACTIONS ON SIGNAL PROCESSING
, 1997
Abstract

Cited by 13 (1 self)
Recurrent neural networks have become popular models for system identification and time series prediction. Nonlinear autoregressive models with exogenous inputs (NARX) neural network models are a popular subclass of recurrent networks and have been used in many applications. Although embedded memory can be found in all recurrent network models, it is particularly prominent in NARX models. We show that using intelligent memory order selection through pruning and good initial heuristics significantly improves the generalization and predictive performance of these nonlinear systems on problems as diverse as grammatical inference and time series prediction.
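The lagged-regressor construction underlying a NARX model can be sketched with a linear read-out (the paper's networks are nonlinear; this toy system and memory order 3 are assumptions chosen so the informative lags stand out, as a memory-order pruning step would exploit):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy system with exogenous input u: y[t] depends only on y[t-1]
# and u[t-2] (assumed dynamics for illustration).
T = 500
u = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.6 * y[t - 1] + 0.8 * u[t - 2] + rng.normal(0, 0.02)

# Build a NARX regression matrix with memory order 3 on both signals;
# a pruning pass would then drop lags whose weights are near zero.
p = 3
rows = [np.r_[y[t - p:t][::-1], u[t - p:t][::-1]] for t in range(p, T)]
X = np.array(rows)                       # columns: y lags 1..3, u lags 1..3
target = y[p:]
w, *_ = np.linalg.lstsq(X, target, rcond=None)
print(w.round(2))   # large weights at y lag 1 and u lag 2 only
```

The same idea carries over to the neural case: fit with generous embedded memory, then prune the uninformative delays.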
Segmented regression estimators for massive data sets
 In Second SIAM International Conference on Data Mining
, 2002
Abstract

Cited by 12 (6 self)
We describe two methodologies for obtaining segmented regression estimators from massive training data sets. The first methodology, called Linear Regression Tree (LRT), is used for continuous response variables, and the second and complementary methodology, called Naive Bayes Tree (NBT), is used for categorical response variables. These are implemented in the IBM ProbE™ (Probabilistic Estimation) data mining engine, which is an object-oriented framework for building classes of segmented predictive models from massive training data sets. Based on this methodology, an application called ATMSE™ for direct-mail targeted marketing has been developed jointly with Fingerhut Business Intelligence [1].
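One level of a linear-regression-tree split can be sketched as a scan over candidate cut points, fitting a separate line on each side and keeping the cut with the lowest total squared error (toy piecewise-linear data; not the IBM ProbE implementation):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy piecewise-linear data: the slope changes at x = 5.
x = rng.uniform(0, 10, 400)
y = np.where(x < 5, 1.0 + 2.0 * x, 16.0 - 1.0 * x) + rng.normal(0, 0.3, 400)

def fit_sse(xs, ys):
    """Least-squares line through (xs, ys); return coefficients and SSE."""
    A = np.column_stack([np.ones_like(xs), xs])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    r = ys - A @ coef
    return coef, float(r @ r)

# Scan candidate splits; keep the one minimising total segment SSE.
best = None
for cut in np.linspace(1, 9, 81):
    left, right = x < cut, x >= cut
    if left.sum() < 10 or right.sum() < 10:
        continue
    _, sse_l = fit_sse(x[left], y[left])
    _, sse_r = fit_sse(x[right], y[right])
    if best is None or sse_l + sse_r < best[1]:
        best = (cut, sse_l + sse_r)
print(round(best[0], 2))
```

A full LRT recurses on each segment; for massive data the scan is what has to be made cheap, e.g. by evaluating candidate cuts from sufficient statistics rather than refitting from scratch.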
Analyzing Resource Behavior Using Process Mining
 BPM 2009 Workshops, Proceedings of the Fifth Workshop on Business Process Intelligence (BPI’09), volume 43 of Lecture Notes in Business Information Processing
, 2010
Abstract

Cited by 12 (0 self)
Abstract. It is vital to use accurate models for the analysis, design, and/or control of business processes. Unfortunately, there are often important discrepancies between reality and models. In earlier work, we have shown that simulation models are often based on incorrect assumptions and one example is the speed at which people work. The “Yerkes-Dodson Law of Arousal” suggests that a worker that is under time pressure may become more efficient and thus finish tasks faster. However, if the pressure is too high, then the worker's performance may degrade. Traditionally, it was difficult to investigate such phenomena and few analysis tools (e.g., simulation packages) support workload-dependent behavior. Fortunately, more and more activities are being recorded and modern process mining techniques provide detailed insights into the way that people really work. This paper uses a new process mining plugin that has been added to ProM to explore the effect of workload on service times. Based on historic data and by using regression analysis, the relationship between workload and service times is investigated. This information can be used for various types of analysis and decision making, including more realistic forms of simulation.
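The workload-vs-service-time regression can be sketched with a quadratic fit, which captures a Yerkes-Dodson-style optimum: faster under moderate pressure, slower again under heavy load. The event-log numbers below are simulated assumptions, not ProM output:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical event-log summary: the workload (work-in-progress count)
# when each task started, and the task's observed service time.
workload = rng.uniform(0, 10, 300)
service = (5.0 - 0.8 * workload + 0.06 * workload ** 2
           + rng.normal(0, 0.2, 300))

# Fit service time as a quadratic in workload.
X = np.column_stack([np.ones_like(workload), workload, workload ** 2])
beta, *_ = np.linalg.lstsq(X, service, rcond=None)
optimum = float(-beta[1] / (2 * beta[2]))  # load with lowest service time
print(beta.round(2), round(optimum, 2))
```

A positive quadratic coefficient is the signature of the effect: below the optimum, extra load speeds work up; above it, performance degrades.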
Filtering requirements for gradient-based optical flow measurement
 IEEE Trans. Image Proc. 2000
"... optical flow measurement ..."
Airline Reservations Forecasting: Probabilistic and Statistical Models of the Booking Process
, 1990
Abstract

Cited by 10 (0 self)
In this thesis, we develop the necessary statistical framework to produce accurate forecasts of total bookings in a particular fare class on a specific flight number departing on a given date at various points before departure. After an introduction to the basic terminology of the airline booking process, a rigorous probabilistic model is developed. The booking process is modeled as a stochastic process with requests, reservations, and cancellations interspersed in the time before a flight departs. The key result of the probabilistic analysis is a censored Poisson model of the airline booking process. A comprehensive statistical framework views the booking process from a data analysis perspective. We describe models based on advance bookings (the traditional booking curve) and historical bookings (a traditional time series model). An important development is the
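The censoring effect at the heart of the censored Poisson model can be sketched directly: observed bookings are demand truncated at the booking limit, so the naive mean of bookings understates true demand. The rate and limit below are illustrative assumptions, not the thesis's data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Requests arrive Poisson(lam); the airline only observes bookings,
# which are censored at the booking limit.  Values are assumptions.
lam, limit, n = 12.0, 15, 10_000
requests = rng.poisson(lam, n)
bookings = np.minimum(requests, limit)

share_censored = float((requests >= limit).mean())
print(round(float(bookings.mean()), 2), round(share_censored, 3))
```

Any forecast fitted to raw booking counts inherits this downward bias, which is why the censored Poisson likelihood treats limit-hitting observations differently from uncensored ones.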