Results 1–10 of 22
Spline Estimators for the Functional Linear Model: Consistency, Application and Splus Implementation
Abstract
Cited by 48 (6 self)
The functional linear model is a regression model in which the explanatory variable is a continuous-time process observed on a closed interval of R. Hence, the "vector of parameters" to be estimated belongs to the infinite-dimensional space of R-valued operators defined on a space of functions. We propose here two estimators of the functional parameter of such a model by means of spline functions. These estimators take into account the dimensionality problem, and we prove their consistency. The first one relies on a truncated functional principal components analysis and the second is based on penalized regression splines. These estimators are compared by means of simulations and applied to explain winter wheat yield with respect to climatic variations.
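The first estimator described above, regression on a truncated functional principal components basis, can be sketched in a few lines of NumPy. Everything below (grid, Fourier basis, sample sizes, noise level) is an invented toy setup, not the authors' S-Plus implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)               # observation grid on [0, 1]
dt = t[1] - t[0]
n, K = 200, 3                            # sample size, truncation level

# Simulate curves X_i(t) from a few Fourier modes (hypothetical data).
phi = np.array([np.sqrt(2) * np.sin((k + 1) * np.pi * t) for k in range(5)])
scores = rng.normal(size=(n, 5)) * np.array([2.0, 1.0, 0.5, 0.25, 0.1])
X = scores @ phi                         # n x 100 matrix of discretized curves

beta_true = 2 * phi[0] - phi[1]          # true coefficient function
y = X @ beta_true * dt + rng.normal(scale=0.1, size=n)   # y_i = <X_i, beta> + eps_i

# Truncated FPCA: eigendecomposition of the empirical covariance operator.
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / n * dt)
psi = vecs[:, ::-1][:, :K] / np.sqrt(dt)     # top-K eigenfunctions, L2-normalized
xi = Xc @ psi * dt                           # principal component scores

# Least squares on the scores, then map back to a coefficient function.
b = np.linalg.lstsq(xi, y - y.mean(), rcond=None)[0]
beta_hat = psi @ b
```

Truncating at K components is what controls the dimensionality problem the abstract refers to: the infinite-dimensional parameter is estimated only within the span of the leading eigenfunctions.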
Generalized functional linear models
 Ann. Statist
, 2005
Abstract
Cited by 40 (5 self)
We propose a generalized functional linear regression model for a regression situation where the response variable is a scalar and the predictor is a random function. A linear predictor is obtained by forming the scalar product of the predictor function with a smooth parameter function, and the expected value of the response is related to this linear predictor via a link function. If, in addition, a variance function is specified, this leads to a functional estimating equation which corresponds to maximizing a functional quasi-likelihood. This general approach includes the special cases of the functional linear model, as well as functional Poisson regression and functional binomial regression. The latter leads to procedures for classification and discrimination of stochastic processes and functional data. We also consider the situation where the link and variance functions are unknown and are estimated nonparametrically from the data, using a semiparametric quasi-likelihood procedure. An essential step in our proposal is dimension reduction by approximating the predictor processes with a truncated Karhunen–Loève expansion. We develop asymptotic inference for the proposed class of generalized regression models. In the proposed asymptotic approach, the truncation parameter increases with sample size, and a martingale central limit theorem is applied to establish the resulting increasing-dimension asymptotics. We establish asymptotic normality for a properly scaled distance ...
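The functional binomial special case can be illustrated with a minimal sketch: scores from a truncated Karhunen–Loève expansion feed an iteratively reweighted least squares (IRLS) fit with the logit link. All data and parameter choices below are hypothetical, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)
dt = t[1] - t[0]
n, K = 300, 2

# Hypothetical predictor processes built from two orthonormal modes.
phi = np.array([np.sqrt(2) * np.sin(np.pi * t), np.sqrt(2) * np.cos(np.pi * t)])
s = rng.normal(size=(n, 2)) * np.array([1.5, 0.8])
X = s @ phi

beta = phi[0] - 0.5 * phi[1]                  # true parameter function
eta = X @ beta * dt                           # linear predictor <X_i, beta>
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))   # binary response via logit link

# Dimension reduction: truncated Karhunen-Loeve via the covariance eigenbasis.
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / n * dt)
psi = vecs[:, ::-1][:, :K] / np.sqrt(dt)
Z = Xc @ psi * dt                             # FPC scores as design matrix

# IRLS (Newton) for logistic regression on the scores.
b = np.zeros(K)
for _ in range(25):
    p = 1 / (1 + np.exp(-(Z @ b)))
    W = p * (1 - p)
    b += np.linalg.solve(Z.T * W @ Z, Z.T @ (y - p))
beta_hat = psi @ b                            # estimated parameter function
```

With a known canonical link, IRLS coincides with maximizing the quasi-likelihood; the semiparametric case with estimated link and variance functions would add a smoothing step not shown here.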
Functional Linear Model
, 1999
Abstract
Cited by 24 (2 self)
In this paper, we study a regression model in which the explanatory variables are sampling points of a continuous-time process. We propose an estimate of the regression by means of a Functional Principal Component Analysis, analogous to the one introduced by Bosq (1991) in the case of Hilbertian AR processes. Both convergence in probability and almost sure convergence of this estimate are stated.
Keywords: functional linear model, functional data analysis, Hilbert spaces, convergence.
1 Introduction
Classical regression models, such as generalized linear models, may be inadequate in some statistical studies: this is the case when the explanatory variables are digitized points of a curve. Examples can be found in different fields of application such as chemometrics (Frank and Friedman, 1993), linguistics (Hastie, Buja and Tibshirani, 1995) and many other areas (see Hastie and Mallows, 1993, and Ramsay and Silverman, 1997, among others). In this context, Frank and Friedman (1993) describe and com...
Extracting Oscillations: Neuronal Coincidence Detection with Noisy Periodic Spike Input
, 1998
Abstract
Cited by 19 (6 self)
How does a neuron vary its mean output firing rate if the input changes from random to oscillatory, coherent but noisy, activity? What are the critical parameters of the neuronal dynamics and input statistics? To answer these questions, we investigate the coincidence-detection properties of an integrate-and-fire neuron. We derive an expression indicating how coincidence detection depends on neuronal parameters. Specifically, we show how coincidence detection depends on the shape of the postsynaptic response function, the number of synapses, and the input statistics, and we demonstrate that there is an optimal threshold. Our considerations can be used to predict from neuronal parameters whether and to what extent a neuron can act as a coincidence detector and thus convert a temporal code into a rate code.
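The qualitative effect the abstract describes can be reproduced with a toy leaky integrate-and-fire simulation: the same mean input rate produces far more output spikes when it arrives as coherent oscillatory bursts than as asynchronous Poisson activity. All parameter values (time constant, weight, rates) are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def lif_rate(modulated, T=20.0, dt=0.0005):
    """Mean output rate (Hz) of a leaky integrate-and-fire neuron driven by
    100 Poisson input trains whose common rate is either constant or
    sinusoidally modulated at 40 Hz (same mean rate in both cases)."""
    tau, v_th, w = 0.003, 1.0, 0.1     # membrane time constant (s), threshold, weight
    n_syn, r0 = 100, 20.0              # number of synapses, mean input rate (Hz)
    steps = int(T / dt)
    times = np.arange(steps) * dt
    if modulated:
        rate = r0 * (1 + 0.9 * np.sin(2 * np.pi * 40.0 * times))
    else:
        rate = np.full(steps, r0)
    v, n_spikes = 0.0, 0
    for i in range(steps):
        k = rng.poisson(n_syn * rate[i] * dt)   # input spikes in this time step
        v += -v / tau * dt + w * k              # leaky integration (Euler step)
        if v >= v_th:                           # threshold crossing: spike and reset
            n_spikes += 1
            v = 0.0
    return n_spikes / T

r_random = lif_rate(False)   # asynchronous Poisson input
r_osc = lif_rate(True)       # oscillatory input: coincidences at the peaks fire the cell
```

The short membrane time constant is what makes the cell a coincidence detector: only near-simultaneous input spikes, which cluster at the oscillation peaks, sum to threshold before the membrane leaks.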
Toward a Theory of Information Processing
 IEEE Trans. Signal Processing
, 2002
Abstract
Cited by 12 (5 self)
Information processing theory endeavors to quantify how well signals encode information and how well systems, by acting on signals, process information. We use information-theoretic distance measures, the Kullback–Leibler distance in particular, to quantify how well signals represent information. The ratio of distances between a system's output and input quantifies the system's information processing properties.
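The distance ratio can be computed in closed form for a toy Gaussian channel. The setup below (two Gaussian hypotheses passed through a noisy linear system) is an illustrative example, not one from the paper; by the data processing inequality the ratio cannot exceed one:

```python
import math

def kl_gauss(mu0, var0, mu1, var1):
    """Kullback-Leibler distance D(N(mu0, var0) || N(mu1, var1)) in nats."""
    return 0.5 * (math.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1)

# Two hypotheses at the input: X ~ N(0, 1) versus X ~ N(1, 1).
d_in = kl_gauss(0.0, 1.0, 1.0, 1.0)

# A noisy linear system y = a*x + w with w ~ N(0, s2) maps these laws to
# N(0, a^2 + s2) and N(a, a^2 + s2).
a, s2 = 2.0, 0.5
d_out = kl_gauss(0.0, a**2 + s2, a, a**2 + s2)

# Information transfer ratio: how much of the distinguishability survives.
gamma = d_out / d_in      # equals a^2 / (a^2 + s2) <= 1 here
```

Here gamma approaches 1 as the noise variance s2 shrinks: a noiseless invertible system processes information losslessly.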
Time-varying functional regression for predicting remaining lifetime distributions from longitudinal trajectories
 Biometrics
, 2005
Abstract
Cited by 12 (7 self)
A recurring objective in longitudinal studies on aging and longevity has been the investigation of the relationship between age-at-death and current values of a longitudinal covariate trajectory that quantifies reproductive or other behavioral activity. We propose a novel technique for predicting age-at-death distributions for situations where an entire covariate history is included in the predictor. The predictor trajectories up to current time are represented by time-varying functional principal component scores, which are continuously updated as time progresses and are considered to be time-varying predictor variables that are entered into a class of time-varying functional regression models that we propose. We demonstrate for biodemographic data how these methods can be applied to obtain predictions for age-at-death and estimates of remaining lifetime distributions, including estimates of quantiles and of prediction intervals for remaining lifetime. Estimates and predictions are obtained for individual subjects, based on their observed behavioral trajectories, and include a dimension-reduction step that is implemented by projecting on a single index. The proposed techniques are illustrated with data on longitudinal daily egg-laying for female medflies, predicting remaining lifetime and age-at-death distributions from individual event histories observed up to current time.
Functional additive models
 J Am Stat Assoc
Abstract
Cited by 11 (5 self)
In commonly used functional regression models, the regression of a scalar or functional response on the functional predictor is assumed to be linear. This means the response is a linear function of the functional principal component scores of the predictor process. We relax the linearity assumption and propose to replace it by an additive structure. This leads to a more widely applicable and much more flexible framework for functional regression models. The proposed functional additive regression models are suitable for both scalar and functional responses. The regularization needed for effective estimation of the regression parameter function is implemented through a projection on the eigenbasis of the covariance operator of the functional components in the model. The utilization of functional principal components in an additive rather than linear way leads to a substantial broadening of the scope of functional regression models and emerges as a natural approach, as the uncorrelatedness of the functional principal components is shown to lead to a straightforward implementation of the functional additive model, based on just a sequence of one-dimensional smoothing steps and without the need for backfitting. This facilitates the theoretical analysis, and we establish asymptotic ...
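The key implementation point of the abstract, that each additive component can be estimated by a single one-dimensional smooth of the response against one score with no backfitting, can be sketched as follows. The data-generating model and the Nadaraya-Watson smoother below are an illustrative stand-in, not the authors' estimator:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 60)
dt = t[1] - t[0]
n = 400

# Predictor curves driven by two independent FPC scores (hypothetical setup).
phi = np.array([np.sqrt(2) * np.sin(np.pi * t), np.sqrt(2) * np.sin(2 * np.pi * t)])
s = rng.normal(size=(n, 2)) * np.array([1.5, 0.7])
X = s @ phi

# Additive, nonlinear regression on the scores: E[y|X] = f1(s1) + f2(s2).
y = np.sin(s[:, 0]) + s[:, 1] ** 2 - 0.7 ** 2 + rng.normal(scale=0.2, size=n)

# FPC scores from the covariance eigenbasis.
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / n * dt)
xi = Xc @ (vecs[:, ::-1][:, :2] / np.sqrt(dt)) * dt

def nw_smooth(x, y, x0, h=0.3):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x[:, None] - x0[None, :]) / h) ** 2)
    return (w * y[:, None]).sum(axis=0) / w.sum(axis=0)

# Because the FPC scores are uncorrelated, each component function is
# recovered by one marginal smooth of the centered response -- no backfitting.
yc = y - y.mean()
f1_hat = nw_smooth(xi[:, 0], yc, xi[:, 0])
f2_hat = nw_smooth(xi[:, 1], yc, xi[:, 1])
```

Summing the fitted components (plus the response mean) gives the additive prediction; in a general additive model the smoothers would instead have to be iterated until convergence.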
Functional data analysis for sparse auction data
 In Statistical Methods in eCommerce Research
, 2008
Abstract
Cited by 10 (4 self)
Bid arrivals of eBay auctions often exhibit “bid sniping”, a phenomenon where “snipers” place their bids at the last moments of an auction. This is one reason why bid histories for eBay auctions tend to have sparse data in the middle and denser data both at the beginning and at the end of the auction. Time spacing of the bids is thus irregular and sparse. For nearly identical products that are auctioned repeatedly, one may view the price history of each of these auctions as a realization of an underlying smooth stochastic process, the price process. While the traditional Functional Data Analysis (FDA) approach requires that entire trajectories of the underlying process are observed without noise, this assumption is not satisfied for typical auction data. We provide a review of a recently developed version of functional principal component analysis (Yao et al., 2005), which is geared towards sparse, irregularly observed and noisy data, the principal analysis through conditional expectation (PACE) method. The PACE method borrows and pools information from the sparse data in all auctions. This allows the recovery of the price process even in situations where only few bids are observed. In a modified approach, we adapt PACE to summarize the bid history for varying current times during an ongoing auction through time-varying principal component scores. These scores then serve as time-varying predictors for the closing price. We study the resulting time-varying predictions using both linear regression and generalized additive modelling, with current scores as predictors. These methods are illustrated with a case study of 157 Palm M515 PDA auctions from eBay, and the proposed methods are seen to work reasonably well. Other related issues will also be discussed.
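The core PACE step, predicting a trajectory's principal component scores by conditional expectation given only a few noisy observations, reduces to a best linear predictor when the model components are Gaussian. In the sketch below the mean, eigenfunctions, eigenvalues and error variance are taken as known by construction; in practice PACE estimates them by pooling the sparse data across all auctions:

```python
import numpy as np

rng = np.random.default_rng(4)
t_grid = np.linspace(0, 1, 51)

# Model components, assumed already estimated (known here for illustration).
mu = np.zeros_like(t_grid)                       # mean function
phi = np.array([np.sqrt(2) * np.sin(np.pi * t_grid),
                np.sqrt(2) * np.sin(2 * np.pi * t_grid)])
lam = np.array([1.0, 0.25])                      # eigenvalues
sigma2 = 0.05                                    # measurement-error variance

# One sparsely observed trajectory: four noisy "bids" at irregular times.
idx = np.array([2, 7, 30, 48])                   # observed grid indices
xi_true = rng.normal(size=2) * np.sqrt(lam)
y_obs = xi_true @ phi[:, idx] + rng.normal(scale=np.sqrt(sigma2), size=idx.size)

# PACE: conditional expectation of the scores given the sparse observations,
# xi_hat = Lambda Phi^T Sigma_Y^{-1} (Y - mu).
Phi = phi[:, idx].T                              # m x K eigenfunction matrix
Sigma_Y = Phi @ np.diag(lam) @ Phi.T + sigma2 * np.eye(idx.size)
xi_hat = np.diag(lam) @ Phi.T @ np.linalg.solve(Sigma_Y, y_obs - mu[idx])

# Recovered trajectory on the full grid from only four observations.
x_hat = mu + xi_hat @ phi
```

In the time-varying variant described above, only the observations up to the current time would enter `idx`, so the scores (and the closing-price prediction built on them) update as the auction progresses.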
Coherent structures and chaos — a model problem
 Phys. Lett
, 1987
Abstract
Cited by 8 (4 self)
The Ginzburg-Landau equation is examined in the chaotic regime. A complete set of uncorrelated coherent structures is extracted from this motion and used as a basis for the dynamical description of coherent structures in the attractor set. The reduced system is shown to describe motions over a wide parameter set. The aim of this investigation is (1) to isolate the coherent structures of a chaotic motion and (2) to use the coherent structures as a basis set in a dynamical description of the corresponding attractor set. For this purpose we consider the Ginzburg-Landau equation [1] under periodic boundary conditions. This equation, which is of wide current interest [2-5], is known to give rise to chaotic motions [6-9]. In particular we focus on the numerical experiments. The coefficient q2 of the diffusion term is easily absorbed in the space variable, in which case the box has length 2π/q. In the form written, (1), q2 plays the role of a reciprocal Reynolds number and for this reason is expressed in this way. In the numerical experiments of Moon et al. [7] and Keefe [8] the constants p and c0 are taken to be 1/4 (unless otherwise stated our calculations also adopt these values), and (1) is solved subject to the initial data A = 1 + 0.02 cos x. (2) Eq. (1) has exp(it) as a base solution. This solution, known as the Stokes solution, is known to be unstable as q2 is decreased beyond the critical value. The equation then supports a spatially peri...
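Extracting a complete set of uncorrelated coherent structures from an ensemble of snapshots is the proper orthogonal decomposition, which amounts to an SVD of the mean-subtracted snapshot matrix. The synthetic ensemble below is a stand-in for Ginzburg-Landau simulation output, invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)   # periodic spatial grid
n_snap = 200

# Synthetic snapshots: two spatial structures with random amplitudes,
# plus small-scale noise (a stand-in for chaotic simulation data).
modes = np.array([np.cos(x), np.sin(2 * x)])
amps = rng.normal(size=(n_snap, 2)) * np.array([3.0, 1.0])
U = amps @ modes + 0.05 * rng.normal(size=(n_snap, 64))

# Proper orthogonal decomposition: SVD of the mean-subtracted snapshots.
Uc = U - U.mean(axis=0)
_, svals, Vt = np.linalg.svd(Uc, full_matrices=False)
pod_modes = Vt                               # rows: empirical eigenfunctions
energy = svals**2 / np.sum(svals**2)         # fraction of variance per mode

# A low-dimensional Galerkin basis captures nearly all of the energy.
captured = energy[:2].sum()
```

Projecting the governing equation onto the leading POD modes is what yields the reduced dynamical system the abstract describes; the energy spectrum tells you how many modes that basis needs.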
Continuous time threshold autoregressive models, Statistica Sinica
 J. Appl. Prob
, 1991
Abstract
Cited by 4 (2 self)
This thesis considers continuous time autoregressive processes defined by stochastic differential equations and develops some methods for modelling time series data by such processes. The first part of the thesis looks at continuous time linear autoregressive (CAR) processes defined by linear stochastic differential equations. These processes are well-understood and there is a large body of literature devoted to their study. I summarise some of the relevant material and develop some further results. In particular, I propose a new and very fast method of estimation using an approach analogous to the Yule–Walker estimates for discrete time autoregressive processes. The models so estimated may be used for preliminary analysis of the appropriate model structure and as a starting point for maximum likelihood estimation. A natural extension of CAR processes is the class of continuous time threshold autoregressive (CTAR) processes defined by piecewise linear stochastic differential ...
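For the simplest case, a CAR(1) (Ornstein-Uhlenbeck) process dX = -θX dt + σ dW sampled at spacing Δ, a Yule-Walker-style moment estimate follows from the fact that the lag-one autocorrelation equals exp(-θΔ). The sketch below illustrates that idea only; the thesis' method covers general CAR(p) models and is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(6)
theta, sigma, delta, n = 1.5, 1.0, 0.05, 20000

# Simulate an Ornstein-Uhlenbeck (CAR(1)) process exactly on a regular grid:
# X_{k+1} = e^{-theta*delta} X_k + Gaussian innovation.
a = np.exp(-theta * delta)
innov_sd = sigma * np.sqrt((1 - a**2) / (2 * theta))
x = np.empty(n)
x[0] = rng.normal(scale=sigma / np.sqrt(2 * theta))   # stationary start
for k in range(n - 1):
    x[k + 1] = a * x[k] + rng.normal(scale=innov_sd)

# Yule-Walker-style moment estimate: match the lag-one autocorrelation,
# rho(delta) = exp(-theta*delta), then invert.
rho1 = np.mean(x[:-1] * x[1:]) / np.mean(x * x)
theta_hat = -np.log(rho1) / delta
```

As the abstract suggests, such a moment estimate is essentially free to compute and gives a sensible starting value for maximum likelihood estimation.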