Results 1–8 of 8
INFERRING THE CONDITIONAL MEAN
, 710
Abstract

Cited by 2 (2 self)
Consider a stationary real-valued time series {X_n}_{n=0}^∞ with a priori unknown distribution. The goal is to estimate the conditional expectation E(X_{n+1} | X_0, ..., X_n) based on the observations (X_0, ..., X_n) in a pointwise consistent way. It is well known that this is not possible at all values of n. We will estimate it along stopping times.
Forward estimation for ergodic time series
, 2005
Abstract

Cited by 2 (1 self)
The forward estimation problem for stationary and ergodic time series {X_n}_{n=0}^∞ taking values in a finite alphabet X is to estimate the probability that X_{n+1} = x based on the observations X_i, 0 ≤ i ≤ n, without prior knowledge of the distribution of the process {X_n}. We present a simple procedure g_n which is evaluated on the data segment (X_0, ..., X_n) and for which error(n) = |g_n(x) − P(X_{n+1} = x | X_0, ..., X_n)| → 0 almost surely for a subclass of all stationary and ergodic time series, while for the full class the Cesàro average of the error tends to zero almost surely and, moreover, the error tends to zero in probability.
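To make the forward estimation problem concrete, here is a minimal illustrative sketch of a plug-in estimator for P(X_{n+1} = x | X_0, ..., X_n) over a finite alphabet. This is not the procedure g_n of the paper; it is a naive longest-recurring-suffix estimator, named and structured here purely as an assumption for illustration: it finds the longest suffix of the observed data that recurs earlier in the sample and returns the empirical distribution of the symbol that followed each earlier occurrence.

```python
from collections import Counter

def empirical_next_symbol_probs(data, max_context=None):
    """Naive plug-in estimate of P(X_{n+1} = x | X_0, ..., X_n).

    Finds the longest suffix of `data` that also occurs earlier in
    `data`, then returns the empirical frequencies of the symbol
    following each earlier occurrence of that suffix.
    (Illustrative only -- not the paper's estimator g_n.)
    """
    n = len(data)
    limit = max_context if max_context is not None else n - 1
    best = Counter()
    for k in range(1, limit + 1):
        suffix = data[n - k:]
        counts = Counter(
            data[i + k]                      # symbol after the match
            for i in range(n - k)            # earlier start positions
            if data[i:i + k] == suffix
        )
        if not counts:
            break                            # suffix never recurred
        best = counts
    total = sum(best.values())
    return {x: c / total for x, c in best.items()} if total else {}
```

On an alternating binary sample such as [0, 1, 0, 1, 0, 1, 0], the longest recurring suffix is always followed by a 1, so the estimator assigns probability 1 to the next symbol being 1.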
G. Morvai and B. Weiss: Limitations on intermittent
, 710
Abstract
Bailey showed that the general pointwise forecasting problem for stationary and ergodic time series has a negative solution. However, it is known that for Markov chains the problem can be solved. Morvai showed that there is a stopping time sequence {λ_n} such that P(X_{λ_n+1} = 1 | X_0, ..., X_{λ_n}) can be estimated from the samples (X_0, ..., X_{λ_n}) in such a way that the difference between the conditional probability and the estimate vanishes along these stopping times for all stationary and ergodic binary time series. We will show that it is not possible to estimate the above conditional probability along a stopping time sequence for all stationary and ergodic binary time series in a pointwise sense such that, if the time series turns out to be a Markov chain, the predictor will eventually predict for all n.
ON UNIVERSAL ESTIMATES FOR BINARY RENEWAL PROCESSES
, 811
Abstract
A binary renewal process is a stochastic process {X_n} taking values in {0, 1} where the lengths of the runs of 1's between successive zeros are independent. After observing X_0, X_1, ..., X_n one would like to predict the future behavior, and the problem of universal estimators is to do so without any prior knowledge of the distribution. We prove a variety of results of this type, including universal estimates for the expected time to renewal as well as estimates for the conditional distribution of the time to renewal. Some of our results require a moment condition on the time to renewal, and we show by an explicit construction that some moment condition is necessary.
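The "expected time to renewal" quantity can be illustrated with a simple empirical sketch. The following function, an assumption for illustration rather than the paper's estimator, collects the completed runs of 1's between zeros in a binary sample and uses them to estimate the expected waiting time until the next 0, given that r ones have been seen since the last 0 (i.e., a plug-in estimate of E[L − r + 1 | L ≥ r], where L is a run length).

```python
def expected_time_to_renewal(bits, r):
    """Plug-in estimate of the expected waiting time until the next 0,
    given that r ones have been observed since the last 0.

    Assumes renewal structure (run lengths of 1's between zeros are
    i.i.d.) and estimates E[L - r + 1 | L >= r] from the completed
    runs in `bits`. (Illustrative only.)
    """
    runs, cur, started = [], 0, False
    for b in bits:
        if b == 0:
            if started:
                runs.append(cur)             # a completed run of 1's
            started, cur = True, 0
        else:
            cur += 1
    eligible = [L for L in runs if L >= r]   # runs consistent with r ones seen
    if not eligible:
        return None                          # no data for this context
    # L - r ones remain, then the renewal (the next 0) itself
    return sum(L - r + 1 for L in eligible) / len(eligible)
```

For the sample [0,1,1,0,1,0,1,1,1,0] the completed runs are 2, 1, 3; having seen one 1 since the last zero, the estimated waiting time is (2 + 1 + 3)/3 = 2.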
G. Morvai and B. Weiss: On Classifying Processes.
, 710
Abstract
We prove several results concerning classification, based on successive observations (X_1, ..., X_n) of an unknown stationary and ergodic process, for membership in a given class of processes, such as the class of all finite order Markov chains.
ESTIMATING THE LENGTHS OF MEMORY WORDS
, 808
Abstract
For a stationary stochastic process {X_n} with values in some set A, a finite word w ∈ A^K is called a memory word if the conditional probability of X_0 given the past is constant on the cylinder set defined by X_{−K}^{−1} = w. It is called a minimal memory word if no proper suffix of w is also a memory word. For example, in a K-step Markov process all words of length K are memory words, but not necessarily minimal. We consider the problem of determining the lengths of the longest minimal memory words and the shortest memory words of an unknown process {X_n} based on sequentially observing the outputs of a single sample (ξ_1, ξ_2, ..., ξ_n). We give a universal estimator which converges almost surely to the length of the longest minimal memory word and show that no such universal estimator exists for the length of the shortest memory word. The alphabet A may be finite or countable.
Index Terms—Markov chains, order estimation, probability, statistics, stationary processes, stochastic processes
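The memory word definition can be checked directly when the process's conditional distributions are known. The sketch below is a hypothetical helper, assuming access to a function `cond_prob(history)` returning P(X_0 = x | history); it verifies the defining property — that P(X_0 | past) is the same on every past ending in w — over all extensions of w by up to `depth` extra symbols (a finite check that suffices for a process of known finite order ≤ len(w) + depth).

```python
from itertools import product

def is_memory_word(cond_prob, w, alphabet, depth, tol=1e-12):
    """Check whether w is a memory word for a process described by
    cond_prob(history) -> {x: P(X_0 = x | history)}, where `history`
    is the tuple (X_{-m}, ..., X_{-1}).

    Verifies that the conditional law of X_0 is constant over all
    pasts ending in w, testing extensions of w by up to `depth`
    extra symbols. (Illustrative helper; `cond_prob` is assumed.)
    """
    base = cond_prob(tuple(w))
    for m in range(1, depth + 1):
        for prefix in product(alphabet, repeat=m):
            p = cond_prob(tuple(prefix) + tuple(w))
            if any(abs(p[x] - base[x]) > tol for x in alphabet):
                return False                 # law depends on more past
    return True
```

For a 1-step Markov chain, every single-symbol word is a memory word; for a genuinely 2-step chain, a single symbol generally is not, while the length-2 words are.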
On Sequential Estimation and Prediction for Discrete Time Series
, 803
Abstract
The problem of extracting as much information as possible from a sequence of observations X_0, X_1, ..., X_n of a stationary stochastic process has been considered by many authors from different points of view. It has long been known through the work of D. Bailey that no universal estimator for P(X_{n+1} | X_0, X_1, ..., X_n) can be found which converges to the true conditional probability almost surely. Despite this result, for restricted classes of processes, or for sequences of estimators along stopping times, universal estimators can be found. We present here a survey of some of the recent work that has been done along these lines.