PREDICTION OF THE GEOMETRIC RENEWAL PROCESS

The first part of the paper presents the main concepts and theoretical results on the prediction of processes. The second part presents the results obtained for the geometric renewal process: it is shown that the process has a binomial distribution and is a process with independent and stationary increments. Applying the theory introduced in the first part to the geometric renewal process, the unbiased predictor of minimum variance, based on a sufficient statistic, is then found.


Introduction
The concept of prediction sufficiency was introduced by K. Takeuchi and M. Akahira (1975). The primary application of prediction sufficiency was demonstrated by E. N. Torgersen (1977). More comprehensive applications of this concept were demonstrated by B. Johansson (1990).
It is shown that much of the classical theory of unbiased parameter estimation can be transferred to a predictive setting. The main object of the papers [7, 3] is to develop these ideas further and, in particular, to study the close connection that exists between unbiased prediction and time reversal of Markov processes (Björk & Johansson, 1992). Johansson (1990) replaced the usual sufficiency concept by that of prediction sufficiency, so that the Rao-Blackwell and Lehmann-Scheffé theorems can be rephrased to suit this context.
The return from prediction to parameter estimation theory, enriching the latter with new findings obtained through prediction, was demonstrated by T. Björk and B. Johansson (1996). These studies investigated Poisson processes, the Yule model, a Wiener process with unknown drift, diffusion with unknown drift, and geometric Brownian motion.
The aim of this research is to find the minimum variance unbiased predictor of the geometric renewal process N_u, u > t, based on the observations {N_s, 0 ≤ s ≤ t}.
The major concepts and results on the prediction of processes introduced in the second section of the paper are mostly based on the research study [3]; they are displayed in a similar manner in [1], too. In the third section, using paper [6], we introduce a definition of the geometric renewal process and demonstrate that it has a binomial distribution and is a process with independent and stationary increments. The geometric renewal process is called by some authors the discrete Poisson process [9], which, together with the continuous Poisson process, is considered classical in the theory of renewal processes; therefore both are often investigated in monographs dealing with this theory. At the end of the section, the form of the process of local density (Radon-Nikodym derivative) of the geometric renewal process, taken from [5], is presented. Basic concepts of renewal processes are presented in [4]. The fourth section displays the UMSEUP ("Uniformly Minimum Squared Error Unbiased Predictor") predictions found for the renewal process, both when the parameter is unknown and when it is known.

Unbiased prediction
We now recall the definitions of a prediction sufficient statistic and the main theorems (see e.g. [3, 7]). We consider a sample space Ω and two σ-algebras ℱ_1 and ℱ_2, where ℱ_1 is generated by the set of random variables which we observe, and ℱ_2 is generated by a set of (yet) unobserved variables. We also have a family 𝒫 of probability measures on (Ω, ℱ_1 ∨ ℱ_2). The objective is to predict some square integrable, ℱ_2-measurable random variable (r.v.) W. A predictor is any square integrable, ℱ_1-measurable r.v. X. The performance of the predictor X is evaluated by its quadratic loss function P ↦ E_P[(X − W)^2], P ∈ 𝒫. The predictor X is called unbiased if E_P[X] = E_P[W] for all P ∈ 𝒫. The predictor X is said to be complete for 𝒫 if, for every fixed Borel function f, the condition E_P[f(X)] = 0, P ∈ 𝒫, implies f(X) = 0, almost surely (a.s.).
It follows that N_t is binomially distributed and, therefore, the discrete Poisson process is also called a binomial process.
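The binomial distribution of the process can be checked numerically. Below is a minimal simulation sketch (not from the paper; the notation N_t and the parameter p are our assumptions), treating the geometric renewal ("discrete Poisson") process as the cumulative sum of i.i.d. Bernoulli(p) increments, so that N_t ~ Bin(t, p).

```python
import numpy as np

# Simulation sketch (illustrative): build the geometric renewal process
# as a cumulative sum of i.i.d. Bernoulli(p) increments, so N_t ~ Bin(t, p).
rng = np.random.default_rng(0)
p, t, n_paths = 0.3, 50, 100_000

increments = rng.random((n_paths, t)) < p   # i.i.d. Bernoulli(p) steps
paths = increments.cumsum(axis=1)           # one sample path per row
N_t = paths[:, -1]

print(N_t.mean(), t * p)                    # sample mean vs binomial mean t*p
print(N_t.var(), t * p * (1 - p))           # sample variance vs t*p*(1-p)
```

The cumulative-sum construction also makes the independent, stationary increments of the process explicit: disjoint time intervals use disjoint columns of i.i.d. Bernoulli draws.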
1. Finding a sufficient statistic. Since the likelihood ratio of the process N_t, given the observations {N_s, 0 ≤ s ≤ t}, depends on the observations only through N_t, it follows from the factorisation theorem that N_t is a sufficient statistic for estimating the parameter p. Thus we derive that X = (u/t)N_t is the unbiased predictor of N_u, and hence UMSEUP according to Theorem 2.
Case 2. Let the parameter p, p ∈ Θ, be known.
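A Monte Carlo sketch of the unknown-parameter case, under the assumption (our reading of the text) that the unbiased predictor of N_u based on the observed N_t is X = (u/t)·N_t; then E[X] = (u/t)·t·p = u·p = E[N_u], with no knowledge of p required.

```python
import numpy as np

# Sketch (assumption: the unbiased predictor with p unknown is X = (u/t)*N_t).
# We simulate Bernoulli(p) paths and compare E[X] with E[N_u].
rng = np.random.default_rng(1)
p, t, u, n_paths = 0.4, 30, 90, 200_000

N = (rng.random((n_paths, u)) < p).cumsum(axis=1)   # binomial-process paths
N_t, N_u = N[:, t - 1], N[:, u - 1]

X = (u / t) * N_t             # predictor built only from the observed N_t
print(X.mean(), N_u.mean())   # both should be close to u * p
```

Note that X uses no information beyond N_t, consistent with N_t being a sufficient statistic for p.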
Since the process N_t is a process with independent increments, it follows that X = E_p(N_u | ℱ_t) = E_p(N_t + N_u − N_t | ℱ_t) = N_t + (E_p[N_u] − E_p[N_t]) is the predictor of N_u, which is unbiased and UMSEUP. Based on formula (1), it is easy to prove that, given the observations {N_s, s ∈ [0, t]}, the maximum likelihood estimator of the parameter p is N_t/t. Hence we can see the relation between optimal estimators and optimal predictors.