MAXIMUM LIKELIHOOD ESTIMATION WITH DYNAMIC MEASUREMENT ERRORS AND APPLICATION TO INTEREST RATE MODELING

The stochastic volatility (SV) model is widely applied as an extension of the constant volatility in Black-Scholes option pricing. In this paper, we extend the SV model to one driven by fractional Brownian motion (FBM). A crucial problem in its application is how the unknown parameters of the model are to be estimated. We propose the innovation algorithm, followed by the maximum likelihood estimation approach, which enables us to derive estimators of the parameters involved in this model. We also present simulation outcomes to illustrate the efficiency and reliability of the proposed method.


Introduction
The Black-Scholes model is a jewel of the financial world. Ever since its debut in 1973 by Black and Scholes, extensive work has been conducted to improve the performance of this model. Interested readers can refer to [Elliott and Kopp (2005)] for more details. Recent works explore the possibility of extending the constant volatility in the Black-Scholes market to stochastic volatility (SV). This idea is more reasonable in real applications, since it allows us to capture the impact of time-varying volatility in financial data. However, many econometricians struggled to estimate these models in the early years; only in the 1990s did estimation and simulation work for these models begin to develop. Readers are referred to [Melino and Turnbull (1990); Andersen and Sørensen (1996); Harvey et al. (1994)].
This problem continues to intrigue researchers. The second generation of works on SV began to focus on more advanced extensions. Some considered jumps in the dynamics of asset prices [Creel & Kristensen (2015); Cheang; Andersen and Bollerslev (1997)]. Recent works on this problem can be found in [Casas and Gao (2008); Hu (2004)]. Most of these works assumed the model to be generated by a finite-parameter model, using the spectral density to extract the memory parameter in the autocovariances and so define the long memory in the SV model. Some exceptions are the works by Høg and Frederiksen (2006) and Hult (2003), who consider the model in continuous time, with fractional Brownian motion (FBM) used to define the long-memory process. Høg used a Kalman filter approach to estimate the parameters, but stated the noise as a stationary ARFIMA(0, d, 0) process. Hult (2003), however, used the spectral density in his estimation process.
The study of FBM as the governing noise in financial models began in 1968, with the works of Mandelbrot and his colleagues [Mandelbrot & van Ness (1968); Mandelbrot & Wallis (1969); Mandelbrot (1972)]. Although some have viewed FBM in finance sceptically [Rogers (1997)], the issue never fails to capture researchers' interest. The work of Hu and Øksendal (2003) has motivated renewed interest in, and faith in, FBM in financial modeling. Readers are referred to [Mandelbrot (2004); Taleb (2007)] for a critical view of the randomness generated by standard Brownian motion (BM). It is therefore reasonable to consider FBM in the SV model. A crucial problem in the real application of this model is how to obtain the unknown values of the parameters in SV models, in particular a, σ and H, which represent the drift, diffusion coefficient and Hurst parameter, respectively. Once these parameters are estimated, option valuation can be carried out in the SV setting.
Most earlier works use the Ornstein-Uhlenbeck (OU) process to model SV. It is natural to extend this standard OU process to the fractal setting by substituting FBM for the standard BM; the result is called the fractional Ornstein-Uhlenbeck (FOU) process. Some works on FOU have appeared in the literature in recent years [Cheridito et al. (2003); Kleptsyna and Le Breton (2002); Høg and Frederiksen (2006)], but they only developed the theoretical side of the matter. Works on FOU that begin to provide estimation methods include Brouste & Iacus (2013), Tanaka (2013) and Salomon & Fort (2013), to mention just a few. We note that Hult (2003) estimated the parameters involved in FOU, based on discrete observations, by using the spectral representation. We would also like to mention the work of Hu (2004), who explored this process in a financial environment, considering it in an option pricing problem; this exploratory study, too, concerned the theoretical investigation. In this paper, our aim is to use the innovation algorithm, coupled with the maximum likelihood (ML) estimation procedure, to give efficient estimates of the parameters involved in this process. We also present a numerical investigation to better suit real-life problems.
In what follows, we summarize some important results reported in Ahmed & Charalambous (2002). Let (Ω, F, P) be a probability space and H ∈ (0, 1), where H is the index of self-similarity known as the Hurst parameter. A random process {B_H(t), t ≥ 0}, defined on the probability space (Ω, F, P), is said to be a FBM if the following conditions are satisfied:
(i) P{B_H(0) = 0} = 1;
(ii) for each t ∈ R+ ≡ [0, ∞), B_H(t) is an F-measurable random variable having a Gaussian distribution with E{B_H(t)} = 0;
(iii) for each t, s ∈ R+, E{B_H(t)B_H(s)} = (1/2)(t^{2H} + s^{2H} − |t − s|^{2H});
(iv) the sample paths of B_H are continuous with probability one but nowhere differentiable; and
(v) B_H is self-similar in the sense that, for any α > 0, the probability laws of {B_H(αt)} and {α^H B_H(t)} coincide.

Model Simplification
The fractional Ornstein-Uhlenbeck process can be written in the form

dX(t) = −a1 X(t) dt + σ1 dB_{H1}(t),

where a1 > 0 and σ1 are constant parameters, normally known as the drift and diffusion coefficients, respectively, and (B_{H1}(t), t ≥ 0) is a fractional Brownian motion on some probability space (Ω, F, P) with Hurst parameter H1. This model is used to replace the constant volatility in the risky asset model; some use it as a short rate model [De Rossi, 2004]. Consider the measurement dynamics

dY(t) = a2 X(t) dt + σ2 dB_{H2}(t),

where a2 ∈ R and σ2 ∈ R are drift and diffusion coefficients, respectively. To simplify this problem, let the systems be considered as discrete processes. Taking t = kΔt, the state equation becomes

X_{k+1} = (1 − a1 Δt) X_k + σ1 ξ_k,

and the measurement dynamics

Y_{k+1} = Y_k + a2 X_k Δt + σ2 η_k,

with η_k ∼ N(0, (Δt)^{2H2}) and ξ_k ∼ N(0, (Δt)^{2H1}). Note that the state equation is an autoregressive (AR) process. Following the iteration process and the Cauchy criterion in [Brockwell and Davis, 1987], the AR process can be stated as

X_k = σ1 Σ_{j≥0} (1 − a1 Δt)^j ξ_{k−1−j},

with |1 − a1 Δt| < 1.
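As a concrete illustration, the discretised state and measurement recursions can be simulated as follows. This is only a sketch: the recursion forms and parameter names follow the discretisation described above, and for simplicity the noises ξ_k and η_k are drawn independently with the marginal variances (Δt)^{2H1} and (Δt)^{2H2}, which ignores the serial correlation of fractional Gaussian noise (exact only for H = 1/2).

```python
import numpy as np

def simulate_state_measurement(a1=2.0, a2=2.5, s1=1.0, s2=0.5,
                               H1=0.7, H2=0.6, dt=0.2, n=500, seed=0):
    """Simulate the discretised state/measurement system.

    State:       X[k] = (1 - a1*dt) * X[k-1] + s1 * xi[k]
    Measurement: Y[k] = Y[k-1] + a2 * X[k] * dt + s2 * eta[k]

    Simplification: noises are drawn independently with marginal
    variance (dt)^{2H}; true fractional Gaussian noise is serially
    correlated unless H = 1/2.
    """
    rng = np.random.default_rng(seed)
    xi = rng.normal(0.0, dt**H1, size=n)    # std dev (dt)^H1
    eta = rng.normal(0.0, dt**H2, size=n)   # std dev (dt)^H2
    X = np.zeros(n)
    Y = np.zeros(n)
    for k in range(1, n):
        X[k] = (1.0 - a1 * dt) * X[k - 1] + s1 * xi[k]
        Y[k] = Y[k - 1] + a2 * X[k] * dt + s2 * eta[k]
    return X, Y
```

Note that with a1 = 2 and Δt = 1/5, |1 − a1·Δt| = 0.6 < 1, so the state recursion is stable, consistent with the stated condition.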
By using the properties of the increments of the FBM, {B_H(t + Δt) − B_H(t)}, the covariances of η and ξ can be expressed as

E(η_j η_{j+k}) = (1/2)(Δt)^{2H2} (|k + 1|^{2H2} − 2|k|^{2H2} + |k − 1|^{2H2})

and

E(ξ_j ξ_{j+k}) = (1/2)(Δt)^{2H1} (|k + 1|^{2H1} − 2|k|^{2H1} + |k − 1|^{2H1}),

respectively. Note that when H < 1/2 the increments are negatively correlated, whereas H > 1/2 gives positive correlation. The increment process is stationary, and is often referred to as fractional Gaussian noise.
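The autocovariance of the fractional Gaussian noise increments, and the sign of the lag-one correlation described above, can be checked numerically with the standard formula (function name is illustrative):

```python
import numpy as np

def fgn_autocovariance(k, H, dt=1.0):
    """Autocovariance at lag k of fractional Gaussian noise,
    i.e. the increments B_H(t + dt) - B_H(t) (standard formula)."""
    k = abs(k)
    return 0.5 * dt**(2 * H) * ((k + 1)**(2 * H) - 2 * k**(2 * H)
                                + abs(k - 1)**(2 * H))

# Lag-1 correlation sign matches the text:
print(fgn_autocovariance(1, H=0.3) < 0)  # True: negative for H < 1/2
print(fgn_autocovariance(1, H=0.7) > 0)  # True: positive for H > 1/2
```

At lag 0 this reduces to (Δt)^{2H}, agreeing with the marginal variances of η_k and ξ_k above.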

Methodology of Estimation
In this section, our aim is to estimate the parameters a1, a2, σ1, σ2, H1 and H2 involved in the model. Considering the discrete-time model, the likelihood function for (Ỹ1, ..., ỸT) is given in (13). It is known that maximum likelihood yields efficient estimates of the parameters. However, in (13) it is very difficult to find the gradient of the likelihood function with respect to the parameters; the system is too involved. To address this problem, the innovation algorithm is implemented.
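Once the innovation algorithm supplies the one-step predictors Ŷ_k and prediction variances r_{k−1}, the Gaussian likelihood can be evaluated without inverting the full covariance matrix. As a sketch of the standard innovations identity (presumably the form behind (13)):

```latex
-2\log L(\theta) \;=\; T\log 2\pi \;+\; \sum_{k=1}^{T}\left[\log r_{k-1}
  + \frac{\big(\tilde Y_k - \hat Y_k\big)^{2}}{r_{k-1}}\right]
```

so the difficulty of differentiating the likelihood is reduced to recursions for Ŷ_k and r_{k−1}.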
Hence, we obtain the autocovariance function. From (22), Σ_T^{−1} can be written out, and from (22) and (23) the likelihood function (13) can now be calculated. The likelihood function is then transformed into the following optimization problem.

Problem P. Maximize the cost function
subject to the constraints, where θ = (a1, a2, σ1, σ2, H1, H2). This optimization problem is very difficult to solve, as the constraints involve the covariance functions. To simplify the problem, we use the constraint transcription method reported in Jennings and Teo (1990).
Maximize the cost function, where g_i are the constraints in the original problem (Problem P). For each i = 1, 2, we approximate g_i with G_{i,ε}(θ), where ε is some small number. We now append the approximate functions G_{i,ε} to the cost function L(θ) to obtain the appended cost function given below,
where γ > 0 is a penalty parameter. This is an unconstrained optimization problem, which is referred to as Problem P_{ε,γ}. It is known (see Jennings and Teo (1990)) that for any given ε > 0, there exists a γ(ε) such that, for γ > γ(ε), the solution of Problem P_{ε,γ} satisfies the constraints of Problem P. Let γ(ε) be such a γ for each ε > 0. Furthermore, the solution of Problem P_{ε,γ(ε)} converges to the solution of Problem P.
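The constraint transcription idea can be sketched as follows. The quadratic smoothing below follows the standard Jennings-Teo form for a constraint g(θ) ≤ 0; the function names and the exact shape of G_ε are illustrative, since the paper does not reproduce its formula here.

```python
def smooth_constraint(g, eps=1e-3):
    """Constraint-transcription smoothing of max(g, 0), in the spirit
    of Jennings and Teo (1990): exact outside (-eps, eps), with a
    quadratic bridge inside so the appended cost is C^1."""
    if g < -eps:
        return 0.0
    if g > eps:
        return g
    return (g + eps)**2 / (4.0 * eps)

def appended_cost(L, constraints, theta, gamma=10.0, eps=1e-3):
    """Penalised cost of the unconstrained Problem P_{eps,gamma}:
    maximise L(theta) - gamma * sum_i G_{i,eps}(g_i(theta))."""
    return L(theta) - gamma * sum(smooth_constraint(g(theta), eps)
                                  for g in constraints)
```

At g = ±ε the pieces agree in value and slope, so the penalty term is continuously differentiable, which is what makes the appended problem amenable to standard unconstrained optimisers.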
We propose an algorithm to solve Problem P ε,γ .

Algorithm
Here, we present an algorithm to compute the likelihood function using the innovation algorithm.
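A minimal sketch of such a computation is given below, assuming for illustration a zero-mean stationary series with autocovariance function gamma(h); the model autocovariances derived in the previous section would be plugged in here. The recursion is the innovations algorithm of Brockwell and Davis (1987).

```python
import numpy as np

def innovations_loglik(y, gamma):
    """Gaussian log-likelihood of a zero-mean series y via the
    innovations algorithm, given the autocovariance function gamma(h).

    Computes one-step predictors yhat and prediction variances v
    recursively, then evaluates the likelihood in innovations form.
    """
    n = len(y)
    kappa = lambda i, j: gamma(abs(i - j))   # stationary covariances
    theta = np.zeros((n, n))
    v = np.zeros(n)
    v[0] = kappa(1, 1)
    yhat = np.zeros(n)                       # yhat[0] = 0
    for m in range(1, n):
        for k in range(m):
            s = sum(theta[k, k - j] * theta[m, m - j] * v[j]
                    for j in range(k))
            theta[m, m - k] = (kappa(m + 1, k + 1) - s) / v[k]
        v[m] = kappa(m + 1, m + 1) - sum(theta[m, m - j]**2 * v[j]
                                         for j in range(m))
        yhat[m] = sum(theta[m, j] * (y[m - j] - yhat[m - j])
                      for j in range(1, m + 1))
    resid2 = (y - yhat)**2
    return -0.5 * (n * np.log(2 * np.pi) + np.sum(np.log(v))
                   + np.sum(resid2 / v))
```

This avoids forming or inverting Σ_T explicitly: only the autocovariances enter, which is the point of using the innovation algorithm in the likelihood evaluation.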

Simulation Study
To examine the performance of the proposed estimators, we carried out simulation experiments. First, we generate data from the discrete-time model, taking the parameters a1 = 2, a2 = 2.5, σ1 = 1, σ2 = 0.5, H1 = 0.7 and H2 = 0.6, with Δt = 1/5. We simulate the time series from this discrete-time model and apply our methodology to estimate the parameters ϑ = (a1, a2, σ1, σ2, H1, H2)′ from the simulated data set. The simulated annealing method is used to find the optimal parameters simultaneously. The simulation is repeated one hundred times. The average values of the estimates based on 100 replications, with biases and variances, are reported in Table 1 for the five sample sizes n = 100, 200, 300, 400, 500. The results in Table 1 show that our methodology is efficient: most of the biases and variances obtained are within an acceptable tolerance. All of our estimates of σ1, σ2, H1 and H2 are quite stable and nearly unbiased, and the performance of the estimation of a1 and a2 is fairly satisfactory. These simulation outcomes indicate that our methodology is promising for obtaining statistically efficient estimators for FOU.
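The simulated annealing step can be sketched generically as below. This Metropolis-type annealer with geometric cooling and box constraints is illustrative rather than the authors' exact implementation; in practice f would be the negative appended log-likelihood, and all tuning constants here are assumptions.

```python
import numpy as np

def simulated_annealing(f, x0, lo, hi, n_iter=20000, T0=1.0, seed=0):
    """Minimal simulated annealing: Gaussian random-walk proposals,
    Metropolis acceptance, geometric cooling, box bounds via clipping."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    fx = f(x)
    best, fbest = x.copy(), fx
    for i in range(n_iter):
        T = T0 * 0.999**i                              # cooling schedule
        prop = np.clip(x + rng.normal(0.0, 0.1, size=x.size), lo, hi)
        fp = f(prop)
        # accept improvements always, uphill moves with Boltzmann prob.
        if fp < fx or rng.random() < np.exp(-(fp - fx) / max(T, 1e-12)):
            x, fx = prop, fp
        if fx < fbest:
            best, fbest = x.copy(), fx
    return best, fbest
```

A simple quadratic objective centred on the simulation parameters (2, 2.5, 1, 0.5, 0.7, 0.6) recovers them closely, which illustrates why a global, derivative-free method suits the six-dimensional search over ϑ.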

Data
We used a data set of federal reserve interest rates available online at http://www.federalreserve.gov. The business-day interest rate from 2 January 2009 to 31 December 2009 is examined, giving 252 observations. A summary of the time series can be found in Table 2; the mean of the series is 0.1597 and the variance is 0.001502.
Here, we take Δt = 1/5, as in the previous section. We can see from Table 3 that the suggested estimates are a1 = 2.277, a2 = 4.237, σ1 = 0.509, σ2 = 0.713, H1 = 0.809 and H2 = 0.982. Evidently, the long-memory property is rather strong in the interest rate process.

Conclusion
In this paper, we proposed a novel method for estimating the unknown parameters of the fractional Ornstein-Uhlenbeck (FOU) model. The likelihood function for the FOU model is difficult to handle analytically, and its covariance calculation is very expensive. For this reason, we proposed the innovation algorithm approach to simplify the problem. The likelihood function is transformed into an optimization problem with constraints, and the constraint transcription method is applied to append the approximated constraints to the cost function. We then solved a standard unconstrained optimization problem by using the simulated annealing method. We carried out a simulation study to illustrate the efficiency of our method, as well as an empirical study on interest rate data.

Table 1 :
Average value of estimates based on 100 replications, with bias in ( ) and variance in [ ]

Table 2 :
Summary of the interest rate

Table 3 :
Likelihood values for different initial values