1. Ekonometriese modelle in finansiële risiko [Econometric models in financial risk].
- Author
- DE JONGH, P. J.
- Subjects
- *ECONOMETRICS, *FINANCIAL risk, *MARKET volatility, *ECONOMIC models, *MATHEMATICAL models, *ECONOMIC statistics, *DISTRIBUTION (Probability theory)
- Abstract
This paper provides an overview of the contributions by Prof. JH Venter to financial risk and volatility modelling, estimation and forecasting. Venter's research is based on the classical GARCH model, which he refines in various ways. In the classical GARCH model the innovation distribution is assumed to be standard normal, but recent research has emphasized the need for more general distributions allowing both asymmetry (skewness) and kurtosis in the innovation distribution in order to obtain better-fitting models. This can be achieved by variance mixtures of normal distributions. If the mixing variable is taken to be unit inverse Gaussian distributed, the resulting innovation distribution is the normal inverse Gaussian (NIG) distribution, whose density can be computed analytically, easing likelihood calculations. In essence this is the NIG-GARCH model for daily returns. Venter interprets the mixing variable as a latent factor due to news noise impacts that adjust the traditional GARCH volatilities to account for events occurring after market closure on the previous day. This GARCH model with NIG-distributed innovations leads to more accurate parameter estimates than the normal GARCH model. Venter concludes that it is the mixing concept, and not the particular distribution choice, that leads to better-fitting models. This is encouraging in the sense that it is the underlying phenomenon that one is trying to model that is important, more so than the specific mathematical forms that one uses in the process, and the results should be stable when these forms are varied over reasonably possible alternatives. These models are fitted to an empirical data set, and in the process of doing so Venter finds empirical support for Merton's ICAPM. In order to obtain even more accurate parameter estimates, and since Venter expected an information gain if more data were used, he extends the above-mentioned model to cater for high, low and close data, as well as full intraday data, instead of only daily returns.
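The variance-mixture construction described above can be illustrated with a short simulation. This is a sketch under assumed parameter values, not the paper's actual estimation code: each innovation is conditionally normal given a unit inverse Gaussian mixing variable (the latent news-noise impact), making it marginally NIG-distributed, and the conditional variance follows a classical GARCH(1,1) recursion.

```python
# Sketch: GARCH(1,1) with NIG innovations built as a variance mixture of
# normals. All parameter values (omega, alpha, beta, lam) are illustrative
# assumptions, not estimates from the paper.
import numpy as np

rng = np.random.default_rng(0)

omega, alpha, beta = 0.05, 0.08, 0.90   # GARCH(1,1) parameters (assumed)
lam = 2.0                                # IG shape parameter (assumed)
T = 1000                                 # number of daily returns to simulate

returns = np.empty(T)
sigma2 = omega / (1.0 - alpha - beta)    # start at the unconditional variance

for t in range(T):
    # Unit inverse Gaussian mixing variable (mean 1): the latent news-noise
    # impact that rescales the day's variance.
    V = rng.wald(1.0, lam)
    # Conditionally normal given V; marginally NIG-distributed innovation.
    z = np.sqrt(V) * rng.standard_normal()
    returns[t] = np.sqrt(sigma2) * z
    # Classical GARCH(1,1) volatility recursion.
    sigma2 = omega + alpha * returns[t] ** 2 + beta * sigma2
```

Because the mixing variable has mean 1, the GARCH variance is adjusted up or down day by day, while the marginal innovation distribution gains the heavier tails the abstract refers to.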
This is achieved by introducing the Brownian inverse Gaussian (BIG) process, which follows naturally from the unit inverse Gaussian distribution and standard Brownian motion. This model postulates that, anew over each trading day, the intraday return process follows a Brownian motion with drift and volatility that are inherited from the previous day in typical GARCH fashion but are also subject to random inverse Gaussian (IG) distributed news noise impacts that arrive after market closure on the previous day. He calls it the BIG-GARCH model and derives the likelihood function needed to fit the model; this uses the daily returns and realized volatilities as sufficient statistics. Venter then introduces a number of new distributions related to the inverse Gaussian distribution and also derives diagnostics that may be used to check the quality of fit. The new model produces two volatility measures, called the expected volatility and the actual volatility. Venter shows that the latter is close to the realized volatility. Fitting these models to empirical data, he finds that the accuracy of the model fit increases as one moves from the models assuming normally distributed innovations and allowing for only daily data to those assuming underlying BIG processes and allowing for full intraday data. However, Venter encounters one problematic result, namely that there is empirical evidence of time dependence in the random impact factors. This means that the news noise processes, which are assumed to be independent over time, are in fact time dependent, as can indeed be expected. In order to cater for this time dependence, Venter extends the model still further by allowing for autocorrelation in the random impact factors.… [ABSTRACT FROM AUTHOR]
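The intraday mechanism described above can be sketched for a single trading day. This is a minimal illustration under assumed parameter values, not the paper's BIG-GARCH likelihood: within the day, the return follows a Brownian motion whose variance is scaled by an IG news-noise impact, and the daily return and realized volatility summarise the intraday path.

```python
# Sketch: one trading day of a BIG-type intraday return process.
# mu, sigma, lam and the step count n are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

mu, sigma = 0.0002, 0.01   # drift and GARCH-inherited volatility (assumed)
lam = 2.0                  # IG shape parameter (assumed)
n = 390                    # intraday steps, e.g. one per minute (assumed)
dt = 1.0 / n

# Mean-1 IG news-noise impact arriving after the previous day's close.
V = rng.wald(1.0, lam)

# Intraday increments: Brownian motion with drift, with the day's
# volatility scaled by sqrt(V).
increments = mu * dt + sigma * np.sqrt(V * dt) * rng.standard_normal(n)

daily_return = increments.sum()                  # close-to-close return
realized_vol = np.sqrt(np.sum(increments ** 2))  # realized volatility
```

As the number of intraday steps grows, the realized volatility concentrates around the day's actual volatility level, sigma * sqrt(V), which mirrors the abstract's observation that the actual volatility is close to the realized volatility.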
- Published
- 2008