120 results for "Ngai Hang Chan"
Search Results
2. Inference for Structural Breaks in Spatial Models
- Authors: Ngai Hang Chan, Rongmao Zhang, and Chun Yip Yau
- Subjects: Statistics and Probability; Statistics, Probability and Uncertainty
- Published: 2023
3. Cointegration Rank Estimation for High-Dimensional Time Series With Breaks
- Authors: Ngai Hang Chan and Rongmao Zhang
- Subjects: Statistics and Probability; Statistics, Probability and Uncertainty
- Published: 2023
4. Broad Distribution of Local Polar States Generates Large Electrothermal Properties in Pb-Free Relaxor Ferroelectrics
- Authors: Matthew G. Tucker, Sanjib Nayak, Ngai Hang Chan, Mads R. V. Jørgensen, Yuanpeng Zhang, Sarangi Venkateshwarlu, Abhijit Pramanick, Jing Kong, and Frederick P. Marlton
- Subjects: Materials science; General Chemical Engineering; Materials Chemistry; General Chemistry; Energy transformation; Energy harvesting; Waste heat; Pyroelectricity
- Abstract:
Electrothermal energy conversion provides attractive solutions for global energy management, such as energy harvesting from waste heat using pyroelectric energy conversion (PEC) and efficient cooling of portable electronics or data servers using the electrocaloric effect. Relaxor ferroelectrics are attractive for electrothermal energy conversion because of their large pyroelectric coefficients over a wide temperature range. Although Pb-based relaxors are well-known, toxicity concerns have mandated the intense search for Pb-free alternatives. Here, we engineered (Ba,Ca)TiO3-based relaxors based on a multisite doping strategy, which show promising electrothermal performance, viz. a maximum PEC efficiency of 14% and electrocaloric refrigeration capacity of 115 J/kg. Using local-scale structural analysis, we provide an atomistic model for large electrothermal properties in the newly designed Pb-free ferroelectrics, whereby a temperature-independent continuous distribution of cation displacement directions creates easy pathways for microscopic polarization reorientation. This research provides key structural insight for future atomic-scale engineering of environmentally sustainable ferroelectrics in energy applications.
- Published: 2021
5. Simultaneous variable selection and structural identification for time‐varying coefficient models
- Authors: W. Palma, Ngai Hang Chan, and G. Linhao
- Subjects: Statistics and Probability; Applied Mathematics; Feature selection; Information criteria; Group lasso; Statistics, Probability and Uncertainty
- Published: 2021
6. Nonparametric testing for the specification of spatial trend functions
- Authors: Rongmao Zhang, Ngai Hang Chan, and Changxiong Chi
- Subjects: Statistics and Probability; Numerical Analysis; Statistics, Probability and Uncertainty
- Published: 2023
7. Consistent order selection for ARFIMA processes
- Authors: Hsueh-Han Huang, Ngai Hang Chan, Kun Chen, and Ching-Kang Ing
- Subjects: Statistics and Probability; Statistics, Probability and Uncertainty
- Published: 2022
8. Group orthogonal greedy algorithm for change-point estimation of multivariate time series
- Authors: Rongmao Zhang, Ngai Hang Chan, Yuanbo Li, and Chun Yip Yau
- Subjects: Statistics and Probability; Applied Mathematics; Multivariate statistics; Monte Carlo method; Structural break; Feature selection; Autoregressive model; Change-point estimation; Greedy algorithm; Statistics, Probability and Uncertainty
- Abstract:
This paper proposes a three-step method for detecting multiple structural breaks in piecewise stationary vector autoregressive processes. The number of structural breaks can be large and unknown, and the locations of the breaks may differ across components. The proposed method rests on a link between the structural break problem and a high-dimensional regression problem. By means of this connection, a group orthogonal greedy algorithm, which originated in the high-dimensional variable selection literature, is developed to efficiently screen out potential break points in the first step. A high-dimensional information criterion is proposed for consistent structural break estimation in the second step. In the third step, the information criterion further determines the specific components in which structural breaks occur. Monte Carlo experiments are conducted to demonstrate the finite sample performance, and applications to stock data are provided to illustrate the proposed method.
- Published: 2021
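The screening step described in the abstract can be illustrated with a toy univariate version: represent each candidate break as a step regressor and let a greedy algorithm pick the steps that most reduce the residual sum of squares. This is a minimal sketch of the idea (an orthogonal greedy algorithm in spirit), not the paper's group algorithm; the signal, noise level, and function names are our own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Piecewise-constant signal with breaks at t = 100 and t = 250 (n = 400).
n = 400
mu = np.concatenate([np.zeros(100), 2.0 * np.ones(150), -1.0 * np.ones(150)])
y = mu + rng.normal(size=n)

def greedy_breaks(y, k):
    """Greedily pick k step locations that most reduce the RSS.

    Candidate break t contributes the step regressor 1{s >= t}; each
    iteration adds the step most correlated with the current residual
    and then refits all chosen steps by least squares.
    """
    n = len(y)
    X = (np.arange(n)[:, None] >= np.arange(1, n)[None, :]).astype(float)
    chosen = []
    resid = y - y.mean()
    for _ in range(k):
        Xc = X - X.mean(axis=0)                    # centred step regressors
        scores = np.abs(Xc.T @ resid) / np.sqrt((Xc ** 2).sum(axis=0))
        j = int(np.argmax(scores))
        chosen.append(j + 1)                       # column j is the step at t = j + 1
        D = np.column_stack([np.ones(n)] + [X[:, c - 1] for c in chosen])
        beta, *_ = np.linalg.lstsq(D, y, rcond=None)
        resid = y - D @ beta
    return sorted(chosen)

breaks = greedy_breaks(y, 2)
print(breaks)  # estimated break locations, near 100 and 250
```

The paper's second and third steps (an information criterion to select the number of breaks and to localize them by component) are not reproduced here.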
9. Penalized Whittle likelihood for spatial data
- Authors: Kun Chen, Ngai Hang Chan, Chun Yip Yau, and Jie Hu
- Subjects: Statistics and Probability; Numerical Analysis; Statistics, Probability and Uncertainty
- Published: 2023
10. NONSTATIONARY LINEAR PROCESSES WITH INFINITE VARIANCE GARCH ERRORS
- Authors: Rongmao Zhang and Ngai Hang Chan
- Subjects: Economics and Econometrics; Autoregressive conditional heteroskedasticity; Random walk; Ordinary least squares; Unit root
- Abstract:
Recently, Cavaliere, Georgiev, and Taylor (2018, Econometric Theory 34, 302–348) (CGT) considered the augmented Dickey–Fuller (ADF) test for a unit-root model with linear noise driven by i.i.d. infinite variance innovations and showed that ordinary least squares (OLS)-based ADF statistics have the same distribution as in Chan and Tran (1989, Econometric Theory 5, 354–362) for i.i.d. infinite variance noise. They also posed the interesting question of extending their results to the case of infinite variance GARCH innovations, as considered in Zhang, Sin, and Ling (2015, Stochastic Processes and their Applications 125, 482–512). This paper addresses that question. In particular, the limit distributions of the ADF test for random walk models with short-memory linear noise driven by infinite variance GARCH innovations are studied. We show that when the tail index $\alpha$ falls in a certain range, the limit distributions are completely different from those of CGT, and the estimator of the parameters of the lag terms used in the ADF regression is not consistent. This paper provides a broad treatment of unit-root models with linear GARCH noises, which encompasses the commonly entertained unit-root IGARCH model as a special case.
- Published: 2020
11. Inference for the degree distributions of preferential attachment networks with zero-degree nodes
- Authors: Ngai Hang Chan, Samuel P. S. Wong, and Simon K. C. Cheung
- Subjects: Economics and Econometrics; Applied Mathematics; Degree distribution; Preferential attachment; Asymptotic distribution; Consistent estimator; Statistical inference
- Abstract:
That the tail of the logarithmic degree distribution of a network decays linearly in the logarithmic degree is known as the power law, which is ubiquitous in everyday life. A commonly used technique for modeling the power law is preferential attachment (PA), which sequentially joins each new node to the existing nodes according to a conditional probability law proportional to a linear function of their degrees. Although effective, PA is tricky to apply to real networks because the number of nodes and the number of edges must satisfy a linear constraint. This paper enables real applications of PA by introducing each new node as an isolated node that attaches to other nodes according to the PA scheme in some later epochs. This simple and novel strategy provides an additional degree of freedom to relax the aforementioned constraint on the observed data and uses the PA scheme to compute the implied proportion of unobserved zero-degree nodes. By using martingale convergence theory, the degree distribution of the proposed model is shown to follow the power law, and its asymptotic variance is proved to be the solution of a Sylvester matrix equation, a class of equations frequently found in control theory (see Hansen and Sargent (2008, 2014)). These results give a strongly consistent estimator of the power-law parameter and its asymptotic normality. Note that this statistical inference procedure is non-iterative and is particularly applicable to big networks, such as the World Wide Web data presented in Section 6. Moreover, the proposed model offers a theoretically coherent framework that can be used to study other network features, such as clustering and connectedness, as given in Cheung (2016).
- Published: 2020
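The delayed-attachment idea can be sketched in a few lines of simulation: every node is born isolated, and in later epochs a waiting zero-degree node links to an existing node with probability proportional to its degree plus one. The attachment probability, the (degree + 1) weighting, and all names below are our own simplifying choices, not the paper's exact scheme.

```python
import random

random.seed(1)

# Each epoch one node is born isolated (degree 0).  With probability
# p_attach a waiting zero-degree node then links to an existing node
# chosen with probability proportional to (degree + 1).
T, p_attach = 5000, 0.7
deg = []            # deg[i] = degree of node i
zero_nodes = set()  # still-isolated nodes
urn = []            # node i appears deg[i] + 1 times (PA weights)

for _ in range(T):
    i = len(deg)
    deg.append(0)
    zero_nodes.add(i)
    urn.append(i)
    if random.random() < p_attach:
        j = random.choice(tuple(zero_nodes))  # waiting isolated node
        k = random.choice(urn)                # preferential target
        if k != j:
            deg[j] += 1
            deg[k] += 1
            urn.append(j)
            urn.append(k)
            zero_nodes.discard(j)
            zero_nodes.discard(k)

frac_zero = sum(d == 0 for d in deg) / T
print(round(frac_zero, 2), max(deg))  # share of zero-degree nodes, largest hub
```

The run leaves a persistent proportion of zero-degree nodes, the quantity whose implied value the paper's inference procedure targets, while the preferential weighting still produces high-degree hubs.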
12. Optimal change-point estimation in time series
- Authors: Chun Yip Yau, Haihan Yu, Wai Leong Ng, and Ngai Hang Chan
- Subjects: Statistics and Probability; Change-point estimation; Asymptotic theory; Minimax; Mean squared error; Statistics, Probability and Uncertainty
- Abstract:
This paper establishes asymptotic theory for optimal estimation of change points in general time series models under α-mixing conditions. We show that the Bayes-type estimator is asymptotically minimax for change-point estimation under squared error loss. Two bootstrap procedures are developed to construct confidence intervals for the change points. An approximate limiting distribution of the change-point estimator under small changes is also derived. Simulations and real data applications are presented to investigate the finite sample performance of the Bayes-type estimator and the bootstrap procedures.
- Published: 2021
13. Walsh Fourier Transform of Locally Stationary Time Series
- Authors: Zhelin Huang and Ngai Hang Chan
- Subjects: Statistics and Probability; Applied Mathematics; Time–frequency analysis; Fourier transform; Time series; Statistics, Probability and Uncertainty
- Abstract:
A new time-frequency model and a method to classify time series data are proposed in this article. By viewing the observed signals as realizations of locally dyadic stationary (LDS) processes, an LDS model can be used to provide a time-frequency decomposition of the signals, under which the evolutionary Walsh spectrum and related statistics can be defined and estimated. The classification procedure is as follows. First, choose a training data set that comprises two groups of time series with known group labels. Then compute the time-frequency feature (the energy) from the training data set, and use a best-tree method to maximize the discrepancy of this feature between the two groups. Finally, treat the testing data set, whose group labels are unknown, as validation data, and use a discriminant statistic to classify the validation data into one of the groups. The classification method is illustrated via an electroencephalographic dataset and the Ericsson B transaction time dataset. The proposed classification method performs better for integer-valued time series in terms of classification error rates, in both simulations and real-life applications.
- Published: 2019
14. Efficient inference for nonlinear state space models: An automatic sample size selection rule
- Authors: Ngai Hang Chan and Jing Cheng
- Subjects: Statistics and Probability; Applied Mathematics; Computational Mathematics; Stochastic volatility; Nonlinear state space models; Sample size determination; Renewal theory
- Abstract:
This paper studies the maximum likelihood estimation of nonlinear state space models. A particle Markov chain Monte Carlo method is introduced to implement the Monte Carlo expectation maximization algorithm for more accurate and robust estimation. Under this framework, an automated sample size selection criterion is constructed via renewal theory; the criterion increases the sample size when the relative likelihood indicates that successive parameter estimates are close to each other. The proposed methodology is applied to the stochastic volatility model and another nonlinear state space model for illustration, where the results show better estimation performance.
- Published: 2019
15. Bartlett correction of frequency domain empirical likelihood for time series with unknown innovation variance
- Authors: Kun Chen, Chun Yip Yau, and Ngai Hang Chan
- Subjects: Statistics and Probability; Empirical likelihood; Frequency domain; Coverage probability; Confidence region
- Abstract:
The Bartlett correction is a desirable feature of likelihood inference, as it yields confidence regions for parameters with improved coverage probability. This study examines the Bartlett correction for the frequency domain empirical likelihood (FDEL), based on the Whittle likelihood of linear time series models. Nordman and Lahiri (Ann Stat 34:3019–3050, 2006) showed that the FDEL does not have an ordinary Chi-squared limit when the innovation is non-Gaussian with unknown variance, which restricts the use of FDEL inference in time series. We show that, by profiling the innovation variance out of the Whittle likelihood function, the FDEL is Chi-squared-distributed and Bartlett correctable. In particular, the order of the coverage error of the confidence region can be reduced from $O(n^{-1})$ to $O(n^{-2})$.
- Published: 2019
16. On Bartlett correction of empirical likelihood for regularly spaced spatial data
- Authors: Chun Yip Yau, Kun Chen, Ngai Hang Chan, and Man Wang
- Subjects: Statistics and Probability; Empirical likelihood; Edgeworth series; Covariance function; Spatial analysis; Statistics, Probability and Uncertainty
- Abstract:
Bartlett correction constitutes one of the attractive features of empirical likelihood because it enables the construction of confidence regions for parameters with improved coverage probabilities. We study the Bartlett correction of spatial frequency domain empirical likelihood (SFDEL) based on general spectral estimating functions for regularly spaced spatial data. This general formulation can be applied to testing and estimation problems in spatial analysis, for example testing covariance isotropy, testing covariance separability, as well as estimating the parameters of spatial covariance models. We show that the SFDEL is Bartlett correctable. In particular, the improvement in coverage accuracies of the Bartlett-corrected confidence regions depends on the underlying spatial structures. The Canadian Journal of Statistics 47: 455–472; 2019 © 2019 Statistical Society of Canada
- Published: 2019
17. Subgroup analysis of zero-inflated Poisson regression model with applications to insurance data
- Authors: Rui Huang, Chun Yip Yau, Ngai Hang Chan, and Kun Chen
- Subjects: Statistics and Probability; Economics and Econometrics; Subgroup analysis; Zero-inflated model; Poisson regression; Statistics, Probability and Uncertainty
- Abstract:
Customized personal rate offering is of growing importance in the insurance industry. To achieve this, an important step is to identify subgroups of insureds from the corresponding heterogeneous claim frequency data. In this paper, a penalized Poisson regression approach for subgroup analysis in claim frequency data is proposed. Subjects are assumed to follow a zero-inflated Poisson regression model with group-specific intercepts, which capture group characteristics of claim frequency. A penalized likelihood function is derived and optimized to identify the group-specific intercepts and effects of individual covariates. To handle the challenges arising from the optimization of the penalized likelihood function, an alternating direction method of multipliers algorithm is developed and its convergence is established. Simulation studies and real applications are provided for illustrations.
- Published: 2019
18. Portmanteau-type tests for unit-root and cointegration
- Authors: Rongmao Zhang and Ngai Hang Chan
- Subjects: Economics and Econometrics; Applied Mathematics; Cointegration; Portmanteau test; Unit root; Autoregressive model
- Abstract:
This paper proposes a new portmanteau-type statistic that combines several lags of the sample autocorrelations to test for the presence of a unit root in an autoregressive model. The proposed method is nonparametric in nature, model free, and easy to implement. It avoids modeling the fitted residuals and does not require estimation of nuisance parameters, as is commonly done in the augmented Dickey–Fuller or Phillips–Perron procedures. Asymptotic properties of the test are established under general stationarity conditions on the noise. Finite sample studies are also reported to illustrate the superior power of the proposed method. Applications to testing for cointegration are also given.
- Published: 2018
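The flavor of a portmanteau-type unit-root statistic can be illustrated by summing the first few sample autocorrelations: under a unit root each sample autocorrelation stays near one, so the sum is close to the number of lags, while for a stationary series the autocorrelations decay and the sum is small. The statistic below is a toy stand-in with our own choices throughout, not the paper's test.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_acf(x, k):
    """Lag-k sample autocorrelation."""
    x = x - x.mean()
    return float(np.dot(x[:-k], x[k:]) / np.dot(x, x))

def toy_portmanteau(x, K=10):
    """Sum of the first K sample autocorrelations (illustrative only)."""
    return sum(sample_acf(x, k) for k in range(1, K + 1))

n = 1000
e = rng.normal(size=n)
rw = np.cumsum(e)            # unit-root series: a random walk
ar = np.empty(n)             # stationary AR(1) with phi = 0.5
ar[0] = e[0]
for t in range(1, n):
    ar[t] = 0.5 * ar[t - 1] + e[t]

s_rw, s_ar = toy_portmanteau(rw), toy_portmanteau(ar)
print(round(s_rw, 2), round(s_ar, 2))  # large for the random walk, small for AR(1)
```

No residual modeling or nuisance-parameter estimation enters the computation, which is the practical appeal the abstract highlights.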
19. Self-Normalized Sequential Change-point Detection
- Authors: Ngai Hang Chan, Wai Leong Ng, and Chun Yip Yau
- Subjects: Statistics and Probability; Change-point detection; Self-normalization; Statistics, Probability and Uncertainty
- Published: 2021
20. Lasso-based Variable Selection of ARMA Models
- Authors: Shiqing Ling, Chun Yip Yau, and Ngai Hang Chan
- Subjects: Statistics and Probability; Lasso; Feature selection; ARMA models; Statistics, Probability and Uncertainty
- Published: 2020
21. Modeling eBay price using stochastic differential equations
- Authors: Ngai Hang Chan, Wei Wei Liu, and Yan Liu
- Subjects: Strategy and Management; Management Science and Operations Research; Online auction; Stochastic differential equation; Statistics, Probability and Uncertainty
- Published: 2018
22. Mildly explosive autoregression with mixing innovations
- Authors: Haejune Oh, Sangyeol Lee, and Ngai Hang Chan
- Subjects: Statistics and Probability; Autoregressive model; Cauchy distribution; Mixing; Limit distribution
- Abstract:
In this paper, the limit distribution of the least squares estimator for mildly explosive autoregressive models with strong mixing innovations is established, which is shown to be Cauchy as in the iid case. The result is applied to identify the onset and the end of an explosive period of an econometric time series. Simulations and data analysis are also conducted to demonstrate the usefulness of the result.
- Published: 2018
23. On the Estimation of Locally Stationary Long-Memory Processes
- Authors: Wilfredo Omar Palma Manríquez and Ngai Hang Chan
- Subjects: Statistics and Probability; Long memory; Estimation; Statistics, Probability and Uncertainty
- Published: 2019
24. Short-Term Stock Price Prediction Based on Limit Order Book Dynamics
- Authors: Yang An and Ngai Hang Chan
- Subjects: Strategy and Management; Management Science and Operations Research; Empirical likelihood; Compound Poisson process; Conditional probability distribution; Statistics, Probability and Uncertainty
- Abstract:
Interaction of capital market participants is a complicated dynamic process. A stochastic model is proposed to describe the dynamics and to predict short-term stock price behaviors. Independent compound Poisson processes are introduced to describe the occurrences of market orders, limit orders, and cancellations of limit orders, respectively. Based on high-frequency observations of the limit order book, the maximum empirical likelihood estimator (MELE) is applied to estimate the parameters of the compound Poisson processes. Moreover, an analytical formula is derived to compute the probability distribution of the first-passage time of a compound Poisson process. Based on this formula, the conditional probability of a price increase and the conditional distribution of the duration until the first change in mid-price are obtained. A novel approach to short-term stock price prediction is proposed, and the methodology works reasonably well in the data analysis. Copyright © 2016 John Wiley & Sons, Ltd.
- Published: 2016
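The first-passage quantity in the abstract can also be approximated by plain Monte Carlo when no analytical formula is at hand. The sketch below simulates a compound Poisson process with rate 2 and Exp(1) jump sizes (all parameter choices ours, purely illustrative) and estimates the probability of crossing a level by given times.

```python
import math
import random

random.seed(7)

def first_passage_time(lam, level, horizon):
    """First time a compound Poisson process (rate lam, Exp(1) jump
    sizes) crosses `level`; math.inf if it does not within `horizon`."""
    t, s = 0.0, 0.0
    while True:
        t += random.expovariate(lam)   # waiting time to the next jump
        if t >= horizon:
            return math.inf
        s += random.expovariate(1.0)   # jump size ~ Exp(1)
        if s >= level:
            return t

lam, level, horizon, n_sim = 2.0, 3.0, 10.0, 20000
times = [first_passage_time(lam, level, horizon) for _ in range(n_sim)]

def p_cross_by(t):
    """Monte Carlo estimate of P(first passage <= t)."""
    return sum(ft <= t for ft in times) / n_sim

probs = [p_cross_by(t) for t in (1.0, 2.0, 5.0)]
print([round(p, 3) for p in probs])  # increasing crossing probabilities
```

In the paper the analogous distribution feeds the conditional probability of a price increase; the analytical formula there replaces this brute-force simulation.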
25. Artifactual unit root behavior of Value at risk (VaR)
- Authors: Tony Sit and Ngai Hang Chan
- Subjects: Statistics and Probability; Autoregressive conditional heteroskedasticity; Financial risk management; Random walk; Unit root; Value at risk; Quantile; Statistics, Probability and Uncertainty
- Abstract:
An effective model for time-varying quantiles of a time series is of considerable practical importance across various disciplines. In particular, in financial risk management, computation of Value-at-Risk (VaR), one of the most popular risk measures, involves knowledge of the quantiles of portfolio returns. This paper examines the random walk behavior of VaRs constructed under the two most common approaches, viz. historical simulation and the parametric approach using GARCH models. We find that sequences of historical VaRs appear to follow a unit root model, which can be an artifact under some settings, whereas the counterpart constructed via the parametric approach does not follow a random walk model by default.
- Published: 2016
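Why historical VaR looks like a unit-root process is easy to see in simulation: the rolling empirical quantile changes only when an observation enters or leaves the lower tail of the window, so the VaR path is a step function that is flat most of the time. A minimal sketch, with the return distribution, window length, and confidence level being our own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Historical-simulation VaR: the 1% empirical quantile of the last
# `window` returns, recomputed each day.  Consecutive windows share all
# but one observation, so the VaR path moves only when an observation
# enters or leaves the lower tail -- hence the unit-root-like look.
returns = rng.standard_t(df=5, size=2000)   # heavy-tailed daily returns
window, alpha = 250, 0.01
var_series = np.array([
    -np.quantile(returns[t - window:t], alpha)
    for t in range(window, len(returns))
])

frac_flat = 1 - np.count_nonzero(np.diff(var_series)) / (len(var_series) - 1)
print(round(frac_flat, 2))  # fraction of days on which the VaR does not move
```

A GARCH-based parametric VaR, by contrast, is recomputed from the fitted conditional variance and therefore moves every day, which is the contrast the paper formalizes.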
26. Factor Modelling for High-Dimensional Time Series: Inference and Model Selection
- Authors: Chun Yip Yau, Ye Lu, and Ngai Hang Chan
- Subjects: Statistics and Probability; Applied Mathematics; Model selection; Autocovariance; Rate of convergence; Time series; Statistics, Probability and Uncertainty
- Abstract:
Analysis of high-dimensional time series data is of increasing interest in many fields. This article studies high-dimensional time series from a dimension reduction perspective using factor modelling. Statistical inference is conducted via eigen-analysis of a certain non-negative definite matrix related to the autocovariance matrices of the time series, which is applicable for a fixed or increasing dimension. When the dimension goes to infinity, the rate of convergence and the limiting distributions of the estimated factors are established. Using these limiting distributions, a high-dimensional final prediction error criterion is proposed to select the number of factors. Asymptotic properties of the criterion are illustrated by simulation studies and real applications.
- Published: 2016
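The eigen-analysis described above can be sketched for a simulated factor model: form a non-negative definite matrix as a sum of products of lag-k sample autocovariance matrices and read the number of factors off the eigenvalue gap. The eigenvalue-ratio selection rule below is a common illustrative device, not necessarily the final prediction error criterion proposed in the article; all dimensions and parameters are our own choices.

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulate a p-dimensional series driven by r latent AR(1) factors.
n, p, r = 500, 20, 2
A = rng.normal(size=(p, r))                    # factor loadings
f = np.zeros((n, r))
for t in range(1, n):
    f[t] = 0.8 * f[t - 1] + rng.normal(size=r)
y = f @ A.T + 0.5 * rng.normal(size=(n, p))    # observed series

# Eigen-analysis of M = sum_k Gamma_k Gamma_k', where Gamma_k is the
# lag-k sample autocovariance matrix: the top eigenvectors span the
# loading space and the eigenvalue gap reveals the number of factors.
yc = y - y.mean(axis=0)
M = np.zeros((p, p))
for k in range(1, 6):
    G = yc[:-k].T @ yc[k:] / n
    M += G @ G.T
eigvals = np.sort(np.linalg.eigvalsh(M))[::-1]

ratios = eigvals[1:] / eigvals[:-1]            # eigenvalue-ratio rule
r_hat = int(np.argmin(ratios[:p // 2])) + 1
print(r_hat)  # estimated number of factors
```

Summing products of autocovariance matrices (rather than the covariance itself) makes the construction robust to serially uncorrelated idiosyncratic noise, whose lagged autocovariances vanish.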
27. Modeling and Forecasting Online Auction Prices: A Semiparametric Regression Analysis
- Authors: Wei Wei Liu and Ngai Hang Chan
- Subjects: Strategy and Management; Management Science and Operations Research; Online auction; Bidding; Semiparametric regression; Statistics, Probability and Uncertainty
- Abstract:
Interest in online auctions has been growing in recent years. There is an extensive literature on this topic, and modeling the online auction price process constitutes one of the most active research areas. Most of the research, however, focuses only on modeling price curves, ignoring the bidding process. In this paper, a semiparametric regression model is proposed to model the online auction process. This model captures two main features of online auction data: changing arrival rates of bidding processes and changing dynamics of prices. A new inference procedure using B-splines is also established for parameter estimation. The proposed model is used to forecast the price of an online auction. The advantage of this approach is that the price can be forecast dynamically and the prediction can be updated according to newly arriving information. The model is applied to Xbox data with satisfactory forecasting properties. Copyright © 2016 John Wiley & Sons, Ltd.
- Published: 2016
28. Bartlett Correction of Empirical Likelihood for Non‐Gaussian Short‐Memory Time Series
- Authors: Kun Chen, Chun Yip Yau, and Ngai Hang Chan
- Subjects: Statistics and Probability; Applied Mathematics; Empirical likelihood; Edgeworth series; Gaussian process; Statistics, Probability and Uncertainty
- Abstract:
Bartlett correction, which improves the coverage accuracies of confidence regions, is one of the desirable features of empirical likelihood. For empirical likelihood with dependent data, previous studies on the Bartlett correction are mainly concerned with Gaussian processes. By establishing the validity of an Edgeworth expansion for the signed root empirical log-likelihood ratio statistic, we show that the Bartlett correction is applicable to empirical likelihood for short-memory time series with possibly non-Gaussian innovations. The Bartlett correction is established under the assumptions that the variance of the innovation is known and the mean of the underlying process is zero, for a single-parameter model. In particular, the order of the coverage errors of Bartlett-corrected confidence regions can be reduced from $O(n^{-1})$ to $O(n^{-2})$.
- Published: 2016
29. LASSO estimation of threshold autoregressive models
- Authors: Rongmao Zhang, Ngai Hang Chan, and Chun Yip Yau
- Subjects: Economics and Econometrics; Applied Mathematics; Autoregressive model; Lasso; Feature selection; Rate of convergence
- Abstract:
This paper develops a novel approach for estimating a threshold autoregressive (TAR) model with multiple regimes and establishes its large sample properties. By reframing the problem in a regression variable selection context, a least absolute shrinkage and selection operator (LASSO) procedure is proposed to estimate a TAR model with an unknown number of thresholds, where the computation can be performed efficiently. It is further shown that the number and the locations of the thresholds can be consistently estimated. A near-optimal convergence rate for the threshold parameters is also established. Simulation studies are conducted to assess the performance in finite samples. The results are illustrated with an application to the quarterly US real GNP data over the period 1947–2009.
- Published: 2015
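The reframing of threshold estimation as variable selection can be illustrated directly: build an indicator regressor for each candidate threshold and let a LASSO pick out the active ones. The sketch below uses a hand-rolled coordinate-descent LASSO and a single true threshold; the model, candidate grid, and penalty level are our own illustrative choices, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-regime SETAR: the intercept jumps by 2 when y[t-1] >= 1
# (the true threshold).
n = 600
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.3 * y[t - 1] + 2.0 * (y[t - 1] >= 1.0) + rng.normal()

lag, resp = y[:-1], y[1:]
cands = np.quantile(lag, np.linspace(0.1, 0.9, 17))  # candidate thresholds
X = np.column_stack([lag] + [(lag >= c).astype(float) for c in cands])
Xc = X - X.mean(axis=0)      # centring absorbs the global intercept
yc = resp - resp.mean()

def lasso_cd(X, y, lam, n_sweeps=200):
    """Plain coordinate-descent LASSO with soft-thresholding updates."""
    beta = np.zeros(X.shape[1])
    col_sq = (X ** 2).sum(axis=0)
    r = y.copy()
    for _ in range(n_sweeps):
        for j in range(X.shape[1]):
            r += X[:, j] * beta[j]           # put coordinate j back
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]           # take the update out again
    return beta

beta = lasso_cd(Xc, yc, lam=50.0)
ind_coef = beta[1:]                           # indicator coefficients
best_c = float(cands[np.argmax(np.abs(ind_coef))])
print(round(best_c, 2))  # should sit near the true threshold of 1
```

With correlated neighboring indicators the LASSO spreads weight over candidates near the true threshold; the paper's theory addresses exactly this localization and the consistency of the selected number of thresholds.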
30. MARKOWITZ PORTFOLIO AND THE BLUR OF HISTORY
- Authors: Ngai Hang Chan, Chi Tim Ng, and Yue Shi
- Subjects: Finance; General Economics, Econometrics and Finance; Arbitrage pricing theory; Portfolio
- Abstract:
It is shown in this paper that when the true mean return vector is replaced by the inferred mean vector obtained indirectly from a factor model and arbitrage pricing theory, the impact on the resulting optimal portfolio is insignificant. To achieve this goal, several assumptions are imposed: (i) the asset returns are generated from a factor model, (ii) the number of assets goes to infinity, and (iii) there are no asymptotic arbitrage opportunities. Issues related to the efficiency of the estimated optimal portfolio for high-frequency data are discussed. The portfolios constructed using the sample mean vector and using the inferred mean vector from arbitrage pricing theory are compared.
- Published: 2020
31. Nearly Unstable Processes: A Prediction Perspective
- Authors: Ching-Kang Ing, Ngai Hang Chan, and Rongmao Zhang
- Subjects: Statistics and Probability; Statistics, Probability and Uncertainty
- Published: 2018
32. Likelihood Inferences for High-Dimensional Factor Analysis of Time Series With Applications in Finance
- Authors: Ngai Hang Chan, Chi Tim Ng, and Chun Yip Yau
- Subjects: Statistics and Probability; Hessian matrix; Asymptotic distribution; Matrix decomposition; Delta method; Factor analysis; Time series; Statistics, Probability and Uncertainty
- Abstract:
This article investigates likelihood inferences for high-dimensional factor analysis of time series data. We develop a matrix decomposition technique to obtain expressions for the likelihood function and its derivatives. With such expressions, the traditional delta method that relies heavily on the score function and Hessian matrix can be extended to high-dimensional cases. We establish asymptotic theories, including consistency and asymptotic normality. Moreover, fast computational algorithms are developed for estimation. Applications to high-dimensional stock price data and portfolio analysis are discussed. The technical proofs of the asymptotic results and the computer codes are available online.
- Published
- 2015
33. Residual-based test for fractional cointegration
- Author
-
Man Wang, Bin Wang, and Ngai Hang Chan
- Subjects
Economics and Econometrics ,Statistics::Applications ,Cointegration ,Test procedures ,Monte Carlo method ,Residual ,Statistics::Computation ,Normal distribution ,Computer Science::Computational Engineering, Finance, and Science ,Economics ,Test statistic ,Econometrics ,Statistics::Methodology ,Null hypothesis ,Finance - Abstract
By allowing deviations from equilibrium to follow a fractionally integrated process, the notion of fractional cointegration encompasses a wide range of mean-reverting behaviors. For fractional cointegration, asymptotic theories have been extensively studied, and numerous empirical studies have been conducted in finance and economics. But as far as testing for fractional cointegration is concerned, most testing procedures impose restrictions on the integration orders of the observed time series or the cointegrating error, and some tests involve the determination of a bandwidth. In this paper, a general fractional cointegration model in which both the observed series and the cointegrating error are fractional processes is considered, and a residual-based testing procedure for fractional cointegration is proposed. Under some regularity conditions, the test statistic has an asymptotic standard normal distribution under the null hypothesis of no fractional cointegration and diverges under the alternatives. This test procedure is easy to implement and works well in finite samples, as reported in a Monte Carlo experiment.
- Published
- 2015
34. Forecasting Online Auctions via Self-Exciting Point Processes
- Author
-
Ngai Hang Chan, Zehang Richard Li, and Chun Yip Yau
- Subjects
Stylized fact ,Operations research ,Financial economics ,Strategy and Management ,TheoryofComputation_GENERAL ,Prediction interval ,Functional data analysis ,Management Science and Operations Research ,Bidding ,Point process ,Computer Science Applications ,Online auction ,Terminal (electronics) ,Modeling and Simulation ,Economics ,Common value auction ,Statistics, Probability and Uncertainty - Abstract
Modeling online auction prices is a popular research topic among statisticians and marketing analysts. Recent research mainly focuses on two directions: one is the functional data analysis (FDA) approach, in which the price–time relationship is modeled by a smooth curve, and the other is the point process approach, which directly models the arrival process of bidders and bids. In this paper, a novel model for the bid arrival process using a self-exciting point process (SEPP) is proposed and applied to forecast auction prices. The FDA and point process approaches are linked together by using functional data analysis techniques to describe the intensity of the bid arrival point process. Using the SEPP to model the bid arrival process, many stylized facts in online auction data can be captured. We also develop a simulation-based forecasting procedure using the estimated SEPP intensity and historical bidding increments. In particular, prediction intervals for the terminal price of an item can be constructed. Applications to eBay auction data of Harry Potter books and Microsoft Xbox show that the SEPP model provides more accurate and more informative forecasting results than traditional methods. Copyright © 2014 John Wiley & Sons, Ltd.
- Published
- 2014
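The self-exciting (Hawkes) process at the core of the entry above can be simulated by Ogata's thinning algorithm. A minimal sketch with an exponential kernel and illustrative parameters (`mu`, `alpha`, `beta` are assumptions, not values fitted to auction data):

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, horizon, rng):
    """Simulate a self-exciting (Hawkes) point process with intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta (t - t_i)) by Ogata thinning."""
    t, events = 0.0, []
    while True:
        # Between events the intensity only decays, so its current value is an upper bound.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)
        if t > horizon:
            return np.array(events)
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept with probability lambda(t)/lam_bar
            events.append(t)

rng = np.random.default_rng(1)
bids = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50.0, rng=rng)
print(len(bids))
```

With `alpha / beta < 1` the process is stationary; each simulated event raises the intensity, mimicking the bid-clustering stylized fact the paper exploits.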
35. Group LASSO for Structural Break Time Series
- Author
-
Rongmao Zhang, Ngai Hang Chan, and Chun Yip Yau
- Subjects
Statistics and Probability ,Discrete mathematics ,Measurable function ,Series (mathematics) ,Autoregressive model ,Lasso (statistics) ,Statistics ,Feature selection ,Context (language use) ,White noise ,Statistics, Probability and Uncertainty ,Unit (ring theory) ,Mathematics - Abstract
Consider a structural break autoregressive (SBAR) process in which the autoregressive parameters are constant within each of the m + 1 regimes j = 1, …, m + 1 delimited by change-points {t1, …, tm}, where 1 = t0 < t1 < ⋅⋅⋅ < tm + 1 = n + 1, σ( · ) is a measurable function, and {ϵt} are white noise with unit variance. In practice, the number of change-points m is usually assumed to be known and small, because a large m would involve a huge computational burden for parameter estimation. By reformulating the problem in a variable selection context, the group least absolute shrinkage and selection operator (LASSO) is proposed to estimate an SBAR model when m is unknown. It is shown that both m and the locations of the change-points {t1, …, tm} can be consistently estimated from the data, and the computation can be performed efficiently. An improved practical version that incorporates group LASSO and the stepwise regression variable selection technique is discussed. Simulation studies are conducted to assess the finite sample performance. Supplementary materials for this article are available online.
- Published
- 2014
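The change-point problem above can be illustrated on simulated data. The sketch below is not the paper's group LASSO: it uses a simple exhaustive least-squares scan for a single break in an AR(1) coefficient (all parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n, true_cp = 500, 250

# Piecewise AR(1): coefficient 0.2 before the break, -0.6 after (illustrative values).
y = np.zeros(n)
for t in range(1, n):
    phi = 0.2 if t < true_cp else -0.6
    y[t] = phi * y[t - 1] + rng.standard_normal()

def ar1_rss(seg):
    """Residual sum of squares of a least-squares AR(1) fit on one segment."""
    x, z = seg[:-1], seg[1:]
    phi = np.dot(x, z) / np.dot(x, x)
    return np.sum((z - phi * x) ** 2)

# Scan candidate change-points and keep the split minimizing the total RSS.
candidates = range(30, n - 30)
rss = [ar1_rss(y[:k]) + ar1_rss(y[k:]) for k in candidates]
cp_hat = list(candidates)[int(np.argmin(rss))]
print(cp_hat)
```

The exhaustive scan is O(n) model fits for one break and blows up combinatorially for unknown m, which is precisely the computational burden that motivates the paper's group LASSO reformulation.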
36. EMPIRICAL LIKELIHOOD TEST FOR CAUSALITY OF BIVARIATE AR(1) PROCESSES
- Author
-
Liang Peng, Deyuan Li, and Ngai Hang Chan
- Subjects
Causality (physics) ,Economics and Econometrics ,Empirical likelihood ,Autoregressive model ,Statistics ,Econometrics ,Test statistic ,A priori and a posteriori ,Probability and statistics ,Limit (mathematics) ,Bivariate analysis ,Social Sciences (miscellaneous) ,Mathematics - Abstract
Testing for causality is of critical importance for many econometric applications. For bivariate AR(1) processes, the limit distributions of causality tests based on least squares estimation depend on the presence of nonstationary processes. When nonstationary processes are present, the limit distributions of such tests are usually very complicated, and the full-sample bootstrap method becomes inconsistent as pointed out in Choi (2005, Statistics and Probability Letters 75, 39–48). In this paper, a profile empirical likelihood method is proposed to test for causality. The proposed test statistic is robust against the presence of nonstationary processes in the sense that one does not have to determine the existence of nonstationary processes a priori. Simulation studies confirm that the proposed test statistic works well.
- Published
- 2013
37. Limit theory of quadratic forms of long-memory linear processes with heavy-tailed GARCH innovations
- Author
-
Rongmao Zhang and Ngai Hang Chan
- Subjects
Statistics and Probability ,Quadratic growth ,Discrete mathematics ,Numerical Analysis ,Sequence ,Mathematical optimization ,Autoregressive conditional heteroskedasticity ,Moving-average model ,Normal distribution ,Quadratic form ,Long memory ,Function composition ,Statistics, Probability and Uncertainty ,Mathematics - Abstract
Let X_t = Σ_{j=0}^∞ c_j ε_{t−j} be a moving average process with GARCH (1, 1) innovations {ε_t}. In this paper, the asymptotic behavior of the quadratic form Q_n = Σ_{t=1}^n Σ_{s=1}^n b(t − s) X_t X_s is derived when the innovation {ε_t} is a long-memory and heavy-tailed process with tail index α, where {b(i)} is a sequence of constants. In particular, it is shown that when 1 < α < 4, Q_n converges in distribution to a stable law, whereas when α ≥ 4, Q_n has an asymptotic normal distribution. These results not only shed light on the singular behavior of the quadratic forms when both long-memory and heavy-tailed properties are present, but also have applications in the inference for general linear processes driven by heavy-tailed GARCH innovations.
- Published
- 2013
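The objects in the abstract above are easy to construct numerically. A minimal sketch, assuming illustrative GARCH(1,1) parameters, a truncated linear filter, and a geometric kernel b(k) (all choices are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400

# GARCH(1,1) innovations: eps_t = sigma_t z_t, sigma_t^2 = w + a*eps_{t-1}^2 + b*sigma_{t-1}^2.
w, a, b = 0.1, 0.1, 0.8          # illustrative parameters (a + b < 1: covariance stationary)
eps = np.zeros(n)
sig2 = w / (1 - a - b)
for t in range(n):
    z = rng.standard_normal()
    eps[t] = np.sqrt(sig2) * z
    sig2 = w + a * eps[t] ** 2 + b * sig2

# Truncated linear process X_t = sum_j c_j eps_{t-j}.
c = 0.7 ** np.arange(20)
x = np.convolve(eps, c)[:n]

# Quadratic form Q_n = sum_{t,s} b(t-s) X_t X_s with kernel b(k) = 0.5^{|k|}.
k = np.arange(n)
B = 0.5 ** np.abs(k[:, None] - k[None, :])
Q_n = x @ B @ x
print(Q_n)
```

Since the geometric kernel matrix is positive definite, Q_n is positive here; the paper's limit theory concerns how such forms scale when the kernel decays slowly (long memory) and the innovations are heavy-tailed.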
38. Unified asymptotic theory for nearly unstable AR(p) processes
- Author
-
Boris Buchmann and Ngai Hang Chan
- Subjects
Statistics and Probability ,Fractional Brownian motion ,Applied Mathematics ,Least squares ,symbols.namesake ,Fourier transform ,Autoregressive model ,Unit root test ,Modeling and Simulation ,Calculus ,symbols ,Applied mathematics ,Unit root ,Autoregressive integrated moving average ,Martingale (probability theory) ,Mathematics - Abstract
A unified asymptotic theory for nearly unstable higher order autoregressive processes and their least squares estimates is established. A novel version of Jordan’s canonical decomposition with perturbations together with a suitable plug-in principle is proposed to develop the underlying theories. Assumptions are stated in terms of the domain of attraction of partial Fourier transforms. The machinery is applied to recapture some of the classical results with the driving noise being martingale differences. Further, we show how to extend the results to higher order fractional ARIMA models in nearly unstable settings, thereby offering a comprehensive theory to analyse nearly unstable time series.
- Published
- 2013
39. Nonlinear error correction model and multiple-threshold cointegration
- Author
-
Man Wang, Chun Yip Yau, and Ngai Hang Chan
- Subjects
Statistics and Probability ,Error correction model ,Nonlinear system ,Cointegration ,Econometrics ,Statistics, Probability and Uncertainty ,Mathematics - Published
- 2016
40. Structural model of credit migration
- Author
-
Hoi Ying Wong, Jing Zhao, and Ngai Hang Chan
- Subjects
Statistics and Probability ,Actuarial science ,Capital structure ,Computer science ,business.industry ,Applied Mathematics ,media_common.quotation_subject ,Risk perception ,Computational Mathematics ,Computational Theory and Mathematics ,Perception ,Feature (machine learning) ,Econometrics ,Credit valuation adjustment ,Empirical evidence ,business ,Risk management ,Credit risk ,media_common - Abstract
Credit migrations constitute the building blocks of modern risk management. A firm-specific structural model of credit migration that incorporates the firm's capital structure and the risk perception of rating agencies is proposed. The proposed model employs the notion of distance-to-default, which quantifies default probability. The properties of Brownian excursions play an essential role in the analysis. The proposed model not only allows the derivation of closed-form credit transition probability, but also provides plausible explanations for certain empirical evidence, such as the default probability overlaps in ratings and the slow-to-respond feature of rating agencies. The proposed model is calibrated through simulations and applied to empirical data, which show rating agencies' risk perceptions to be significant. The calibrated model allows calculation of the firm-specific transition probabilities of rated companies.
- Published
- 2012
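The notion of distance-to-default invoked in the entry above has a standard Merton-style formalization, sketched here with purely illustrative balance-sheet numbers (this is a generic textbook formula, not the paper's excursion-based model):

```python
import math

# Merton-style distance-to-default: DD = (ln(V/D) + (mu - sigma^2/2) T) / (sigma sqrt(T)).
V, D = 120.0, 100.0        # asset value and face value of debt (illustrative)
mu, sigma, T = 0.05, 0.25, 1.0

dd = (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))

# Under a lognormal asset model, P(default) = P(V_T < D) = Phi(-DD).
pd = 0.5 * math.erfc(dd / math.sqrt(2))
print(dd, pd)
```

A larger distance-to-default maps to a smaller default probability; the paper builds credit-migration probabilities on top of this quantity using Brownian excursion properties.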
41. Higher-order asymptotics in finance
- Author
-
Sheung Chi Phillip Yam and Ngai Hang Chan
- Subjects
Statistics and Probability ,Finance ,Asymptotic analysis ,Laplace transform ,business.industry ,Cornish–Fisher expansion ,Dividend ,business ,Mathematical economics ,Value at risk ,Saddle ,Integral method ,Mathematics ,Central limit theorem - Abstract
A primary motivation of higher-order asymptotic statistical analysis is to improve the first-order limiting result given by the celebrated Central Limit Theorem, in the sense that a better approximation with higher-order accuracy can be attained. In this article, several important tools in asymptotic analysis for obtaining higher-order approximations, including Edgeworth expansions, saddle-point approximations, and the Laplace integral method, are revisited together with an introduction to some of their applications in finance. A new result on bounds for the difference between American and European calls on a small-dividend-paying stock is also provided. WIREs Comput Stat 2012, 4:571-587. doi: 10.1002/wics.1234
- Published
- 2012
42. Non-stationary autoregressive processes with infinite variance
- Author
-
Rongmao Zhang and Ngai Hang Chan
- Subjects
Statistics and Probability ,Sequence ,Series (mathematics) ,Applied Mathematics ,Mathematical analysis ,Estimator ,Domain (mathematical analysis) ,Stable process ,Unit circle ,Autoregressive model ,Statistics ,Statistics, Probability and Uncertainty ,Random variable ,Mathematics - Abstract
Consider an AR(p) process Y_t = φ_1 Y_{t−1} + ⋯ + φ_p Y_{t−p} + ε_t, where {ε_t} is a sequence of i.i.d. random variables lying in the domain of attraction of a stable law with index 0 < α < 2.
- Published
- 2012
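The least-squares estimator in the infinite-variance AR setting above can be illustrated by simulation. A minimal sketch using symmetric Pareto-tailed noise with tail index 1.5 (which lies in the domain of attraction of a 1.5-stable law); the AR(1) specialization and all parameter values are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n, phi = 5000, 0.5

# Symmetric Pareto-tailed noise with tail index alpha = 1.5 (infinite variance).
alpha = 1.5
u = rng.uniform(size=n)
eps = rng.choice([-1.0, 1.0], size=n) * u ** (-1.0 / alpha)

y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

# Least-squares estimate of the AR coefficient; with infinite-variance noise it
# remains consistent, and in fact converges faster than in the finite-variance case.
phi_hat = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
print(phi_hat)
```

The nonstandard feature studied in such papers is the limit distribution of the (appropriately normalized) estimation error, which is no longer Gaussian when the noise has infinite variance.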
43. Least squares estimators for nearly unstable processes for functionals of long-memory noises
- Author
-
Wei Wei Liu and Ngai Hang Chan
- Subjects
Statistics and Probability ,Economics and Econometrics ,Hermite polynomials ,Limit distribution ,Process (computing) ,Estimator ,Asymptotic theory (statistics) ,Least squares ,Moving average ,Long memory ,Applied mathematics ,Statistics, Probability and Uncertainty ,Finance ,Mathematics - Abstract
This paper investigates the asymptotic theory of the least squares estimators (LSE) for a long-memory nearly unstable model when the innovation sequences are functionals of moving averages. It is shown that the limit distribution of the LSE is a functional of the Hermite Ornstein–Uhlenbeck process. This result not only generalizes the result of Buchmann and Chan [Ann. Statist. 35 (2007), 2001–2017], but also that of Wu [Economet. Theory 22 (2006), 1–14].
- Published
- 2012
44. TOWARD A UNIFIED INTERVAL ESTIMATION OF AUTOREGRESSIONS
- Author
-
Ngai Hang Chan, Liang Peng, and Deyuan Li
- Subjects
Economics and Econometrics ,Empirical likelihood ,Autoregressive model ,Interval estimation ,Statistics ,Unit root ,Interval (mathematics) ,Limit (mathematics) ,Least squares ,Social Sciences (miscellaneous) ,Confidence interval ,Mathematics - Abstract
An empirical likelihood–based confidence interval is proposed for interval estimations of the autoregressive coefficient of a first-order autoregressive model via weighted score equations. Although the proposed weighted estimate is less efficient than the usual least squares estimate, its asymptotic limit is always normal without assuming stationarity of the process. Unlike the bootstrap method or the least squares procedure, the proposed empirical likelihood–based confidence interval is applicable regardless of whether the underlying autoregressive process is stationary, unit root, near-integrated, or even explosive, thereby providing a unified approach for interval estimation of an AR(1) model to encompass all situations. Finite-sample simulation studies confirm the effectiveness of the proposed method.
- Published
- 2011
45. Maximum likelihood estimation for nearly non-stationary stable autoregressive processes
- Author
-
Rongmao Zhang and Ngai Hang Chan
- Subjects
Statistics and Probability ,Characteristic function (probability theory) ,Applied Mathematics ,Monte Carlo method ,Least squares ,Stable process ,symbols.namesake ,Autoregressive model ,Statistics ,symbols ,Applied mathematics ,Statistics, Probability and Uncertainty ,Constant (mathematics) ,Random variable ,Gaussian process ,Mathematics - Abstract
The maximum likelihood estimate (MLE) of the autoregressive coefficient of a near-unit-root autoregressive process Y_t = ρ_n Y_{t−1} + ε_t with α-stable noise {ε_t} is studied in this paper. Herein ρ_n = 1 − γ/n, where γ is a constant, Y_0 is a fixed random variable, and ε_t is an α-stable random variable with characteristic function φ(t, θ) for some parameter θ. It is shown that when 0 < α < 1, or when α > 1 and Eε_1 = 0, the limit distributions of the MLE of ρ_n and θ are mixtures of a stable process and Gaussian processes. On the other hand, when α > 1 and Eε_1 ≠ 0, the limit distributions of the MLE of ρ_n and θ are normal. A Monte Carlo simulation reveals that the MLE performs better than the usual least squares procedures, particularly when the tail index α is less than 1.
- Published
- 2011
46. Interval estimation of the tail index of a GARCH(1,1) model
- Author
-
Rongmao Zhang, Ngai Hang Chan, and Liang Peng
- Subjects
Statistics and Probability ,Moment (mathematics) ,Delta method ,Empirical likelihood ,Autoregressive conditional heteroskedasticity ,Interval estimation ,Statistics ,Estimator ,Sample (statistics) ,Statistics, Probability and Uncertainty ,Confidence interval ,Mathematics - Abstract
It is known that the tail index of a GARCH model is determined by a moment equation, which involves the underlying unknown parameters of the model. A tail index estimator can therefore be constructed by solving the sample moment equation with the unknown parameters replaced by their quasi-maximum likelihood estimates (QMLE). To construct a confidence interval for the tail index, one needs to estimate the non-trivial asymptotic variance of the QMLE. In this paper, an empirical likelihood method is proposed for interval estimation of the tail index. One advantage of the proposed method is that interval estimation can still be achieved without having to estimate the complicated asymptotic variance. A simulation study confirms the advantage of the proposed method.
- Published
- 2011
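A standard nonparametric alternative to the moment-equation approach above is the Hill estimator of the tail index. A minimal sketch on an exact Pareto sample (a stand-in for GARCH marginals, whose tails are also power laws; the sample sizes and the choice k = 500 are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000

# Exact Pareto sample with tail index alpha = 3: P(X > x) = x^{-3} for x >= 1.
alpha = 3.0
x = rng.uniform(size=n) ** (-1.0 / alpha)

def hill(x, k):
    """Hill estimator of the tail index from the k largest order statistics."""
    xs = np.sort(x)
    tail, thresh = xs[-k:], xs[-k - 1]
    return 1.0 / np.mean(np.log(tail / thresh))

alpha_hat = hill(x, k=500)
print(alpha_hat)
```

The Hill estimator requires choosing the number of upper order statistics k, a bias-variance trade-off; the moment-equation approach in the paper avoids this at the cost of estimating the GARCH parameters.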
47. Quantile inference for heteroscedastic regression models
- Author
-
Rongmao Zhang and Ngai Hang Chan
- Subjects
Statistics and Probability ,Statistics::Theory ,Heteroscedasticity ,Applied Mathematics ,Estimator ,Conditional probability distribution ,Quantile function ,Quantile regression ,Empirical likelihood ,Statistics ,Statistics::Methodology ,Statistics, Probability and Uncertainty ,Mathematics ,Variance function ,Quantile - Abstract
Consider the nonparametric heteroscedastic regression model Y = m ( X ) + σ ( X ) ɛ , where m ( · ) is an unknown conditional mean function and σ ( · ) is an unknown conditional scale function. In this paper, the limit distribution of the quantile estimate of the scale function σ ( X ) is derived. Since the limit distribution depends on the unknown density of the errors, an empirical likelihood ratio statistic based on the quantile estimator is proposed. This statistic is used to construct confidence intervals for the variance function. Under certain regularity conditions, it is shown that the quantile estimate of the scale function converges to a Brownian motion and the empirical likelihood ratio statistic converges to a chi-squared random variable. Simulation results demonstrate the superiority of the proposed method over the least squares procedure when the underlying errors have heavy tails.
- Published
- 2011
48. A note on asymptotic inference for FIGARCH($p, d, q$) models
- Author
-
Chi Tim Ng and Ngai Hang Chan
- Subjects
Statistics and Probability ,Discrete mathematics ,Mathematical optimization ,Consistency (statistics) ,Applied Mathematics ,Autoregressive conditional heteroskedasticity ,Asymptotic distribution ,Estimator ,Inference ,Parameter space ,Likelihood function ,Stationary point ,Mathematics - Abstract
Parameter estimation for a FIGARCH(p, d, q) model is studied in this paper. By constructing a compact parameter space Θ satisfying the non-negativity constraints for the FIGARCH model, it is shown that the results of Robinson and Zaffaroni (2006) can be applied to establish the strong consistency and asymptotic normality of the quasi-maximum likelihood (QML) estimator of the FIGARCH model. AMS 2000 subject classifications: Primary 62F12, 62E20; secondary 91B84. Keywords and phrases: Asymptotic normality, Consistency, Fractionally-integrated GARCH model, Non-negativity, Quasi-maximum likelihood estimator. To apply the results of Robinson and Zaffaroni (2006), a compact searching region Θ satisfying the assumption NN has to be constructed. In this note, we show under certain conditions that the assumptions A-H of Robinson and Zaffaroni (2006) are fulfilled for θ ∈ Θ. In this way, the asymptotic behavior of the QMLE of the FIGARCH models within Θ can then be directly established. It should be noted that due to the difficulty of explicitly expressing Θ, in practice, we have to search for the stationary points of the quasi-log likelihood function globally. The link between these stationary points and the QMLE in Θ is furnished in Proposition 2. Throughout this paper, the assumptions A to H of Robinson and Zaffaroni (2006) are denoted by RZ-A to RZ-H.
- Published
- 2011
49. Residual empirical processes for nearly unstable long-memory time series
- Author
-
Wei Wei Liu and Ngai Hang Chan
- Subjects
Statistics and Probability ,Fractional Brownian motion ,Series (mathematics) ,Autoregressive model ,Econometrics ,Unit root ,Statistical physics ,Residual ,Bayesian inference ,Empirical process ,Statistical hypothesis testing ,Mathematics - Abstract
This paper studies the goodness-of-fit test of the residual empirical process of a nearly unstable long-memory time series. Chan and Ling (2008) showed that the usual limit distribution of the Kolmogorov–Smirnov test statistics does not hold for an unstable autoregressive model. A key question of interest is what happens when this model has a near unit root, that is, when it is nearly unstable. In this paper, it is established that the statistics proposed by Chan and Ling can be generalized to encompass nearly unstable long-memory models. In particular, the limit distribution is expressed as a functional of an Ornstein–Uhlenbeck process that is driven by a fractional Brownian motion. Simulation studies demonstrate that the limit distribution of the statistic possesses desirable finite sample properties and power.
- Published
- 2010
50. EMPIRICAL-LIKELIHOOD-BASED CONFIDENCE INTERVALS FOR CONDITIONAL VARIANCE IN HETEROSKEDASTIC REGRESSION MODELS
- Author
-
Dabao Zhang, Liang Peng, and Ngai Hang Chan
- Subjects
One-way analysis of variance ,Economics and Econometrics ,Heteroscedasticity ,Empirical likelihood ,Autoregressive conditional heteroskedasticity ,Statistics ,Econometrics ,Variance reduction ,Law of total variance ,Conditional variance ,Social Sciences (miscellaneous) ,Variance function ,Mathematics - Abstract
Fan and Yao (1998) proposed an efficient method to estimate the conditional variance of heteroskedastic regression models. Chen, Cheng, and Peng (2009) applied variance reduction techniques to the estimator of Fan and Yao (1998) and proposed a new estimator for conditional variance to account for the skewness of financial data. In this paper, we apply empirical likelihood methods to construct confidence intervals for the conditional variance based on the estimator of Fan and Yao (1998) and the reduced variance modification of Chen et al. (2009). Simulation studies and data analysis demonstrate the advantage of the empirical likelihood method over the normal approximation method.
- Published
- 2010
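The conditional-variance estimator discussed above can be sketched in the spirit of Fan and Yao (1998): estimate the mean function by kernel regression, then smooth the squared residuals against X. All model and bandwidth choices below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 2000

# Heteroskedastic model Y = m(X) + sigma(X) * eps with m(x) = sin(x), sigma(x) = 0.5 + 0.5 x^2.
x = rng.uniform(-1, 1, n)
eps = rng.standard_normal(n)
y = np.sin(x) + (0.5 + 0.5 * x ** 2) * eps

def nw(x0, x, y, h):
    """Nadaraya-Watson kernel regression estimate at x0 (Gaussian kernel, bandwidth h)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

h = 0.15
m_hat = np.array([nw(xi, x, y, h) for xi in x])

# Smooth the squared residuals against X to estimate sigma^2(x).
r2 = (y - m_hat) ** 2
sig2_hat0 = nw(0.0, x, r2, h)     # estimate of sigma^2(0) = 0.25
print(sig2_hat0)
```

The empirical likelihood methods in the paper then provide confidence intervals for sigma^2(x) around such point estimates without a normal-approximation variance estimate.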