EBookClubs

Read Books & Download eBooks Full Online

Book Large Volatility Matrix Inference Based on High frequency Financial Data

Download or read book Large Volatility Matrix Inference Based on High frequency Financial Data written by and published by . This book was released on 2013 with total page 142 pages. Available in PDF, EPUB and Kindle. Book excerpt: Financial practice often requires estimating an integrated volatility matrix for a large number of assets using noisy high-frequency financial data. This estimation problem is challenging for four reasons: (1) high-frequency financial data are discrete observations of the underlying assets' price processes; (2) due to market microstructure noise, high-frequency data are observed with measurement errors; (3) different assets are traded at different time points, the so-called non-synchronization phenomenon in high-frequency financial data; (4) the number of assets may be comparable to or even exceed the number of observations, so many existing estimators designed for small volatility matrices become inconsistent when the matrix size is close to or larger than the sample size. In this dissertation, we focus on large volatility matrix inference for high-frequency financial data in three aspects. On the methodological side, we propose a new threshold MSRVM estimator of the large volatility matrix. This estimator addresses all four challenges and is consistent when both the sample size and the matrix size go to infinity. On the theoretical side, we study the optimal convergence rate for volatility matrix estimation by building the asymptotic theory for the proposed estimator and deriving a minimax lower bound for the estimation problem. The proposed threshold MSRVM estimator achieves a risk that matches the lower bound up to a constant factor, and hence attains the optimal convergence rate. As for applications, we develop a novel approach to predict the volatility matrix.
The approach extends the applicability of classical low-frequency models, such as matrix factor models and vector autoregressive models, to high-frequency data. With this approach, we pool the strengths of both classical low-frequency models and new high-frequency estimation methodologies. Furthermore, numerical studies are conducted to test the finite-sample performance of the proposed estimators and to support the established asymptotic theory.
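As a rough illustration of the thresholding idea behind estimators of this kind, the sketch below (hypothetical code on simulated data, not the dissertation's actual MSRVM construction, which uses carefully chosen scale weights) averages subsampled realized covariance matrices across several scales and then hard-thresholds small off-diagonal entries:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for noisy high-frequency data: latent log-prices for p assets
# observed at n ticks, contaminated by additive microstructure noise.
n, p = 2000, 5
true_vol = 0.01 * np.eye(p)
returns = rng.multivariate_normal(np.zeros(p), true_vol / n, size=n)
noisy = np.cumsum(returns, axis=0) + 1e-4 * rng.standard_normal((n, p))

def realized_cov(x, k):
    """Realized covariance from log-prices sampled every k ticks."""
    r = x[k::k] - x[:-k:k]
    return r.T @ r

def multi_scale(x, scales):
    """Average subsampled realized covariances over several scales
    (a crude stand-in for the weighted multi-scale construction)."""
    return sum(realized_cov(x, k) for k in scales) / len(scales)

def hard_threshold(m, tau):
    """Entry-wise hard thresholding; the diagonal is never thresholded."""
    out = np.where(np.abs(m) >= tau, m, 0.0)
    np.fill_diagonal(out, np.diag(m))
    return out

est = hard_threshold(multi_scale(noisy, scales=[5, 10, 20]), tau=5e-4)
```

Subsampling reduces the noise-induced bias of the naive realized covariance, and thresholding enforces sparsity when the matrix size is large relative to the sample size.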

Book Statistical Inferences on High frequency Financial Data and Quantum State Tomography

Download or read book Statistical Inferences on High frequency Financial Data and Quantum State Tomography written by Donggyu Kim and published by . This book was released in 2016. Available in PDF, EPUB and Kindle. Book excerpt: In this dissertation, we study two topics: volatility analysis based on high-frequency financial data, and quantum state tomography. In Part I, we study volatility analysis based on high-frequency financial data. We first investigate how to estimate large volatility matrices effectively and efficiently. For example, we introduce threshold rules to regularize kernel realized volatility, pre-averaging realized volatility, and multi-scale realized volatility. Their convergence rates are derived under sparsity assumptions on the large integrated volatility matrix. To account for the sparse structure, we employ factor-based Itô processes, and under the proposed factor-based model we develop an estimation scheme called "blocking and regularizing". Also, we establish a minimax lower bound for the eigenspace estimation problem and propose sparse principal subspace estimation methods using the multi-scale realized volatility matrix estimator or the pre-averaging realized volatility matrix estimator. Finally, we introduce a unified model that can accommodate both continuous-time Itô processes, used to model high-frequency stock prices, and GARCH processes, employed to model low-frequency stock prices, by embedding a discrete-time GARCH volatility in its continuous-time instantaneous volatility. We adopt realized volatility estimators based on high-frequency financial data and the quasi-likelihood function for the low-frequency GARCH structure to develop parameter estimation methods for the combined high-frequency and low-frequency data. In Part II, we study quantum state tomography with Pauli measurements.
In quantum science, the dimension of the quantum density matrix usually grows exponentially with the size of the quantum system, so it is important to develop effective and efficient estimation methods for large quantum density matrices. We study large density matrix estimation methods and obtain minimax lower bounds under several sparse structures, for example: (i) the coefficients of the density matrix with respect to the Pauli basis are sparse; (ii) the rank is low; (iii) the eigenvectors are sparse. The estimators' performance may depend on the sparse structure, so it is essential to choose an estimation method appropriate to that structure. In light of this, we study how to conduct hypothesis tests for the sparse structure. Specifically, we propose hypothesis test procedures and develop central limit theorems for each test statistic. A simulation study is conducted to check the finite-sample performance of the proposed estimation methods and hypothesis tests.
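The Pauli-basis expansion underlying sparse structure (i) can be sketched as follows; this is a generic illustration on a toy pure state, not code from the dissertation:

```python
import numpy as np
from itertools import product

# Single-qubit Pauli matrices. Any d-qubit density matrix rho expands as
# rho = (1/2^d) * sum_b beta_b * sigma_b over tensor products sigma_b,
# with coefficients beta_b = tr(rho sigma_b); sparsity in these
# coefficients is one of the structures the estimation rates depend on.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])
paulis = [I, X, Y, Z]

def pauli_basis(d):
    """All 4^d tensor products of single-qubit Pauli matrices."""
    basis = []
    for combo in product(paulis, repeat=d):
        m = combo[0]
        for factor in combo[1:]:
            m = np.kron(m, factor)
        basis.append(m)
    return basis

d = 2
basis = pauli_basis(d)
rho = np.zeros((4, 4), dtype=complex)
rho[0, 0] = 1.0                      # pure state |00><00|

beta = np.array([np.trace(rho @ b).real for b in basis])
recon = sum(w * b for w, b in zip(beta, basis)) / 2**d
```

For this state only 4 of the 16 coefficients are nonzero (those for I⊗I, I⊗Z, Z⊗I, Z⊗Z), which is exactly the kind of sparsity the estimators in Part II exploit.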

Book High Frequency Financial Econometrics

Download or read book High Frequency Financial Econometrics written by Yacine Aït-Sahalia and published by Princeton University Press. This book was released on 2014-07-21 with total page 683 pages. Available in PDF, EPUB and Kindle. Book excerpt: A comprehensive introduction to the statistical and econometric methods for analyzing high-frequency financial data High-frequency trading is an algorithm-based computerized trading practice that allows firms to trade stocks in milliseconds. Over the last fifteen years, the use of statistical and econometric methods for analyzing high-frequency financial data has grown exponentially. This growth has been driven by the increasing availability of such data, the technological advancements that make high-frequency trading strategies possible, and the need of practitioners to analyze these data. This comprehensive book introduces readers to these emerging methods and tools of analysis. Yacine Aït-Sahalia and Jean Jacod cover the mathematical foundations of stochastic processes, describe the primary characteristics of high-frequency financial data, and present the asymptotic concepts that their analysis relies on. Aït-Sahalia and Jacod also deal with estimation of the volatility portion of the model, including methods that are robust to market microstructure noise, and address estimation and testing questions involving the jump part of the model. As they demonstrate, the practical importance and relevance of jumps in financial data are universally recognized, but only recently have econometric methods become available to rigorously analyze jump processes. Aït-Sahalia and Jacod approach high-frequency econometrics with a distinct focus on the financial side of matters while maintaining technical rigor, which makes this book invaluable to researchers and practitioners alike.

Book Structured Volatility Matrix Estimation for Non Synchronized High Frequency Financial Data

Download or read book Structured Volatility Matrix Estimation for Non Synchronized High Frequency Financial Data written by Jianqing Fan and published by . This book was released on 2017 with total page 35 pages. Available in PDF, EPUB and Kindle. Book excerpt: Recently, several large volatility matrix estimation procedures have been developed for factor-based Itô processes whose integrated volatility matrix consists of low-rank and sparse components. Their performance depends on the accuracy of the input volatility matrix estimators. When estimating co-volatilities based on high-frequency data, one of the crucial challenges is non-synchronization for illiquid assets, which makes their co-volatility estimators inaccurate. In this paper, we study how to estimate the large integrated volatility matrix without using the co-volatilities of illiquid assets. Specifically, we treat the co-volatilities for illiquid assets as missing and estimate the low-rank matrix using a matrix completion scheme with a structured missing pattern. To further regularize the sparse volatility matrix, we employ the principal orthogonal complement thresholding method (POET). We also investigate the asymptotic properties of the proposed estimation procedure and demonstrate its advantages over using the co-volatilities of illiquid assets. These advantages are also verified by an extensive simulation study and illustrated with high-frequency financial data for constituents of the S&P 500 index.
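The POET step can be sketched in a few lines: eigen-decompose the input matrix, keep the top-K eigencomponents as the low-rank factor part, and hard-threshold the principal orthogonal complement. This toy version (simulated input, hypothetical parameter choices) illustrates the structure only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy covariance with a low-rank (one common factor) plus sparse structure.
p, K = 30, 1
loadings = rng.standard_normal((p, K))
sigma = loadings @ loadings.T + np.eye(p)        # factor part + sparse part
sample = sigma + 0.05 * rng.standard_normal((p, p))
sample = (sample + sample.T) / 2                 # symmetrize the noisy input

def poet(cov, K, tau):
    """POET sketch: keep top-K eigencomponents as the factor part,
    hard-threshold the principal orthogonal complement entry-wise."""
    vals, vecs = np.linalg.eigh(cov)             # ascending eigenvalues
    top = vecs[:, -K:] * vals[-K:]
    low_rank = top @ vecs[:, -K:].T
    residual = cov - low_rank                    # principal orthogonal complement
    sparse = np.where(np.abs(residual) >= tau, residual, 0.0)
    np.fill_diagonal(sparse, np.diag(residual))  # never threshold the diagonal
    return low_rank + sparse

est = poet(sample, K=1, tau=0.1)
```

In the paper's setting, the entries of the input matrix corresponding to illiquid-asset co-volatilities would first be filled in by matrix completion before this regularization step.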

Book Robust High Dimensional Volatility Matrix Estimation for High Frequency Factor Model

Download or read book Robust High Dimensional Volatility Matrix Estimation for High Frequency Factor Model written by Jianqing Fan and published by . This book was released on 2017 with total page 42 pages. Available in PDF, EPUB and Kindle. Book excerpt: High-frequency financial data allow us to estimate large volatility matrices within a relatively short time horizon. Many novel statistical methods have been introduced to address large volatility matrix estimation problems for a high-dimensional Itô process with microstructure noise contamination. Their asymptotic theories require sub-Gaussian or finite high-order moment assumptions on observed log-returns. These assumptions are at odds with the heavy-tail phenomenon that is pervasive in financial stock returns, and new procedures are needed to mitigate the influence of heavy tails. In this paper, we introduce the Huber loss function with a diverging threshold to develop a robust realized volatility estimator. We show that it has sub-Gaussian concentration around the volatility under only finite fourth moments of the observed log-returns. With the proposed robust estimator as input, we further regularize it using the principal orthogonal complement thresholding (POET) procedure to estimate the large volatility matrix, which admits an approximate factor structure. We establish the asymptotic theory for such low-rank plus sparse matrices. A simulation study is conducted to check the finite-sample performance of the proposed estimation methods.
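A minimal sketch of Huber-type robust volatility estimation, assuming simulated Student-t returns and an ad hoc threshold choice (the paper calibrates its diverging threshold theoretically):

```python
import numpy as np

rng = np.random.default_rng(2)

# Heavy-tailed log-returns: Student-t with 5 degrees of freedom, which has
# finite fourth moments, matching the paper's moment condition.
n = 5000
returns = rng.standard_t(df=5, size=n) * 0.01

def huber_mean(x, tau, iters=50):
    """Huber M-estimate of the mean of x with threshold tau, computed by
    iteratively reweighted least squares."""
    mu = np.median(x)
    for _ in range(iters):
        r = x - mu
        w = np.minimum(1.0, tau / np.maximum(np.abs(r), 1e-12))
        mu = np.sum(w * x) / np.sum(w)
    return mu

# Robust realized volatility: Huber mean of squared returns, scaled by n.
# The threshold grows slowly with n so bias vanishes while large outlying
# squared returns are downweighted.
sq = returns**2
tau = np.std(sq) * np.sqrt(np.log(n))
rv_robust = n * huber_mean(sq, tau)
rv_naive = np.sum(sq)
```

Compared with the naive sum of squared returns, the Huber version limits the influence of a few extreme observations on the volatility estimate.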

Book A Comparative Study on Large Multivariate Volatility Matrix Modeling for High frequency Financial Data

Download or read book A Comparative Study on Large Multivariate Volatility Matrix Modeling for High frequency Financial Data written by Dongchen Jiang and published by . This book was released on 2015 with total page 80 pages. Available in PDF, EPUB and Kindle. Book excerpt: Abstract: Modeling and forecasting the volatilities of high-frequency data observed on the prices of financial assets are vibrant research areas in econometrics and statistics. However, most of the available methods are not directly applicable when the number of assets involved is large, due to the lack of accuracy in estimating high-dimensional matrices. This paper compares two methodologies for vast volatility matrix estimation with high-frequency data. The first is to estimate the Average Realized Volatility Matrix and regularize it by banding and thresholding: we select grids as pre-sampling frequencies, construct a realized volatility matrix using the previous-tick method at each pre-sampling frequency, and then take the average of the constructed realized volatility matrices as the stage-one estimator, which we call the ARVM estimator. We then regularize the ARVM estimator to yield consistent estimators of the large integrated volatility matrix, considering two regularizations: thresholding and banding. The second is Dynamic Conditional Correlation (DCC), which is estimated in two stages: in the first stage, univariate GARCH models are estimated for each series, and in the second stage, the residuals are used to estimate the parameters of the dynamic correlation. Asymptotic theory for the two methodologies shows that the estimators are consistent. In numerical studies, the two methodologies are applied to a simulated data set and to real high-frequency prices of the top 100 S&P 500 stocks by trading volume over a period of three months (64 trading days) in 2013.
Based on the estimators' performance, the conclusion is that the TARVM (thresholded ARVM) estimator performs better than the DCC volatility matrix, and its largest eigenvalues are more stable than those of the DCC model, so it is more appropriate for eigen-based analysis.
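The previous-tick synchronization and grid averaging in the ARVM construction can be sketched as follows (toy simulated tick data; the grid counts and shifts are hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two assets traded at irregular, non-synchronized times on [0, 1]
# (a toy stand-in for tick data).
t1 = np.sort(rng.uniform(0, 1, 500))
p1 = np.cumsum(rng.standard_normal(500)) * 0.01
t2 = np.sort(rng.uniform(0, 1, 300))
p2 = np.cumsum(rng.standard_normal(300)) * 0.01

def previous_tick(times, prices, grid):
    """Previous-tick interpolation: at each grid point, take the last
    observed price at or before that time."""
    idx = np.searchsorted(times, grid, side="right") - 1
    idx = np.clip(idx, 0, None)
    return prices[idx]

def arvm(series, n_grids=5, grid_size=50):
    """ARVM sketch: build a realized volatility matrix on each of several
    shifted pre-sampling grids, then average the matrices."""
    mats = []
    for s in range(n_grids):
        grid = (np.arange(grid_size) + s / n_grids) / grid_size
        synced = np.column_stack([previous_tick(t, p, grid) for t, p in series])
        r = np.diff(synced, axis=0)
        mats.append(r.T @ r)
    return sum(mats) / n_grids

est = arvm([(t1, p1), (t2, p2)])
```

Averaging over shifted grids uses more of the data than a single sampling grid would, which is the motivation for the stage-one ARVM estimator before banding or thresholding.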

Book Volatility Estimation with Financial Data

Download or read book Volatility Estimation with Financial Data written by and published by . This book was released in 2015. Available in PDF, EPUB and Kindle. Book excerpt: Modeling and estimating volatility plays a crucial role in financial practice. Devoted efforts have been made to investigate this topic using both low-frequency and high-frequency financial data. Traditionally, volatility modeling and analysis are based on either historical price data or option data. Finance theory shows that option prices depend heavily on the underlying stocks' prices, so the two kinds of data are related. This thesis explores an approach that combines both stock price data and option data for the statistical analysis of volatility. We investigate the Black-Scholes model and an exponential GARCH model and derive the relationship among the Fisher information for volatility estimation based on stock price data alone, on option data alone, and on the joint estimation combining stock price data and option data. Under the Black-Scholes model, asymptotic theory for the joint estimation is established, and a simulation study is conducted to check the finite-sample performance of the proposed joint estimator. Being more accessible than ever, high-frequency data have provided researchers and practitioners with powerful tools to investigate asset pricing and market dynamics. Non-synchronous observations, microstructure noise, and complex pricing models are challenges that come along with high-frequency data. Moreover, large volatility matrix estimation is involved in many financial practices and encounters the "curse of dimensionality". Although widely used in large covariance estimation, imposing a sparsity assumption on the entire volatility matrix is not reasonable in financial practice: due to the existence of common factors, assets are broadly correlated with each other, and their volatility matrix is not sparse.
In this thesis, we focus on incorporating the factor influence in asset price modeling and volatility matrix estimation. We propose to model asset prices using a factor-based diffusion process. The idea is that assets' prices are governed by a common factor, and that assets with similar characteristics share the same association with the factor. Under the proposed factor-based model, we develop an estimation scheme called "Blocking and Regularizing", which deals with all four of these challenges. The asymptotic properties of the proposed estimator are studied, and its finite-sample performance is tested via extensive numerical studies to support the theoretical results.
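The inverse-Fisher-information weighting behind joint estimation can be illustrated with a toy two-source example (all numbers are hypothetical; the thesis derives the actual information quantities under the Black-Scholes and exponential GARCH models):

```python
import numpy as np

rng = np.random.default_rng(4)

# Two independent estimates of the same volatility parameter: one from
# stock returns, one (hypothetically) backed out from option prices.
sigma_true = 0.2
n_stock = 400
est_stock = np.std(rng.normal(0, sigma_true, n_stock), ddof=1)
est_option = sigma_true + rng.normal(0, 0.02)    # toy option-implied estimate

# Asymptotic variances play the role of inverse Fisher information; the
# efficient joint estimate weights each source by its information content.
var_stock = sigma_true**2 / (2 * n_stock)        # delta-method variance of sd
var_option = 0.02**2
w = (1 / var_stock) / (1 / var_stock + 1 / var_option)
joint = w * est_stock + (1 - w) * est_option

# The joint variance is the harmonic combination, never worse than either
# single-source variance -- the benefit of pooling the two data sources.
var_joint = 1 / (1 / var_stock + 1 / var_option)
```

Because Fisher information is additive for independent data sources, combining stock and option data can only tighten the achievable estimation precision, which is the core message of the joint-estimation results.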

Book Discretization of Processes

Download or read book Discretization of Processes written by Jean Jacod and published by Springer Science & Business Media. This book was released on 2011-10-22 with total page 596 pages. Available in PDF, EPUB and Kindle. Book excerpt: In applications, and especially in mathematical finance, random time-dependent events are often modeled as stochastic processes. Assumptions are made about the structure of such processes, and serious researchers will want to justify those assumptions through the use of data. As statisticians are wont to say, “In God we trust; all others must bring data.” This book establishes the theory of how to go about estimating not just scalar parameters about a proposed model, but also the underlying structure of the model itself. Classic statistical tools are used: the law of large numbers, and the central limit theorem. Researchers have recently developed creative and original methods to use these tools in sophisticated (but highly technical) ways to reveal new details about the underlying structure. For the first time in book form, the authors present these latest techniques, based on research from the last 10 years. They include new findings. This book will be of special interest to researchers, combining the theory of mathematical finance with its investigation using market data, and it will also prove to be useful in a broad range of applications, such as to mathematical biology, chemical engineering, and physics.

Book Large Volatility Matrix Analysis Using Global and National Factor Models

Download or read book Large Volatility Matrix Analysis Using Global and National Factor Models written by Sung Hoon Choi and published by . This book was released in 2022. Available in PDF, EPUB and Kindle. Book excerpt: Several large volatility matrix inference procedures have been developed based on the latent factor model. They often assume that a few common factors can account for volatility dynamics. However, several studies have demonstrated the presence of local factors: in particular, when analyzing the global stock market, we often observe that nation-specific factors explain their own country's volatility dynamics. To account for this, we propose the Double Principal Orthogonal complEment Thresholding (Double-POET) method, based on multi-level factor models, and establish its asymptotic properties. Furthermore, we demonstrate the drawback of using the regular principal orthogonal complement thresholding (POET) when a local factor structure exists. We also describe the blessing of dimensionality in using Double-POET for local covariance matrix estimation. Finally, we investigate the performance of the Double-POET estimator in an out-of-sample portfolio allocation study using international stocks from 20 financial markets.

Book Inference from High frequency Data

Download or read book Inference from High frequency Data written by and published by . This book was released in 2015. Available in PDF, EPUB and Kindle.

Book Factor GARCH ITO Models for High Frequency Data with Application to Large Volatility Matrix Prediction

Download or read book Factor GARCH ITO Models for High Frequency Data with Application to Large Volatility Matrix Prediction written by Donggyu Kim and published by . This book was released on 2017 with total page 41 pages. Available in PDF, EPUB and Kindle. Book excerpt: Several novel large volatility matrix estimation methods have been developed based on high-frequency financial data. They often employ the approximate factor model, which leads to a low-rank plus sparse structure for the integrated volatility matrix and facilitates estimation of large volatility matrices. However, these nonparametric estimators provide no dynamic structure for predicting future volatility matrices. In this paper, we introduce a novel Itô diffusion process based on the approximate factor model and call it a factor GARCH-Itô model. We investigate its properties and propose a quasi-maximum likelihood estimation method for the parameters of the factor GARCH-Itô model. We also apply it to estimating conditional expected large volatility matrices and establish their asymptotic properties. Simulation studies are conducted to validate the finite-sample performance of the proposed estimation methods. The proposed method is also illustrated using data on the constituents of the S&P 500 index and an application to constructing the minimum-variance portfolio with gross exposure constraints.
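The discrete-time GARCH(1,1) recursion that a GARCH-Itô-type model embeds in continuous time can be sketched as follows (toy parameters and simulated returns; not the paper's quasi-maximum likelihood estimator):

```python
import numpy as np

def garch_forecast(returns, omega, alpha, beta):
    """One-step-ahead conditional variance from the GARCH(1,1) recursion
    h_{t+1} = omega + alpha * r_t^2 + beta * h_t."""
    h = np.var(returns)                  # initialize at the sample variance
    for r in returns:
        h = omega + alpha * r**2 + beta * h
    return h                             # conditional variance for t+1

rng = np.random.default_rng(5)
rets = rng.standard_normal(250) * 0.01   # toy daily-return series
h_next = garch_forecast(rets, omega=1e-6, alpha=0.05, beta=0.9)

# Unconditional variance implied by these (hypothetical) parameters:
# omega / (1 - alpha - beta), finite since alpha + beta < 1.
uncond = 1e-6 / (1 - 0.05 - 0.9)
```

It is this recursive dynamic, absent from purely nonparametric realized-volatility estimators, that makes model-based prediction of future volatility matrices possible.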

Book Big Data

    Book Details:
  • Author : Kuan-Ching Li
  • Publisher : CRC Press
  • Release : 2015-09-15
  • ISBN : 1498760406
  • Pages : 444 pages

Download or read book Big Data written by Kuan-Ching Li and published by CRC Press. This book was released on 2015-09-15 with total page 444 pages. Available in PDF, EPUB and Kindle. Book excerpt: As today’s organizations are capturing exponentially larger amounts of data than ever, now is the time for organizations to rethink how they digest that data. Through advanced algorithms and analytics techniques, organizations can harness this data, discover hidden patterns, and use the newly acquired knowledge to achieve competitive advantages. Presenting the contributions of leading experts in their respective fields, Big Data: Algorithms, Analytics, and Applications bridges the gap between the vastness of Big Data and the appropriate computational methods for scientific and social discovery. It covers fundamental issues about Big Data, including efficient algorithmic methods to process data, better analytical strategies to digest data, and representative applications in diverse fields, such as medicine, science, and engineering. The book is organized into five main sections:
  • Big Data Management: considers the research issues related to the management of Big Data, including indexing and scalability aspects
  • Big Data Processing: addresses the problem of processing Big Data across a wide range of resource-intensive computational settings
  • Big Data Stream Techniques and Algorithms: explores research issues regarding the management and mining of Big Data in streaming environments
  • Big Data Privacy: focuses on models, techniques, and algorithms for preserving Big Data privacy
  • Big Data Applications: illustrates practical applications of Big Data across several domains, including finance, multimedia tools, biometrics, and satellite Big Data processing
Overall, the book reports on state-of-the-art studies and achievements in algorithms, analytics, and applications of Big Data. It provides readers with the basis for further efforts in this challenging scientific field that will play a leading role in next-generation database, data warehousing, data mining, and cloud computing research. It also explores related applications in diverse sectors, covering technologies for media/data communication, elastic media/data storage, cross-network media/data fusion, and SaaS.

Book High Frequency Data, Frequency Domain Inference and Volatility Forecasting

Download or read book High Frequency Data Frequency Domain Inference and Volatility Forecasting written by Jonathan H. Wright and published by . This book was released on 1999 with total page 38 pages. Available in PDF, EPUB and Kindle. Book excerpt: While it is clear that the volatility of asset returns is serially correlated, there is no general agreement as to the most appropriate parametric model for characterizing this temporal dependence. In this paper, we propose a simple way of modeling financial market volatility using high frequency data. The method avoids using a tight parametric model, by instead simply fitting a long autoregression to log-squared, squared or absolute high frequency returns. This can either be estimated by the usual time domain method, or alternatively the autoregressive coefficients can be backed out from the smoothed periodogram estimate of the spectrum of log-squared, squared or absolute returns. We show how this approach can be used to construct volatility forecasts, which compare favorably with some leading alternatives in an out-of-sample forecasting exercise.
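The long-autoregression approach can be sketched with ordinary least squares on log-squared returns (simulated data; the lag length and the small offset guarding against log(0) are hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy return series with persistent volatility (stochastic-volatility style).
n, lags = 2000, 20
log_vol = np.zeros(n)
for t in range(1, n):
    log_vol[t] = 0.95 * log_vol[t - 1] + 0.2 * rng.standard_normal()
returns = np.exp(log_vol / 2) * rng.standard_normal(n) * 0.01

# Fit a long autoregression to log-squared returns by ordinary least
# squares, avoiding a tight parametric volatility model.
y = np.log(returns**2 + 1e-12)
X = np.column_stack([y[lags - k - 1 : n - k - 1] for k in range(lags)])
X = np.column_stack([np.ones(len(X)), X])       # prepend an intercept
target = y[lags:]
coef, *_ = np.linalg.lstsq(X, target, rcond=None)

# One-step-ahead forecast of log-squared returns (a volatility proxy).
last = y[-lags:][::-1]                          # most recent lag first
forecast = coef[0] + coef[1:] @ last
```

The same fit could alternatively be obtained in the frequency domain by backing the AR coefficients out of a smoothed periodogram estimate of the spectrum, as the paper describes.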