Download or read book Robust Covariance Matrix Estimation with Data-Dependent VAR Prewhitening Order written by Wouter J. Den Haan and published by . This book was released on 2000 with total page 56 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper analyzes the performance of heteroskedasticity-and-autocorrelation-consistent (HAC) covariance matrix estimators in which the residuals are prewhitened using a vector autoregressive (VAR) filter. We highlight the pitfalls of using an arbitrarily fixed lag order for the VAR filter, and we demonstrate the benefits of using a model selection criterion (either AIC or BIC) to determine its lag structure. Furthermore, once data-dependent VAR prewhitening has been utilized, we find negligible or even counter-productive effects of applying standard kernel-based methods to the prewhitened residuals; that is, the performance of the prewhitened kernel estimator is virtually indistinguishable from that of the VARHAC estimator.
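The prewhitening idea this abstract describes can be sketched in a univariate setting (the paper's VARHAC estimator is multivariate; the function below is an illustrative reduction of my own, not the authors' code): fit AR models of increasing order, select the order by AIC, and "recolor" the innovation variance to obtain the long-run variance, with no kernel step applied to the whitened residuals.

```python
import numpy as np

def varhac_univariate(u, max_lag=4):
    """Prewhitened long-run variance, univariate sketch: choose the AR
    order by AIC, then recolor the innovation variance. No kernel
    smoothing is applied to the whitened residuals."""
    best = None
    for p in range(1, max_lag + 1):
        y = u[p:]
        X = np.column_stack([u[p - k:len(u) - k] for k in range(1, p + 1)])
        phi, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ phi
        sigma2 = resid @ resid / len(resid)
        aic = np.log(sigma2) + 2.0 * p / len(resid)
        if best is None or aic < best[0]:
            best = (aic, phi, sigma2)
    _, phi, sigma2 = best
    return sigma2 / (1.0 - phi.sum()) ** 2  # recoloring step

# AR(1) residual series with known long-run variance 1/(1-0.6)^2 = 6.25.
rng = np.random.default_rng(0)
e = rng.standard_normal(2000)
u = np.empty_like(e)
u[0] = e[0]
for t in range(1, len(u)):
    u[t] = 0.6 * u[t - 1] + e[t]
lrv = varhac_univariate(u)
```

The estimate should land near the true long-run variance of 6.25, far above the raw sample variance of the series.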
Download or read book Applied Economic Forecasting Using Time Series Methods written by Eric Ghysels and published by Oxford University Press. This book was released on 2018 with total page 617 pages. Available in PDF, EPUB and Kindle. Book excerpt: Economic forecasting is a key ingredient of decision making in the public and private sectors. This book provides the necessary tools to solve real-world forecasting problems using time-series methods. It targets undergraduate and graduate students as well as researchers in public and private institutions interested in applied economic forecasting.
Download or read book Handbook of Economic Forecasting written by G. Elliott and published by Elsevier. This book was released on 2006-05-30 with total page 1071 pages. Available in PDF, EPUB and Kindle. Book excerpt: Research on forecasting methods has made important progress over recent years and these developments are brought together in the Handbook of Economic Forecasting. The handbook covers developments in how forecasts are constructed based on multivariate time-series models, dynamic factor models, nonlinear models and combination methods. The handbook also includes chapters on forecast evaluation, including evaluation of point forecasts and probability forecasts, and contains chapters on survey forecasts and volatility forecasts. Areas of applications of forecasts covered in the handbook include economics, finance and marketing. The handbook addresses economic forecasting methodology, forecasting models, forecasting with different data structures, and the applications of forecasting methods; insights within this volume can be applied to the economics, finance and marketing disciplines.
Download or read book A Practitioner's Guide to Robust Covariance Matrix Estimation written by Wouter J. Den Haan and published by . This book was released on 1996 with total page 72 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper develops asymptotic distribution theory for generalized method of moments (GMM) estimators and test statistics when some of the parameters are well identified, but others are poorly identified because of weak instruments. The asymptotic theory entails applying empirical process theory to obtain a limiting representation of the (concentrated) objective function as a stochastic process. The general results are specialized to two leading cases, linear instrumental variables regression and GMM estimation of Euler equations obtained from the consumption-based capital asset pricing model with power utility. Numerical results of the latter model confirm that finite sample distributions can deviate substantially from normality, and indicate that these deviations are captured by the weak instruments asymptotic approximations.
Download or read book Three Essays in International Finance written by Byong-Ju Lee and published by Stanford University. This book was released on 2011 with total page 132 pages. Available in PDF, EPUB and Kindle. Book excerpt: This thesis consists of three essays on international finance. The first essay is "Exchange rates and Fundamentals". A new open interest rate parity condition that takes account of economic fundamentals is developed from stochastic discount factors (SDFs) of two countries. Through this parity condition, business cycles or fundamentals are linked to exchange rates. Key empirical findings from this parity condition are as follows. First, this model beats the random walk hypothesis: economic fundamentals explain exchange rate movements for high interest rate currencies. Exchange rates of low interest rate currencies act like a random walk because they are less correlated with fundamentals owing to their low risk. For example, U.S. business cycles explain the direction of changes in exchange rates against the dollar. The same thing is true for Japan. Second, this model resolves the forward premium puzzle: the forward premium puzzle is not a general characteristic, contrary to previous studies. It happens when the risk awareness of investors is low, during economic expansions and for low risk currencies. The second essay is "Carry Trade and Global Financial Instability". Carry trade, an opportunistic investment strategy that takes advantage of interest rate differentials across countries, is identified as the cause of the large-scale depreciations of peripheral currencies in the latter half of 2008. A simultaneous equations model, which is derived from a conceptual partial equilibrium model for a local foreign exchange market, is estimated from a cross-sectional sample.
The results suggest that the yen's larger appreciation relative to the dollar was brought about by a lack of local supply of the yen rather than a more severe crunch of yen credits. The third essay is "The Economic Origin of Letters of Credit". This essay discusses the economic origin of letters of credit, an instrument widely used in international trade. A game-theoretical analysis shows that letters of credit improve efficiency in trade settlements, thereby increasing the returns to trade. A few notable facts on letters of credit are discussed. First, the new institution is adopted by merchant banks to maximize their profits, and in the process an improvement in the efficiency of international transactions is obtained. Second, the organization established by the legacy institution, bills of exchange, played a critical role in adopting the new institution. Third, legal enforcement is not essential to this economic institution. Finally, two drivers are identified that improve the efficiency of transactions: concentration and projection.
Download or read book Divergences in Productivity Between Europe and the United States written by Gilbert Cette and published by Edward Elgar Publishing. This book was released on 2007-01-01 with total page 272 pages. Available in PDF, EPUB and Kindle. Book excerpt: Papers from a seminar held at the Royaumont Abbey on 22 and 23 March 2004, and organized by the Banque de France, CEPII, and the Ifo Institute for Economic Research at the University of Munich.
Download or read book Demand Estimation with Heterogeneous Consumers and Unobserved Product Characteristics written by Patrick Bajari and published by . This book was released on 2001 with total page 80 pages. Available in PDF, EPUB and Kindle. Book excerpt: We study the identification and estimation of preferences in hedonic discrete choice models of demand for differentiated products. In the hedonic discrete choice model, products are represented as a finite dimensional bundle of characteristics, and consumers maximize utility subject to a budget constraint. Our hedonic model also incorporates product characteristics that are observed by consumers but not by the economist. We demonstrate that, unlike the case where all product characteristics are observed, it is not in general possible to uniquely recover consumer preferences from data on a consumer's choices. However, we provide several sets of assumptions under which preferences can be recovered uniquely, that we think may be satisfied in many applications. Our identification and estimation strategy is a two stage approach in the spirit of Rosen (1974). In the first stage, we show under some weak conditions that price data can be used to nonparametrically recover the unobserved product characteristics and the hedonic pricing function. In the second stage, we show under some weak conditions that if the product space is continuous and the functional form of utility is known, then there exists an inversion between a consumer's choices and her preference parameters. If the product space is discrete, we propose a Gibbs sampling algorithm to simulate the population distribution of consumers' taste coefficients.
Download or read book Econometric Methods for Endogenously Sampled Time Series written by George J. Hall and published by . This book was released on 2002 with total page 72 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper studies the econometric problems associated with estimation of a stochastic process that is endogenously sampled. Our interest is to infer the law of motion of a discrete-time stochastic process {p_t} that is observed only at a subset of times {t_1, ..., t_n} that depend on the outcome of a probabilistic sampling rule that depends on the history of the process as well as other observed covariates x_t. We focus on a particular example where p_t denotes the daily wholesale price of a standardized steel product. However there are no formal exchanges or centralized markets where steel is traded and p_t can be observed. Instead, nearly all steel transaction prices are a result of private bilateral negotiations between buyers and sellers, typically intermediated by middlemen known as steel service centers. Even though there is no central record of daily transaction prices in the steel market, we do observe transaction prices for a particular firm -- a steel service center that purchases large quantities of steel in the wholesale market for subsequent resale in the retail market. The endogenous sampling problem arises from the fact that the firm only records p_t on the days that it purchases steel. We present a parametric analysis of this problem under the assumption that the timing of steel purchases is part of an optimal trading strategy that maximizes the firm's expected discounted trading profits. We derive a parametric partial information maximum likelihood (PIML) estimator that solves the endogenous sampling problem and efficiently estimates the unknown parameters of a Markov transition probability that determines the law of motion for the underlying {p_t} process.
The PIML estimator also yields estimates of the structural parameters that determine the optimal trading rule. We also introduce an alternative consistent, less efficient, but computationally simpler simulated minimum distance (SMD) estimator that avoids high dimensional numerical integrations required by the PIML estimator. Using the SMD estimator, we provide estimates of a truncated lognormal AR(1) model of the wholesale price processes for particular types of steel plate. We use this to infer the share of the middleman's discounted profits that are due to markups paid by its retail customers, and the share due to price speculation. The latter measures the firm's success in forecasting steel prices and in timing its purchases in order to 'buy low and sell high'. The more successful the firm is in speculation (i.e. in strategically timing its purchases), the more serious are the potential biases that would result from failing to account for the endogeneity of the sampling process.
Download or read book The Bias of the RSR Estimator and the Accuracy of Some Alternatives written by William N. Goetzmann and published by . This book was released on 2001 with total page 54 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper analyzes the implications of cross-sectional heteroskedasticity in repeat sales regression (RSR). RSR estimators are essentially geometric averages of individual asset returns because of the logarithmic transformation of price relatives. We show that the cross-sectional variance of asset returns affects the magnitude of bias in the average return estimate for that period, while reducing the bias for the surrounding periods. It is not easy to use an approximation method to correct the bias problem. We suggest a maximum-likelihood alternative to the RSR that directly estimates index returns that are analogous to the RSR estimators but are arithmetic averages of individual returns. Simulations show that these estimators are robust to time-varying cross-sectional variance and may be more accurate than RSR and some alternative methods of RSR.
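The geometric-versus-arithmetic distinction at the heart of this abstract is easy to see in a simulation (the lognormal cross-section below is an assumed illustration, not the paper's data): averaging log price relatives, as RSR implicitly does, understates the arithmetic average return, and the wedge grows with the cross-sectional variance.

```python
import numpy as np

rng = np.random.default_rng(42)
mu, sigma, n = 0.05, 0.30, 100_000
# One period's cross-section of gross returns (price relatives), lognormal.
gross = np.exp(rng.normal(mu, sigma, n))

geometric = np.exp(np.mean(np.log(gross)))   # what averaging log price relatives delivers
arithmetic = np.mean(gross)                  # the arithmetic average return

# Under lognormality the wedge is roughly exp(sigma^2 / 2):
# it grows with the cross-sectional dispersion sigma.
gap = arithmetic / geometric
```

With sigma = 0.30 the gap factor is about exp(0.045) ≈ 1.046, i.e. the geometric index understates the period's arithmetic return by several percentage points of the return level.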
Download or read book Parametric and Nonparametric Volatility Measurement written by Torben Gustav Andersen and published by . This book was released on 2002 with total page 84 pages. Available in PDF, EPUB and Kindle. Book excerpt: Volatility has been one of the most active areas of research in empirical finance and time series econometrics during the past decade. This chapter provides a unified continuous-time, frictionless, no-arbitrage framework for systematically categorizing the various volatility concepts, measurement procedures, and modeling procedures. We define three different volatility concepts: (i) the notional volatility corresponding to the ex-post sample-path return variability over a fixed time interval, (ii) the ex-ante expected volatility over a fixed time interval, and (iii) the instantaneous volatility corresponding to the strength of the volatility process at a point in time. The parametric procedures rely on explicit functional form assumptions regarding the expected and/or instantaneous volatility. In the discrete-time ARCH class of models, the expectations are formulated in terms of directly observable variables, while the discrete- and continuous-time stochastic volatility models involve latent state variable(s). The nonparametric procedures are generally free from such functional form assumptions and hence afford estimates of notional volatility that are flexible yet consistent (as the sampling frequency of the underlying returns increases). The nonparametric procedures include ARCH filters and smoothers designed to measure the volatility over infinitesimally short horizons, as well as the recently popularized realized volatility measures for (non-trivial) fixed-length time intervals.
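The realized volatility measure the abstract closes with is simple to compute: sum the squared high-frequency returns over the interval. A constant-volatility sketch (my own toy setup) shows the estimator converging on the day's notional variance as the sampling frequency rises.

```python
import numpy as np

rng = np.random.default_rng(7)
sigma = 0.2    # constant daily volatility (an assumed toy value)
M = 390        # number of intraday (one-minute) returns

# Discretized constant-volatility diffusion: M equally spaced returns.
r = rng.normal(0.0, sigma / np.sqrt(M), M)

# Realized variance: the sum of squared high-frequency returns. As M grows,
# it converges to the day's notional (integrated) variance, sigma^2 = 0.04.
realized_var = np.sum(r ** 2)
```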
Download or read book Solving Dynamic General Equilibrium Models Using a Second order Approximation to the Policy Function written by Stephanie Schmitt-Grohé and published by . This book was released on 2001 with total page 42 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper derives a second-order approximation to the solution of a general class of discrete-time rational expectations models. The main theoretical contribution of the paper is to show that for any model belonging to the general class considered, the coefficients on the terms linear and quadratic in the state vector in a second-order expansion of the decision rule are independent of the volatility of the exogenous shocks. In other words, these coefficients must be the same in the stochastic and the deterministic versions of the model. Thus, up to second order, the presence of uncertainty affects only the constant term of the decision rules. In addition, the paper presents a set of MATLAB programs designed to compute the coefficients of the second-order approximation. The validity and applicability of the proposed method is illustrated by solving the dynamics of a number of model economies.
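The abstract's main result can be written compactly in the perturbation notation common to this literature (x the state vector in deviations from steady state, σ the scale of the exogenous shocks); a second-order accurate decision rule takes the form

```latex
g(x,\sigma) \;\approx\; g_x\,x
  \;+\; \tfrac{1}{2}\,\big(x' \otimes x'\big)\,\mathrm{vec}(g_{xx})
  \;+\; \tfrac{1}{2}\,g_{\sigma\sigma}\,\sigma^{2},
```

where, per the abstract's claim, $g_x$ and $g_{xx}$ coincide with their values in the deterministic model ($\sigma = 0$): uncertainty enters only through the constant shift $\tfrac{1}{2} g_{\sigma\sigma}\sigma^2$, the linear and cross terms in $\sigma$ being zero.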
Download or read book Testing for Weak Instruments in Linear IV Regression written by James H. Stock and published by . This book was released on 2002 with total page 82 pages. Available in PDF, EPUB and Kindle. Book excerpt: Weak instruments can produce biased IV estimators and hypothesis tests with large size distortions. But what, precisely, are weak instruments, and how does one detect them in practice? This paper proposes quantitative definitions of weak instruments based on the maximum IV estimator bias, or the maximum Wald test size distortion, when there are multiple endogenous regressors. We tabulate critical values that enable using the first-stage F-statistic (or, when there are multiple endogenous regressors, the Cragg-Donald (1993) statistic) to test whether given instruments are weak. A technical contribution is to justify sequential asymptotic approximations for IV statistics with many weak instruments.
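For a single endogenous regressor, the diagnostic described here reduces to the first-stage F-statistic. A minimal sketch on simulated data (the F > 10 cutoff below is the common rule of thumb; the paper itself tabulates exact critical values keyed to bias and size distortion):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 500, 3                        # observations, instruments (assumed DGP)
Z = rng.standard_normal((n, k))
pi = np.array([0.4, 0.0, 0.0])       # only the first instrument has power
x = Z @ pi + rng.standard_normal(n)  # the endogenous regressor's first stage

# First-stage regression of x on a constant and the instruments.
Zc = np.column_stack([np.ones(n), Z])
beta, *_ = np.linalg.lstsq(Zc, x, rcond=None)
resid = x - Zc @ beta
rss = resid @ resid
tss = np.sum((x - x.mean()) ** 2)

# F-statistic for H0: all instrument coefficients are zero.
F = ((tss - rss) / k) / (rss / (n - k - 1))
weak = F < 10.0   # rule-of-thumb cutoff; the paper's tables give exact values
```

With multiple endogenous regressors this scalar F is replaced by the Cragg-Donald statistic, as the abstract notes.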
Download or read book Multinomial Choice with Social Interactions written by William A. Brock and published by . This book was released on 2003 with total page 62 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper develops a model of individual decision-making in the presence of social interactions when the number of available choices is finite. We show how a multinomial logit model framework may be used to model such decisions in a way that permits a tight integration of theory and econometrics. Conditions are given under which aggregate choice behavior in a population exhibits multiple self-consistent equilibria. An econometric version of the model is shown to be identified under relatively weak conditions. That analysis is extended to allow for general error distributions and some preliminary ways to account for the endogeneity of group memberships are developed.
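The self-consistency the abstract refers to can be illustrated with a small fixed-point computation (an assumed parameterization of my own, not the paper's model): each agent's logit choice probabilities depend on the population's choice shares, so an aggregate equilibrium is a fixed point of that map, and a strong interaction weight pushes the population toward consensus on one alternative.

```python
import numpy as np

def equilibrium_shares(u, J, iters=500):
    """Iterate p_j <- exp(u_j + J*p_j) / sum_k exp(u_k + J*p_k):
    logit choice probabilities that feed back on the population
    shares p; an equilibrium is a fixed point of this map."""
    p = np.full(len(u), 1.0 / len(u))
    for _ in range(iters):
        v = u + J * p
        p = np.exp(v - v.max())   # stabilized softmax
        p /= p.sum()
    return p

u = np.array([0.1, 0.0, -0.1])            # private payoffs (assumed)
p_weak = equilibrium_shares(u, J=0.5)     # weak interactions: near-logit shares
p_strong = equilibrium_shares(u, J=6.0)   # strong interactions: consensus on one choice
```

With a large J, iterating from different starting shares can settle on different fixed points, which is the multiplicity of self-consistent equilibria the paper characterizes.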
Download or read book Empirical Bayes Forecasts of One Time Series Using Many Predictors written by Thomas Knox and published by . This book was released on 2001 with total page 168 pages. Available in PDF, EPUB and Kindle. Book excerpt: We consider both frequentist and empirical Bayes forecasts of a single time series using a linear model with T observations and K orthonormal predictors. The frequentist formulation considers estimators that are equivariant under permutations (reorderings) of the regressors. The empirical Bayes formulation (both parametric and nonparametric) treats the coefficients as i.i.d. and estimates their prior. Asymptotically, when K is proportional to T the empirical Bayes estimator is shown to be: (i) optimal in Robbins' (1955, 1964) sense; (ii) the minimum risk equivariant estimator; and (iii) minimax in both the frequentist and Bayesian problems over a class of non-Gaussian error distributions. Also, the asymptotic frequentist risk of the minimum risk equivariant estimator is shown to equal the Bayes risk of the (infeasible subjectivist) Bayes estimator in the Gaussian case, where the 'prior' is the weak limit of the empirical cdf of the true parameter values. Monte Carlo results are encouraging. The new estimators are used to forecast monthly postwar U.S. macroeconomic time series using the first 151 principal components from a large panel of predictors.
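The shrinkage logic can be sketched in its simplest parametric form (a normal-prior sketch of my own, close in spirit to the parametric empirical Bayes case the abstract describes): with orthonormal predictors the OLS coefficients are independent normals, the prior variance is estimated from their spread, and every coefficient is shrunk by a common factor.

```python
import numpy as np

rng = np.random.default_rng(11)
K, sigma2 = 200, 1.0
beta = rng.normal(0.0, 0.5, K)                   # true coefficients, i.i.d. draws (assumed)
b = beta + rng.normal(0.0, np.sqrt(sigma2), K)   # OLS estimates under an orthonormal design

# Parametric EB: estimate the prior variance from the coefficients' spread,
# then shrink every estimate by the same factor tau^2 / (tau^2 + sigma^2).
tau2_hat = max(np.mean(b ** 2) - sigma2, 0.0)
shrink = tau2_hat / (tau2_hat + sigma2)
b_eb = shrink * b

mse_ols = np.mean((b - beta) ** 2)
mse_eb = np.mean((b_eb - beta) ** 2)
```

When many coefficients are small relative to the noise, the shrunk estimates have markedly lower risk than the unshrunk OLS estimates, which is the gain the K-proportional-to-T asymptotics formalize.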
Download or read book Identification and Inference in Nonlinear Difference-in-differences Models written by Susan Athey and published by . This book was released on 2002 with total page 78 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper develops an alternative approach to the widely used Difference-in-Differences (DID) method for evaluating the effects of policy changes. In contrast to the standard approach, we introduce a nonlinear model that permits changes over time in the effect of unobservables (e.g., there may be a time trend in the level of wages as well as the returns to skill in the labor market). Further, our assumptions are independent of the scaling of the outcome. Our approach provides an estimate of the entire counterfactual distribution of outcomes that would have been experienced by the treatment group in the absence of the treatment, and likewise for the untreated group in the presence of the treatment. Thus, it enables the evaluation of policy interventions according to criteria such as a mean-variance tradeoff. We provide conditions under which the model is nonparametrically identified and propose an estimator. We consider extensions to allow for covariates and discrete dependent variables. We also analyze inference, showing that our estimator is root-N consistent and asymptotically normal. Finally, we consider an application.
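The counterfactual-distribution idea can be sketched with empirical CDFs (a changes-in-changes style transform on assumed normal outcomes; the paper's estimator and identifying conditions are more general): each treated period-0 outcome is mapped through the control group's change in distribution, y → F01⁻¹(F00(y)).

```python
import numpy as np

def cic_counterfactual(y00, y01, y10):
    """Changes-in-changes style transform: map each treated period-0
    outcome through the control group's change in distribution,
    y -> F01^{-1}(F00(y)), using empirical CDFs and quantiles."""
    ranks = np.searchsorted(np.sort(y00), y10, side="right") / len(y00)
    ranks = np.clip(ranks, 1e-6, 1 - 1e-6)   # keep quantiles interior
    return np.quantile(y01, ranks)

rng = np.random.default_rng(5)
y00 = rng.normal(0.0, 1.0, 5000)   # control, period 0
y01 = rng.normal(1.0, 2.0, 5000)   # control, period 1: shift and spread
y10 = rng.normal(0.5, 1.0, 5000)   # treated, period 0
y11_cf = cic_counterfactual(y00, y01, y10)   # counterfactual treated, period 1
```

In this normal setup the control group's transform is y → 1 + 2y, so the counterfactual treated distribution should be close to N(2, 4); comparing it with observed treated period-1 outcomes would give the distributional treatment effect.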
Download or read book Using Weights to Adjust for Sample Selection when Auxiliary Information is Available written by Aviv Nevo and published by . This book was released on 2001 with total page 48 pages. Available in PDF, EPUB and Kindle. Book excerpt: In this paper I analyze GMM estimation when the sample is not a random draw from the population of interest. I exploit auxiliary information, in the form of moments from the population of interest, in order to compute weights that are proportional to the inverse probability of selection. The essential idea is to construct weights, for each observation in the primary data, such that the moments of the weighted data are set equal to the additional moments. The estimator is applied to the Dutch Transportation Panel, in which refreshment draws were taken from the population of interest in order to deal with heavy attrition of the original panel. I show how these additional samples can be used to adjust for sample selection.
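The weighting construction described here can be sketched with a one-dimensional exponential tilt (my own illustrative reduction; the paper works with general GMM moment conditions and selection on observables): choose weights proportional to exp(λx) so that the weighted sample moment matches the auxiliary population moment.

```python
import numpy as np

def tilt_weights(x, target_mean, iters=50):
    """Exponential-tilting sketch: weights w_i proportional to
    exp(lam * x_i), with lam chosen by Newton's method so the
    weighted sample mean of x equals the known population moment."""
    lam = 0.0
    for _ in range(iters):
        w = np.exp(lam * (x - x.mean()))   # centered for numerical stability
        w /= w.sum()
        m = w @ x
        v = w @ (x - m) ** 2               # weighted variance = dm/dlam
        lam += (target_mean - m) / v       # Newton step
    return w

rng = np.random.default_rng(9)
# Selected sample over-represents large x (assumed selection mechanism).
x = rng.normal(1.0, 1.0, 2000)            # sample mean near 1.0
w = tilt_weights(x, target_mean=0.6)      # population mean known from auxiliary data
reweighted_mean = w @ x
```

The weights are largest for the observations the selection mechanism under-represented, which is the inverse-probability-of-selection logic the abstract describes.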
Download or read book On the Relationship Between Determinate and MSV Solutions in Linear RE Models written by Bennett T. McCallum and published by . This book was released on 2004 with total page 32 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper considers the possibility that, in linear rational expectations (RE) models, all determinate (uniquely non-explosive) solutions coincide with the minimum state variable (MSV) solution, which is unique by construction. In univariate specifications of the form y(t) = A E(t)y(t+1) + C y(t-1) + u(t), that result holds: if an RE solution is unique and non-explosive, then it is the same as the MSV solution. Also, this result holds for multivariate versions if the A and C matrices commute and a certain regularity condition holds. More generally, however, there are models of this form that possess unique non-explosive solutions that differ from their MSV solutions. Examples are provided and a strategy for easily constructing others is outlined.
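In the scalar case the specification above can be solved explicitly (a sketch with u(t) taken to be white noise and real roots assumed, i.e. 1 - 4AC >= 0; the root selection below follows the usual MSV convention of continuity in C): guessing y(t) = Omega*y(t-1) + Gamma*u(t) and matching coefficients gives A*Omega^2 - Omega + C = 0 and Gamma = 1/(1 - A*Omega).

```python
import numpy as np

def msv_solution(A, C):
    """MSV solution of y(t) = A*E(t)y(t+1) + C*y(t-1) + u(t), scalar case:
    y(t) = Omega*y(t-1) + Gamma*u(t), with Omega the root of
    A*Omega^2 - Omega + C = 0 that goes to 0 as C -> 0."""
    disc = 1.0 - 4.0 * A * C           # assumes real roots: 1 - 4AC >= 0
    omega = (1.0 - np.sqrt(disc)) / (2.0 * A)
    gamma = 1.0 / (1.0 - A * omega)
    return omega, gamma

omega, gamma = msv_solution(A=0.5, C=0.3)
# Verify Omega solves the quadratic and is non-explosive at these values.
resid = 0.5 * omega ** 2 - omega + 0.3
```

The paper's point is that in this univariate setting a determinate (unique non-explosive) solution must coincide with this MSV solution, while multivariate counterexamples exist.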