EBookClubs

Read Books & Download eBooks Full Online

Book Generating Non-standard Multivariate Distributions with an Application to Mismeasurement in the CPI

Download or read book Generating Non-standard Multivariate Distributions with an Application to Mismeasurement in the CPI written by Matthew David Shapiro and published by . This book was released on 1996 with total page 36 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper shows how to generate the joint distribution of correlated random variables with specified marginal distributions. For cases where the marginal distributions are either normal or lognormal, it shows how to calculate analytically the correlation of the underlying normal distributions to induce the desired correlation between the variables. It also provides a method for calculating the joint distribution in the case of arbitrary marginal distributions. The paper applies the technique to calculating the distribution of the overall bias in the consumer price index. The technique should also be applicable to estimation by simulated moments or simulated likelihoods and to Monte Carlo analysis.
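
For the normal/lognormal case mentioned in the abstract, the analytic mapping is standard lognormal moment algebra: if X_i = exp(Z_i) with Z_i ~ N(mu_i, s_i^2) and corr(Z_1, Z_2) = rho_Z, then corr(X_1, X_2) = (exp(rho_Z s_1 s_2) - 1) / sqrt((exp(s_1^2) - 1)(exp(s_2^2) - 1)), which can be inverted for rho_Z. The Python sketch below is our illustration of that idea, not code from the paper; the function names and the simulated check are ours.

```python
import numpy as np

def normal_corr_for_lognormals(rho_x, s1, s2):
    """Correlation of the underlying normals needed so that exp(Z1), exp(Z2)
    have correlation rho_x, when Zi ~ N(mu_i, s_i^2).
    Uses standard lognormal moment formulas (a sketch, not the paper's code)."""
    num = 1.0 + rho_x * np.sqrt((np.exp(s1**2) - 1.0) * (np.exp(s2**2) - 1.0))
    return np.log(num) / (s1 * s2)

def correlated_lognormals(n, mu, sigma, rho_x, rng=None):
    """Draw n pairs of lognormals with given log-scale means/sds and a
    target correlation rho_x on the lognormal scale."""
    rng = np.random.default_rng(rng)
    rho_z = normal_corr_for_lognormals(rho_x, sigma[0], sigma[1])
    cov = np.array([[sigma[0]**2, rho_z * sigma[0] * sigma[1]],
                    [rho_z * sigma[0] * sigma[1], sigma[1]**2]])
    z = rng.multivariate_normal(mu, cov, size=n)
    return np.exp(z)

# Hypothetical check: the realized correlation should be close to the target.
x = correlated_lognormals(100_000, mu=[0.0, 0.0], sigma=[0.5, 0.8], rho_x=0.4, rng=1)
print(np.corrcoef(x.T)[0, 1])
```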

Book NBER Macroeconomics Annual 1996

Download or read book NBER Macroeconomics Annual 1996 written by Ben S. Bernanke and published by MIT Press. This book was released on 1997-02 with total page 420 pages. Available in PDF, EPUB and Kindle. Book excerpt: This is the eleventh volume in a series of annuals from the National Bureau of Economic Research that are designed to present, extend, and apply frontier work in macroeconomics, and to encourage and stimulate work by macroeconomists on current policy issues. These contributions offer a good sample of the current issues and exciting research directions in macroeconomics. Contents: Credit, Business Investment, and Output Fluctuations in Japan, Nobuhiro Kiyotaki and Kenneth D. West * Causes and Consequences of Imperfections in the Consumer Price Index, Matthew D. Shapiro and David Wilcox * A Scorecard for Indexed Government Debt, John Y. Campbell and Robert J. Shiller * Technology Improvements and Productivity Slowdowns: Another Crazy Explanation, Andreas Hornstein and Per Krusell * Are Currency Crises Self-Fulfilling?, Paul Krugman * Inequality and Growth, Roland Benabou

Book Flexible and Generalized Uncertainty Optimization

Download or read book Flexible and Generalized Uncertainty Optimization written by Weldon A. Lodwick and published by Springer. This book was released on 2017-01-17 with total page 197 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book presents the theory and methods of flexible and generalized uncertainty optimization. In particular, it describes the theory of generalized uncertainty in the context of optimization modeling. The book starts with an overview of flexible and generalized uncertainty optimization. It covers uncertainties that are associated with a lack of information and that are more general than those of stochastic theory, where well-defined distributions are assumed. Starting from families of distributions that are enclosed by upper and lower functions, the book presents construction methods for obtaining flexible and generalized uncertainty input data that can be used in a flexible and generalized uncertainty optimization model. It then describes the development of such a model in detail. All in all, the book provides readers with the necessary background to understand flexible and generalized uncertainty optimization and develop their own optimization model.

Book Monotone Instrumental Variables with an Application to the Returns to Schooling

Download or read book Monotone Instrumental Variables with an Application to the Returns to Schooling written by Charles F. Manski and published by . This book was released on 1999 with total page 62 pages. Available in PDF, EPUB and Kindle. Book excerpt: Econometric analyses of treatment response commonly use instrumental variable (IV) assumptions to identify treatment effects. Yet the credibility of IV assumptions is often a matter of considerable disagreement, with much debate about whether some covariate is or is not a "valid instrument" in an application of interest. There is therefore good reason to consider weaker but more credible assumptions. To this end, we introduce monotone instrumental variable (MIV) assumptions. A particularly interesting special case of an MIV assumption is monotone treatment selection (MTS). IV and MIV assumptions may be imposed alone or in combination with other assumptions. We study the identifying power of MIV assumptions in three informational settings: MIV alone; MIV combined with the classical linear response assumption; MIV combined with the monotone treatment response (MTR) assumption. We apply the results to the problem of inference on the returns to schooling. We analyze wage data reported by white male respondents to the National Longitudinal Survey of Youth (NLSY) and use the respondent's AFQT score as an MIV. We find that this MIV assumption has little identifying power when imposed alone. However, combining the MIV assumption with the MTR and MTS assumptions yields fairly tight bounds on two distinct measures of the returns to schooling.
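
To fix ideas, the bounds delivered by an MIV assumption take roughly the following form (a hedged reconstruction of the kind of result developed in this literature, not a quotation from the paper): if the mean response E[y(t) | v = u] is assumed weakly increasing in the instrument v, then for each value u,

\[
\sup_{u_1 \le u} \mathrm{LB}(u_1) \;\le\; E\bigl[y(t)\mid v=u\bigr] \;\le\; \inf_{u_2 \ge u} \mathrm{UB}(u_2),
\]

where LB(u) and UB(u) are whatever bounds on E[y(t) | v = u] hold under the other maintained assumptions (for example, the worst-case bounds for an outcome with bounded support). Monotonicity lets low values of the instrument tighten the lower bound and high values tighten the upper bound.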

Book An Optimization-based Econometric Framework for the Evaluation of Monetary Policy

Download or read book An Optimization-based Econometric Framework for the Evaluation of Monetary Policy written by Julio Rotemberg and published by . This book was released on 1998 with total page 84 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper considers a simple quantitative model of output, interest rate and inflation determination in the United States, and uses it to evaluate alternative rules by which the Fed may set interest rates. The model is derived from optimizing behavior under rational expectations, both on the part of the purchasers of goods and on that of the sellers. The model matches the estimated responses to a monetary policy shock quite well and, once due account is taken of other disturbances, can account for our data nearly as well as an unrestricted VAR. The monetary policy rule that most reduces inflation variability (and is best on this account) requires very variable interest rates, which in turn is possible only in the case of a high average inflation rate. But even in the case of a constrained-optimal policy that takes into account some of the costs of average inflation and constrains the variability of interest rates so as to keep average inflation low, inflation would be stabilized considerably more and output stabilized considerably less than under our estimates of current policy. Moreover, this constrained-optimal policy also allows average inflation to be much smaller. This version contains additional details of our derivations and calculations, including three technical appendices, not included in the version published in NBER Macroeconomics Annual 1997.

Book Asymptotically Median Unbiased Estimation of Coefficient Variance in a Time-Varying Parameter Model

Download or read book Asymptotically Median Unbiased Estimation of Coefficient Variance in a Time-Varying Parameter Model written by James H. Stock and published by . This book was released on 1996 with total page 48 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper considers the estimation of the variance of coefficients in time-varying parameter models with stationary regressors. The maximum likelihood estimator has a large point mass at zero. We therefore develop asymptotically median unbiased estimators and confidence intervals by inverting median functions of regression-based parameter stability test statistics, computed under the constant-parameter null. These estimators have good asymptotic relative efficiencies for small to moderate amounts of parameter variability. We apply these results to an unobserved components model of trend growth in postwar U.S. GDP: the MLE implies that there has been no change in the trend rate, while the upper range of the median-unbiased point estimates implies that the annual trend growth rate has fallen by 0.7 percentage points over the postwar period.
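
For concreteness, the canonical time-varying parameter regression behind this problem can be written (a stylized version, not necessarily the paper's exact notation) as

\[
y_t = x_t'\beta_t + \varepsilon_t, \qquad \beta_t = \beta_{t-1} + \lambda\,\eta_t,
\]

where x_t is stationary and \lambda scales the amount of parameter variation. The object of interest is \lambda (equivalently the variance of the coefficient innovations); when the true \lambda is small, its maximum likelihood estimator has a large probability of being exactly zero, which is the pile-up problem the median-unbiased procedure is designed to address.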

Book Observational Agency and Supply-side Econometrics

Download or read book Observational Agency and Supply-side Econometrics written by Tomas J. Philipson and published by . This book was released on 1997 with total page 58 pages. Available in PDF, EPUB and Kindle. Book excerpt: A central problem in applied empirical work is to separate out the patterns in the data that are due to poor production of the data, such as non-response and measurement error, from the patterns attributable to the economic phenomena studied. This paper interprets this inference problem as an agency problem in the market for observations and suggests ways in which incentives may be used to overcome it. The paper discusses how wage discrimination may be used for identification of economic parameters of interest, taking into account the responses in survey supply by sample members to that discrimination. Random wage discrimination alters the supply behavior of sample members across the same types of populations in terms of outcomes and thereby allows for separating out poor supply from the population parameters of economic interest. Empirical evidence for a survey of US physicians suggests that survey supply even for this wealthy group is affected by the types of wage discrimination schemes discussed in a manner that makes the schemes useful for identification purposes. Using such schemes to correct mean estimates of physician earnings raises those estimates by about one third.

Book Solving Large-Scale Rational Expectations Models

Download or read book Solving Large-Scale Rational Expectations Models written by Jess Gaspar and published by . This book was released on 1997 with total page 60 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Nonparametric Applications of Bayesian Inference

Download or read book Nonparametric Applications of Bayesian Inference written by Gary Chamberlain and published by . This book was released on 1996 with total page 36 pages. Available in PDF, EPUB and Kindle. Book excerpt: The paper evaluates the usefulness of a nonparametric approach to Bayesian inference by presenting two applications. The approach is due to Ferguson (1973, 1974) and Rubin (1981). Our first application considers an educational choice problem. We focus on obtaining a predictive distribution for earnings corresponding to various levels of schooling. This predictive distribution incorporates parameter uncertainty, so that it is relevant for decision making under uncertainty in the expected utility framework of microeconomics. The second application is to quantile regression. Our point here is to examine the potential of the nonparametric framework to provide inferences without making asymptotic approximations. Unlike in the first application, the standard asymptotic normal approximation turns out not to be a good guide. We also consider a comparison with a bootstrap approach.
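
The Ferguson/Rubin machinery referred to here is the Dirichlet-process prior and its limiting case, Rubin's (1981) Bayesian bootstrap, in which posterior uncertainty about a functional of the distribution is represented by re-weighting the observed data with flat Dirichlet weights. The Python sketch below illustrates that mechanism; the data and the functional (a mean of log earnings) are hypothetical and are not taken from the paper.

```python
import numpy as np

def bayesian_bootstrap_draws(y, stat, n_draws=1000, rng=None):
    """Posterior draws of a functional stat(y, w) under Rubin's (1981)
    Bayesian bootstrap: weights are Dirichlet(1, ..., 1) over the observations."""
    rng = np.random.default_rng(rng)
    n = len(y)
    draws = []
    for _ in range(n_draws):
        w = rng.dirichlet(np.ones(n))   # flat Dirichlet weights summing to 1
        draws.append(stat(y, w))
    return np.array(draws)

# Hypothetical example: posterior for the mean of log earnings (fake data).
log_earnings = np.random.default_rng(0).normal(10.0, 0.5, size=200)
post = bayesian_bootstrap_draws(log_earnings, lambda y, w: np.sum(w * y))
print(post.mean(), np.percentile(post, [2.5, 97.5]))
```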

Book An Efficient Generalized Discrete-time Approach to Poisson-Gaussian Bond Option Pricing in the Heath-Jarrow-Morton Model

Download or read book An Efficient Generalized Discrete-time Approach to Poisson-Gaussian Bond Option Pricing in the Heath-Jarrow-Morton Model written by Sanjiv Ranjan Das and published by . This book was released on 1997 with total page 60 pages. Available in PDF, EPUB and Kindle. Book excerpt: Term structure models employing Poisson-Gaussian processes may be used to accommodate the observed skewness and kurtosis of interest rates. This paper extends the discrete-time, pure-Gaussian version of the Heath-Jarrow-Morton model to the pricing of American-type bond options when the underlying term structure of interest rates follows a Poisson-Gaussian process. The Poisson-Gaussian process is specified using a hexanomial tree (six nodes emanating from each node), and the tree is shown to be recombining. The scheme is parsimonious and convergent. This model extends the class of HJM models by (i) introducing a more generalized volatility specification than has been used so far, and (ii) incorporating jumps, yet retaining lattice recombination, thus making the model useful for practical applications.
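
As background, a Poisson-Gaussian (jump-diffusion) forward-rate process in the HJM setting has the generic form (our illustration of the class of dynamics, not the paper's exact specification)

\[
df(t,T) = \alpha(t,T)\,dt + \sigma(t,T)\,dW(t) + J(t,T)\,d\pi(t),
\]

where W is a Brownian motion, \pi is a Poisson process with some intensity \lambda, and J is the jump size. The diffusion term generates the Gaussian component and the jump term the skewness and excess kurtosis mentioned above; the hexanomial tree discretizes these dynamics with six branches per node while remaining recombining.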

Book Statistical Mechanics Approaches to Socioeconomic Behavior

Download or read book Statistical Mechanics Approaches to Socioeconomic Behavior written by Steven N. Durlauf and published by . This book was released on 1996 with total page 50 pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper provides a unified framework for interpreting a wide range of interactions models which have appeared in the economics literature. A formalization taken from the statistical mechanics literature is shown to encompass a number of socioeconomic phenomena ranging from out of wedlock births to aggregate output to crime. The framework bears a close relationship to econometric models of discrete choice and therefore holds the potential for rendering interactions models estimable. A number of new applications of statistical mechanics to socioeconomic problems are suggested.
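
The discrete-choice connection can be made concrete with a minimal binary-choice-with-interactions example in the spirit of this literature (a sketch, not a formula quoted from the paper): each agent i chooses \omega_i \in \{-1, 1\} to trade off a private incentive h against conformity with the expected average choice m_i^e, giving logistic choice probabilities of the Gibbs form

\[
\Pr(\omega_i) \;\propto\; \exp\bigl(\beta\,(h\,\omega_i + J\,\omega_i\,m_i^e)\bigr),
\]

which is exactly the mean-field structure studied in statistical mechanics; J indexes the strength of the social interaction, and the model reduces to a standard logit when J = 0.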

Book Combining Panel Data Sets with Attrition and Refreshment Samples

Download or read book Combining Panel Data Sets with Attrition and Refreshment Samples written by Keisuke Hirano and published by . This book was released on 1998 with total page 54 pages. Available in PDF, EPUB and Kindle. Book excerpt: In many fields researchers wish to consider statistical models that allow for more complex relationships than can be inferred using only cross-sectional data. Panel or longitudinal data where the same units are observed repeatedly at different points in time can often provide the richer data needed for such models. Although such data allows researchers to identify more complex models than cross-sectional data, missing data problems can be more severe in panels. In particular, even units who respond in initial waves of the panel may drop out in subsequent waves, so that the subsample with complete data for all waves of the panel can be less representative of the population than the original sample. Sometimes, in the hope of mitigating the effects of attrition without losing the advantages of panel data over cross-sections, panel data sets are augmented by replacing units who have dropped out with new units randomly sampled from the original population. Following Ridder (1992), who used these replacement units to test some models for attrition, we call such additional samples refreshment samples. We explore the benefits of these samples for estimating models of attrition. We describe the manner in which the presence of refreshment samples allows the researcher to test various models for attrition in panel data, including models based on the assumption that missing data are missing at random (MAR, Rubin, 1976; Little and Rubin, 1987). The main result in the paper makes precise the extent to which refreshment samples are informative about the attrition process; a class of non-ignorable missing data models can be identified without making strong distributional or functional form assumptions if refreshment samples are available.
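
As a rough indication of the kind of attrition model involved (our paraphrase; the paper's exact class and conditions differ in detail): with two waves and outcomes (y_1, y_2), refreshment samples permit identification of attrition probabilities that are additive in the two outcomes inside a link function,

\[
\Pr(\text{attrit} \mid y_1, y_2) \;=\; g\bigl(k_0 + k_1(y_1) + k_2(y_2)\bigr),
\]

a class that nests missing-at-random attrition (k_2 \equiv 0) and purely outcome-dependent attrition (k_1 \equiv 0) without imposing a parametric distribution on the outcomes.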

Book Maximum Likelihood Estimation of Discretely Sampled Diffusions

Download or read book Maximum Likelihood Estimation of Discretely Sampled Diffusions written by Yacine Aït-Sahalia and published by . This book was released on 1998 with total page 64 pages. Available in PDF, EPUB and Kindle. Book excerpt: When a continuous-time diffusion is observed only at discrete dates, not necessarily close together, the likelihood function of the observations is in most cases not explicitly computable. Researchers have relied on simulations of sample paths between the observation points, or numerical solutions of partial differential equations, to obtain estimates of the function to be maximized. By contrast, we construct a sequence of fully explicit functions which we show converge under very general conditions, including non-ergodicity, to the true (but unknown) likelihood function of the discretely-sampled diffusion. We document that the rate of convergence of the sequence is extremely fast for a number of examples relevant in finance. We then show that maximizing the sequence instead of the true function results in an estimator which converges to the true maximum-likelihood estimator and shares its asymptotic properties of consistency, asymptotic normality and efficiency. Applications to the valuation of derivative securities are also discussed.
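
The estimation problem can be stated compactly (notation is ours, for illustration): for a diffusion dX_t = \mu(X_t;\theta)\,dt + \sigma(X_t;\theta)\,dW_t observed at dates t_0 < t_1 < \dots < t_n with spacings \Delta_i, the Markov property gives the log-likelihood

\[
\ell(\theta) \;=\; \sum_{i=1}^{n} \log p\bigl(\Delta_i,\, X_{t_i} \mid X_{t_{i-1}};\, \theta\bigr),
\]

where p is the transition density, which in most models has no closed form; the paper's contribution is a sequence of explicit approximations to p whose maximizers converge to the true maximum-likelihood estimator.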

Book Efficient Intertemporal Allocations with Recursive Utility

Download or read book Efficient Intertemporal Allocations with Recursive Utility written by Bernard Dumas and published by . This book was released on 1999 with total page 46 pages. Available in PDF, EPUB and Kindle. Book excerpt: In this article, our objective is to determine efficient allocations in economies with multiple agents having recursive utility functions. Our main result is to show that in a multiagent economy, the problem of determining efficient allocations can be characterized in terms of a single value function (that of a social planner), rather than multiple functions (one for each investor), as has been proposed thus far (Duffie, Geoffard and Skiadas (1994)). We then show how the single value function can be identified using the familiar technique of stochastic dynamic programming. We achieve these goals by first extending to a stochastic environment Geoffard's (1996) concept of variational utility and his result that variational utility is equivalent to recursive utility, and then using these results to characterize allocations in a multiagent setting.
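
For readers unfamiliar with the terminology, recursive utility replaces time-additive expected utility with an aggregator, schematically (a generic illustration, not the paper's exact formulation)

\[
V_t \;=\; W\bigl(c_t,\; \mu_t(V_{t+1})\bigr),
\]

where \mu_t is a conditional certainty equivalent. The planner's problem studied here maximizes a weighted combination of the agents' recursive utilities subject to aggregate feasibility, and the paper's point is that this problem admits a single value function amenable to standard stochastic dynamic programming.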

Book Computational Economics and Economic Theory

Download or read book Computational Economics and Economic Theory written by Kenneth L. Judd and published by . This book was released on 1997 with total page 64 pages. Available in PDF, EPUB and Kindle. Book excerpt: This essay examines the idea and potential of a computational approach to theory, discusses methodological issues raised by such computational methods, and outlines the problems associated with the dissemination of computational methods and the exposition of computational results. We argue that the study of a theory need not be confined to proving theorems, that current and future computer technologies create new possibilities for theoretical analysis, and that by resolving these issues we can create an intellectual atmosphere in which computational methods will make substantial contributions to economic analysis.

Book Solving Dynamic Equilibrium Models by a Method of Undetermined Coefficients

Download or read book Solving Dynamic Equilibrium Models by a Method of Undetermined Coefficients written by Lawrence J. Christiano and published by . This book was released on 1998 with total page 54 pages. Available in PDF, EPUB and Kindle. Book excerpt: I present an undetermined coefficients method for obtaining a linear approximation to the solution of a dynamic, rational expectations model. I also show how that solution can be used to compute the model's implications for impulse response functions and for second moments.
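
In outline, the method works on a log-linearized equilibrium system: writing the endogenous variables as z_t and the exogenous shocks as s_t with s_t = P s_{t-1} + \varepsilon_t, one posits a linear solution of the form (a sketch of the generic structure, not the paper's full notation)

\[
z_t = A z_{t-1} + B s_t,
\]

substitutes it into the expectational difference equations, and matches coefficients: A solves a matrix quadratic equation implied by the system, and B then solves a linear equation given A. The impulse responses and second moments mentioned above follow directly from the implied state-space representation.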

Book Hierarchical Bayes Models with Many Instrumental Variables

Download or read book Hierarchical Bayes Models with Many Instrumental Variables written by Gary Chamberlain and published by . This book was released on 1996 with total page 42 pages. Available in PDF, EPUB and Kindle. Book excerpt: In this paper, we explore Bayesian inference in models with many instrumental variables that are potentially weakly correlated with the endogenous regressor. The prior distribution has a hierarchical (nested) structure. We apply the methods to the Angrist-Krueger (AK, 1991) analysis of returns to schooling using instrumental variables formed by interacting quarter of birth with state/year dummy variables. Bound, Jaeger, and Baker (1995) show that randomly generated instrumental variables, designed to match the AK data set, give two-stage least squares results that look similar to the results based on the actual instrumental variables. Using a hierarchical model with the AK data, we find a posterior distribution for the parameter of interest that is tight and plausible. Using data with randomly generated instruments, the posterior distribution is diffuse. Most of the information in the AK data can in fact be extracted with quarter of birth as the single instrumental variable. Using artificial data patterned on the AK data, we find that if all the information had been in the interactions between quarter of birth and state/year dummies, then the hierarchical model would still have led to precise inferences, whereas the single instrument model would have suggested that there was no information in the data. We conclude that hierarchical modeling is a conceptually straightforward way of efficiently combining many weak instrumental variables.
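
The hierarchical structure can be sketched as follows (notation is ours, for illustration): in the first-stage equation relating the endogenous regressor x_i to the many instruments z_i and covariates w_i,

\[
x_i = z_i'\pi + w_i'\phi + v_i, \qquad \pi_j \mid \tau \;\sim\; \text{i.i.d. } N(0, \tau^2),
\]

with a prior on \tau forming the second layer of the hierarchy. When the instruments carry little information, the posterior pulls \tau (and hence the \pi_j) toward zero and the posterior for the return to schooling stays diffuse, rather than producing the spuriously precise estimates that two-stage least squares can give with many weak instruments.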