EBookClubs

Read Books & Download eBooks Full Online

Book Efficient Treatment Effect Estimation with Dimension Reduction

Download or read book Efficient Treatment Effect Estimation with Dimension Reduction written by Ying Zhang and published by . This book was released on 2018 with total page 0 pages. Available in PDF, EPUB and Kindle. Book excerpt: Estimation of average and quantile treatment effects is crucial in causal inference for evaluation of treatments or interventions in biomedical, economic, and social studies. Under the assumption that treatment and potential outcomes are independent conditional on all covariates, valid treatment effect estimators can be obtained using nonparametric inverse propensity weighting and/or regression, which are popular because no model for the propensity score or the regression function is imposed. To obtain valid and efficient treatment effect estimators, the set of all covariates can typically be replaced by lower dimensional sets containing linear combinations of covariates. We propose to construct a lower dimensional set separately for each treatment and show that the resulting asymptotic variance of the treatment effect estimator reaches a lower bound that is smaller than those based on other sets. Since the lower dimensional sets have to be constructed, for example, using nonparametric sufficient dimension reduction, we derive theoretical results on when the efficiency of treatment effect estimation is affected by sufficient dimension reduction. We find that, except for some special cases, the efficiency of treatment effect estimation is affected even though the sufficient dimension reduction is consistent at the rate of the square root of the sample size. As the causal setting is similar to that of missing data, we apply the same techniques to handle missing covariate values in estimating equations. Our theory is complemented by some simulation results. We use data from the University of Wisconsin Health Accountable Care Organization as an example for average/quantile treatment effect estimation, and the automobile data from the University of California-Irvine as an example for estimating regression parameters in estimating equations with missing covariate values.
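
A minimal Python sketch of the general idea in this excerpt: replace the full covariate vector with a low-dimensional linear index constructed separately for each treatment arm, then estimate the average treatment effect from nonparametric regressions on that index. The within-arm least-squares fit stands in for nonparametric sufficient dimension reduction, the k-nearest-neighbour regression stands in for the nonparametric step, and the simulated data and variable names are assumptions made only for illustration.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(0)
    n, p = 4000, 10
    X = rng.normal(size=(n, p))
    index = X[:, 0] + 0.5 * X[:, 1]                      # true low-dimensional structure
    T = rng.binomial(1, 1.0 / (1.0 + np.exp(-index)))    # treatment assignment
    Y = 2.0 * T + index + 0.5 * rng.normal(size=n)       # true ATE = 2

    def imputed_mean(arm):
        # one linear combination of X per treatment arm, then a 1-d nonparametric fit
        beta = LinearRegression().fit(X[T == arm], Y[T == arm]).coef_
        z = X @ beta
        knn = KNeighborsRegressor(n_neighbors=50).fit(z[T == arm].reshape(-1, 1), Y[T == arm])
        return np.mean(knn.predict(z.reshape(-1, 1)))    # average over the full sample

    print(f"ATE via per-arm dimension reduction: {imputed_mean(1) - imputed_mean(0):.3f} (truth 2.0)")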

Book The Value of Knowing the Propensity Score for Estimating Average Treatment Effects

Download or read book The Value of Knowing the Propensity Score for Estimating Average Treatment Effects written by Christoph Rothe and published by . This book was released on 2016 with total page 27 pages. Available in PDF, EPUB and Kindle. Book excerpt: In a treatment effect model with unconfoundedness, treatment assignments are not only independent of potential outcomes given the covariates, but also given the propensity score alone. Despite this powerful dimension reduction property, adjusting for the propensity score is known to lead to an estimator of the average treatment effect with lower asymptotic efficiency than one based on adjusting for all covariates. Moreover, knowledge of the propensity score does not change the efficiency bound for estimating average treatment effects, and many empirical strategies are more efficient when an estimate of the propensity score is used instead of its true value. Here, we resolve this "propensity score paradox" by demonstrating the value of knowledge of the propensity score. We show that by exploiting such knowledge properly, it is possible to construct an efficient treatment effect estimator that is not affected by the "curse of dimensionality", which yields desirable second-order asymptotic properties and finite-sample performance. The method combines knowledge of the propensity score with a nonparametric adjustment for covariates, building on ideas from the literature on double robust estimation. It is straightforward to implement, and performs well in simulations. We also show that confidence intervals based on our estimator and a simple variance estimate have remarkably robust coverage properties with respect to the implementation details of the nonparametric adjustment step.
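
As a rough illustration of the kind of estimator discussed here, the sketch below combines a known propensity score with a nonparametric (random forest) covariate adjustment in a doubly robust, AIPW-style average treatment effect estimate. This is not the author's construction: the forest adjustment, the simulated data, and all names are illustrative assumptions.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    n, p = 2000, 5
    X = rng.normal(size=(n, p))
    e = 1.0 / (1.0 + np.exp(-X[:, 0]))                    # known propensity score
    T = rng.binomial(1, e)
    Y = 1.5 * T + X[:, 0] ** 2 + rng.normal(size=n)       # true ATE = 1.5

    # nonparametric outcome adjustment in each arm
    rf1 = RandomForestRegressor(n_estimators=200, random_state=0)
    rf0 = RandomForestRegressor(n_estimators=200, random_state=0)
    m1 = rf1.fit(X[T == 1], Y[T == 1]).predict(X)
    m0 = rf0.fit(X[T == 0], Y[T == 0]).predict(X)

    # doubly robust (AIPW-style) combination using the known score e
    ate = np.mean(m1 - m0 + T * (Y - m1) / e - (1 - T) * (Y - m0) / (1 - e))
    print(f"doubly robust ATE with known propensity score: {ate:.3f} (truth 1.5)")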

Book Essays on Treatment Effect Estimation and Treatment Choice Learning

Download or read book Essays on Treatment Effect Estimation and Treatment Choice Learning written by Liqiang Shi and published by . This book was released on 2022 with total page 119 pages. Available in PDF, EPUB and Kindle. Book excerpt: This dissertation consists of three chapters that study treatment effect estimation and treatment choice learning under the potential outcome framework (Neyman, 1923; Rubin, 1974). The first two chapters study how to efficiently combine an experimental sample with an auxiliary observational sample when estimating treatment effects. In chapter 1, I derive a new semiparametric efficiency bound under the two-sample setup for estimating the ATE and other functions of the average potential outcomes. The efficiency bound for estimating the ATE with an experimental sample alone is derived in Hahn (1998) and has since become an important reference point for studies that aim at improving ATE estimation. This chapter answers how an auxiliary sample containing only observable characteristics (covariates, or features) can lower this efficiency bound. The newly obtained bound has an intuitive expression and shows that the (maximum possible) amount of variance reduction depends positively on two factors: 1) the size of the auxiliary sample, and 2) how well the covariates predict the individual treatment effect. The latter naturally motivates having high-dimensional covariates and adopting modern machine learning methods to avoid over-fitting. In chapter 2, under the same setup, I propose a two-stage machine learning (ML) imputation estimator that achieves the efficiency bound derived in chapter 1, so that no other regular estimator for the ATE can have lower asymptotic variance in the same setting. This estimator involves two steps. In the first step, conditional average potential outcome functions are estimated nonparametrically via ML and then used to impute the unobserved potential outcomes for every unit in both samples. In the second step, the imputed potential outcomes are aggregated in a robust way to produce the final estimate. Adopting the cross-fitting technique proposed in Chernozhukov et al. (2018), our two-step estimator can use a wide range of supervised ML tools in its first step while maintaining valid inference for constructing confidence intervals and performing hypothesis tests. In fact, any method that estimates the relevant conditional mean functions consistently in square norm, with no rate requirement, leads to efficiency through the proposed two-step procedure. I also show that cross-fitting is not necessary when the first step is implemented via LASSO or post-LASSO. Furthermore, our estimator is robust in the sense that it remains consistent and root-n normal (though no longer efficient) even if the first-step estimators are inconsistent. Chapter 3 (coauthored with Kirill Ponomarev) studies model selection in treatment choice learning. When treatment effects are heterogeneous, a decision maker, given either experimental or quasi-experimental data, can attempt to find a policy function that maps observable characteristics to treatment choices, aiming at maximizing utilitarian welfare. When doing so, one often has to pick a constrained class of functions as candidates for the policy function. The choice of this function class poses a model selection problem. Following Mbakop and Tabord-Meehan (2021), we propose a policy learning algorithm that incorporates data-driven model selection. Our method also leverages doubly robust estimation (Athey and Wager, 2021) so that it retains the optimal root-n rate in expected regret in general setups, including quasi-experiments where propensity scores are unknown. We also refine some related results in the literature and derive a new finite-sample lower bound on expected regret to show that the root-n rate is indeed optimal.
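
A minimal sketch of the two-step imputation idea from chapters 1 and 2, under assumptions made only for illustration: conditional mean outcomes are fit on the experimental sample with cross-fitting, potential outcomes are imputed for every unit in both the experimental and the covariate-only auxiliary sample, and the imputed effects are pooled. The robust aggregation and debiasing used in the dissertation are omitted here, and the data-generating process and model choices are assumptions.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(2)
    n_exp, n_aux, p = 1000, 4000, 5
    Xe = rng.normal(size=(n_exp, p))
    Te = rng.binomial(1, 0.5, size=n_exp)                 # randomized experiment
    Ye = Te * (1.0 + Xe[:, 0]) + Xe[:, 1] + rng.normal(size=n_exp)   # true ATE = 1
    Xa = rng.normal(size=(n_aux, p))                      # auxiliary sample: covariates only

    # cross-fitted imputation of individual effects in the experimental sample
    tau_exp = np.zeros(n_exp)
    for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(Xe):
        Xtr, Ttr, Ytr = Xe[train], Te[train], Ye[train]
        m1 = GradientBoostingRegressor().fit(Xtr[Ttr == 1], Ytr[Ttr == 1])
        m0 = GradientBoostingRegressor().fit(Xtr[Ttr == 0], Ytr[Ttr == 0])
        tau_exp[test] = m1.predict(Xe[test]) - m0.predict(Xe[test])

    # impute effects for the auxiliary sample with models fit on the full experiment
    m1 = GradientBoostingRegressor().fit(Xe[Te == 1], Ye[Te == 1])
    m0 = GradientBoostingRegressor().fit(Xe[Te == 0], Ye[Te == 0])
    tau_aux = m1.predict(Xa) - m0.predict(Xa)

    ate = np.mean(np.concatenate([tau_exp, tau_aux]))     # pool both samples
    print(f"pooled imputation estimate of the ATE: {ate:.3f} (truth 1.0)")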

Book Treatment Effect Estimation with Censored Outcome and Covariate Selection

Download or read book Treatment Effect Estimation with Censored Outcome and Covariate Selection written by Li Li and published by . This book was released on 2023 with total page 0 pages. Available in PDF, EPUB and Kindle. Book excerpt: Covariate selection is essential when faced with many variables in modern causal inference in a data-rich environment. In particular, the efficiency of average causal effect (ACE) estimation can be improved by including in the propensity score (PS) model covariates related only to the outcome, and reduced by including covariates related to the treatment but not the outcome. In this paper, we estimate the causal effect in the presence of a censored outcome and high-dimensional covariates. To improve the efficiency of the estimation of the ACE, we propose the censored outcome adaptive Lasso (COAL) to select covariates, where weighted least squares is applied to account for censoring. Based on the selected covariates, we propose a double inverse propensity weighted estimator for the ACE. Furthermore, we establish the oracle properties of the variable selection procedure and derive the asymptotic properties of the proposed estimator.
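
The sketch below illustrates only the covariate-selection idea behind this kind of approach, ignoring censoring for brevity: an adaptive lasso with weights from a pilot outcome regression keeps outcome-related covariates, and the selected set then enters the propensity score used for inverse probability weighting. The censoring adjustment and the double inverse propensity weighted estimator of the paper are not reproduced; the data-generating process and tuning choices are assumptions.

    import numpy as np
    from sklearn.linear_model import Lasso, LinearRegression, LogisticRegression

    rng = np.random.default_rng(3)
    n, p = 1500, 20
    X = rng.normal(size=(n, p))
    T = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X[:, 0] + X[:, 5]))))   # X5 affects treatment only
    Y = 1.0 * T + 2.0 * X[:, 0] + X[:, 1] + rng.normal(size=n)        # outcome uses X0, X1; ACE = 1

    Xc, Yc = X[T == 0], Y[T == 0]                        # outcome model in the control arm
    pilot = LinearRegression().fit(Xc, Yc).coef_         # pilot coefficients give adaptive weights
    w = 1.0 / (np.abs(pilot) + 1e-6)
    sel = np.abs(Lasso(alpha=0.1).fit(Xc / w, Yc).coef_ / w) > 1e-8   # adaptive lasso selection

    ps = LogisticRegression().fit(X[:, sel], T).predict_proba(X[:, sel])[:, 1]
    ace = np.mean(T * Y / ps - (1 - T) * Y / (1 - ps))
    print(f"selected covariates: {np.where(sel)[0]}, IPW estimate of ACE: {ace:.3f} (truth 1.0)")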

Book Developing a Protocol for Observational Comparative Effectiveness Research: A User's Guide

Download or read book Developing a Protocol for Observational Comparative Effectiveness Research: A User's Guide written by Agency for Health Care Research and Quality (U.S.) and published by Government Printing Office. This book was released on 2013-02-21 with total page 236 pages. Available in PDF, EPUB and Kindle. Book excerpt: This User’s Guide is a resource for investigators and stakeholders who develop and review observational comparative effectiveness research protocols. It explains how to (1) identify key considerations and best practices for research design; (2) build a protocol based on these standards and best practices; and (3) judge the adequacy and completeness of a protocol. Eleven chapters cover all aspects of research design, including: developing study objectives, defining and refining study questions, addressing the heterogeneity of treatment effect, characterizing exposure, selecting a comparator, defining and measuring outcomes, and identifying optimal data sources. Checklists of guidance and key considerations for protocols are provided at the end of each chapter. The User’s Guide was created by researchers affiliated with AHRQ’s Effective Health Care Program, particularly those who participated in AHRQ’s DEcIDE (Developing Evidence to Inform Decisions About Effectiveness) program. Chapters were subject to multiple internal and external independent reviews. For more information, please consult the Agency website: www.effectivehealthcare.ahrq.gov.

Book Semiparametric Approaches for Average Causal Effect and Precision Medicine

Download or read book Semiparametric Approaches for Average Causal Effect and Precision Medicine written by Trinetri Ghosh and published by . This book was released on 2021 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: The average causal effect is often used to compare treatments or interventions in both randomized and observational studies. It has a wide variety of applications in the medical, natural, and social sciences, for example in psychology, political science, and economics. Due to the increased availability of high-dimensional pre-treatment information, dimension reduction is a major methodological issue in observational studies when estimating the average causal effect of a non-randomized treatment. Assumptions are often made to ensure model identifiability and to establish theoretical guarantees for the nuisance conditional models, but these assumptions can be restrictive. In the first work (Chapter 2), to estimate the average causal effect in an observational study, we use a semiparametric locally efficient dimension-reduction approach to assess the treatment assignment mechanism and the average responses in both the treated and the non-treated groups. We then integrate our results using imputation, inverse probability weighting, and doubly robust augmentation estimators. Doubly robust estimators are locally efficient, and imputation estimators are super-efficient when the response models are correct. To take advantage of both procedures, we introduce a shrinkage estimator that combines the two. The proposed estimators retain the double robustness property while improving on the variance when the response model is correct. We demonstrate the performance of these estimators using simulated experiments and a real data set on the effect of maternal smoking on baby birth weight. In the second work (Chapter 3), we implement a semiparametric efficient method in an emerging area, precision medicine, an approach to tailoring disease prevention and treatment that takes into account individual variability in genes, environment, and lifestyle. The goal of precision medicine is to deploy the appropriate, optimal treatment based on a patient's individual characteristics to maximize the clinical benefit. In this work, we propose a new modeling and estimation approach to select the optimal treatment regime from two options by constructing a robust estimating equation. The method is protected against misspecification of the propensity score function, the outcome regression model for the non-treated group, or the potential non-monotonic treatment difference model. Nonparametric smoothing and dimension reduction are incorporated to estimate the treatment difference model. We then identify the optimal treatment by maximizing the value function and establish theoretical properties of the treatment assignment strategy. We illustrate the performance and effectiveness of our proposed estimators through extensive simulation studies and a real-world application to Huntington's disease patients. In the third work (Chapter 4), we aim to obtain optimal individualized treatment rules in covariate-adjusted randomization clinical trials with many covariates. We model the treatment effect with an unspecified function of a single index of the covariates and leave the baseline response completely arbitrary.
We devise a class of estimators to consistently estimate the treatment effect function and its associated index while bypassing the estimation of the baseline response, which is subject to the curse of dimensionality. We further develop inference tools to identify predictive covariates and isolate effective treatment regions. The usefulness of the methods is demonstrated in both simulations and a clinical data example.
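
A minimal sketch of the shrinkage idea described for Chapter 2: compute an imputation (outcome regression) estimator and a doubly robust (AIPW) estimator of the average causal effect, then combine them with a data-driven weight. Here the weight simply minimizes a bootstrap estimate of the combination's variance; the dissertation's dimension-reduction step and its shrinkage rule are not reproduced, and the simulated data and models are illustrative assumptions.

    import numpy as np
    from sklearn.linear_model import LinearRegression, LogisticRegression

    def estimators(X, T, Y):
        # imputation (outcome regression) and doubly robust (AIPW) estimates of the ACE
        m1 = LinearRegression().fit(X[T == 1], Y[T == 1]).predict(X)
        m0 = LinearRegression().fit(X[T == 0], Y[T == 0]).predict(X)
        e = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
        tau_imp = np.mean(m1 - m0)
        tau_dr = np.mean(m1 - m0 + T * (Y - m1) / e - (1 - T) * (Y - m0) / (1 - e))
        return tau_imp, tau_dr

    rng = np.random.default_rng(4)
    n, p = 1500, 4
    X = rng.normal(size=(n, p))
    T = rng.binomial(1, 1.0 / (1.0 + np.exp(-X[:, 0])))
    Y = 1.0 * T + X[:, 0] + rng.normal(size=n)            # true ACE = 1

    boot = []
    for _ in range(200):                                   # bootstrap both estimators
        idx = rng.integers(0, n, n)
        boot.append(estimators(X[idx], T[idx], Y[idx]))
    cov = np.cov(np.array(boot).T)
    # weight on the imputation estimator that minimizes the variance of the combination
    lam = (cov[1, 1] - cov[0, 1]) / (cov[0, 0] + cov[1, 1] - 2 * cov[0, 1])
    lam = float(np.clip(lam, 0.0, 1.0))
    tau_imp, tau_dr = estimators(X, T, Y)
    print(f"imputation {tau_imp:.3f}, doubly robust {tau_dr:.3f}, "
          f"shrinkage {lam * tau_imp + (1 - lam) * tau_dr:.3f}")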

Book Efficient Estimation of Average Treatment Effects Using the Estimated Propensity Score

Download or read book Efficient Estimation of Average Treatment Effects Using the Estimated Propensity Score written by Keisuke Hirano and published by . This book was released on 2000 with total page 68 pages. Available in PDF, EPUB and Kindle. Book excerpt: We are interested in estimating the average effect of a binary treatment on a scalar outcome. If assignment to the treatment is independent of the potential outcomes given pretreatment variables, biases associated with simple treatment-control average comparisons can be removed by adjusting for differences in the pre-treatment variables. Rosenbaum and Rubin (1983, 1984) show that adjusting solely for differences between treated and control units in a scalar function of the pre-treatment variables, the propensity score, also removes the entire bias associated with differences in pre-treatment variables. Thus it is possible to obtain unbiased estimates of the treatment effect without conditioning on a possibly high-dimensional vector of pre-treatment variables. Although adjusting for the propensity score removes all the bias, this can come at the expense of efficiency. We show that weighting with the inverse of a nonparametric estimate of the propensity score, rather than the true propensity score, leads to efficient estimates of the various average treatment effects. This result holds whether the pre-treatment variables have discrete or continuous distributions. We provide intuition for this result in a number of ways. First, we show that with discrete covariates, exact adjustment for the estimated propensity score is identical to adjustment for the pre-treatment variables. Second, we show that weighting by the inverse of the estimated propensity score can be interpreted as an empirical likelihood estimator that efficiently incorporates the information about the propensity score. Finally, we make a connection to other results on efficient estimation through weighting in the context of variable probability sampling.
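
The small Monte Carlo sketch below illustrates the paper's headline phenomenon: inverse probability weighting with an estimated propensity score tends to have lower variance than weighting with the true score. A correctly specified logistic fit stands in for the nonparametric estimator studied in the paper, and the simulation design is an assumption made only for the example.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)
    est_true, est_fit = [], []
    for _ in range(500):                                  # repeat the experiment
        n = 500
        X = rng.normal(size=(n, 1))
        e = 1.0 / (1.0 + np.exp(-X[:, 0]))                # true propensity score
        T = rng.binomial(1, e)
        Y = 1.0 * T + X[:, 0] + rng.normal(size=n)        # true ATE = 1
        e_hat = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
        est_true.append(np.mean(T * Y / e - (1 - T) * Y / (1 - e)))
        est_fit.append(np.mean(T * Y / e_hat - (1 - T) * Y / (1 - e_hat)))

    print(f"sd of IPW with true score: {np.std(est_true):.3f}, "
          f"with estimated score: {np.std(est_fit):.3f}")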

Book Semiparametric Dimension Reduction Model and Applications

Download or read book Semiparametric Dimension Reduction Model and Applications written by Ge Zhao and published by . This book was released on 2019 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: In the robust nonparametric kernel regression context, we prescribe a data-driven method to select the trimming parameter and the bandwidth robustly. The estimator is obtained by solving estimating equations, and it controls the effect of outlying observations through a combination of weighting and trimming. We show asymptotic consistency, establish the estimation bias and variance properties, and derive the asymptotic distribution of the resulting estimator. The finite sample performance of the estimator is illustrated through both simulation studies and an analysis of a problem related to wind power generation, which motivated this study in the first place. We propose a general index model for survival data, which generalizes many commonly used semiparametric survival models and belongs to the framework of dimension reduction. Using a combination of the geometric approach in semiparametrics and martingale treatment in survival data analysis, we devise estimation procedures that are feasible and do not require covariate-independent censoring, as assumed in many dimension reduction methods for censored survival data. We establish the root-n consistency and asymptotic normality of the proposed estimators and derive the most efficient estimator in this class for the general index model. Numerical experiments are carried out to demonstrate the empirical performance of the proposed estimators, and an application to AIDS data further illustrates the usefulness of the work. Kidney transplantation is the most effective renal replacement therapy for renal failure patients. With the severe shortage of kidney supplies and for the clinical effectiveness of transplantation, it is crucial to design objective measures, such as the Estimated Post-Transplant Survival (EPTS) score, to quantify the benefit that a renal failure patient would gain from a potential transplantation by comparing the expected residual lives of the same patient with and without a transplant. However, in the current EPTS system, the most dominant predictors are severe comorbidity conditions (such as diabetes) and age, which might preclude old and sick patients from receiving transplants. To help design a fairer score system, we propose a flexible and general covariate-dependent mean residual life model to estimate the EPTS. Our method is both efficient and robust, as the covariate effect is estimated via a semiparametrically efficient estimator, while the mean residual life function is estimated nonparametrically. We further provide a formula to predict the residual life increment potential for any given patient. Our method would facilitate allocating kidneys to patients who would have the largest residual life increment among all eligible candidates. Our analysis of the kidney transplant data from the U.S. Scientific Registry of Transplant Recipients indicated that the most important predictor is the waiting time for transplantation: a shorter waiting time may lead to larger potential gains. We also identified an index that could serve as an additional important predictor when the waiting time is approximately between 1.5 and three years. As our framework is general, we envision that our analytical strategies can be adopted in other organ transplantation settings.
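
A minimal sketch of the first project's theme, robust nonparametric kernel regression with weighting and trimming of outlying observations. The Gaussian kernel, Huber-type weights, fixed bandwidth, and fixed trimming threshold below are illustrative assumptions; the dissertation selects the trimming parameter and the bandwidth in a data-driven way.

    import numpy as np

    def robust_kernel_fit(x, y, grid, h=0.08, c=1.345, n_iter=10):
        """Local constant fit solving a Huber-type weighted estimating equation at each grid point."""
        fitted = np.empty_like(grid, dtype=float)
        for j, x0 in enumerate(grid):
            k = np.exp(-0.5 * ((x - x0) / h) ** 2)         # kernel weights
            m = np.sum(k * y) / np.sum(k)                  # initial (non-robust) estimate
            for _ in range(n_iter):
                r = y - m
                s = np.median(np.abs(r)) / 0.6745 + 1e-12  # robust scale estimate
                w = np.clip(c * s / np.maximum(np.abs(r), 1e-12), None, 1.0)  # Huber weights
                w[np.abs(r) > 5.0 * s] = 0.0               # trim gross outliers
                m = np.sum(k * w * y) / np.sum(k * w)
            fitted[j] = m
        return fitted

    rng = np.random.default_rng(6)
    x = rng.uniform(0.0, 1.0, 400)
    y = np.sin(2.0 * np.pi * x) + 0.2 * rng.normal(size=400)
    y[rng.integers(0, 400, 20)] += 8.0                     # inject outliers
    grid = np.linspace(0.0, 1.0, 9)
    print(np.round(robust_kernel_fit(x, y, grid), 2))      # roughly tracks sin(2*pi*x)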

Book Nonparametric and Semiparametric Models

Download or read book Nonparametric and Semiparametric Models written by Wolfgang Karl Härdle and published by Springer Science & Business Media. This book was released on 2012-08-27 with total page 317 pages. Available in PDF, EPUB and Kindle. Book excerpt: The statistical and mathematical principles of smoothing with a focus on applicable techniques are presented in this book. It naturally splits into two parts: the first part is intended for undergraduate students majoring in mathematics, statistics, econometrics or biometrics, whereas the second part is intended for master's and PhD students or researchers. The material is easy to work through, since the e-book character of the text gives a maximum of flexibility in learning (and teaching) intensity.

Book Semiparametric Efficient Estimation of Treatment Effect in a Pretest Posttest Study with Missing Data

Download or read book Semiparametric Efficient Estimation of Treatment Effect in a Pretest Posttest Study with Missing Data written by and published by . This book was released on 2004 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: Inference on treatment effect in a pretest-posttest study is a routine objective in medicine, public health, and other fields, and a number of approaches have been advocated. Typically, subjects are randomized to two treatments, the response is measured at baseline and a prespecified follow-up time, and interest focuses on the effect of treatment on follow-up mean response. Covariate information at baseline and in the intervening period until follow-up may also be collected. Missing posttest response for some subjects is routine, and disregarding these missing cases can lead to biased and inefficient inference. Despite the widespread popularity of this design, a consensus on an appropriate method of analysis when no data are missing, let alone on an accepted practice for taking account of missing follow-up response, does not exist. We take a semiparametric perspective, making no assumptions about the distributions of baseline and posttest responses. Exploiting the work of Robins et al. (1994), we characterize the class of all consistent estimators for treatment effect, identify the efficient member of this class, and propose practical procedures for implementation. The result is a unified framework for handling pretest-posttest inferences when follow-up response may be missing at random that allows the analyst to incorporate baseline and intervening information so as to improve efficiency of inference. Simulation studies and application to data from an HIV clinical trial illustrate the utility of the approach.
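
A minimal sketch of the setting described here, under illustrative working models: with follow-up responses missing at random given baseline information, each arm's follow-up mean is estimated by combining inverse weighting by the estimated probability of being observed with a regression augmentation, and the treatment effect is their difference. The logistic and linear working models and the simulated trial are assumptions, not the authors' semiparametric efficient estimator.

    import numpy as np
    from sklearn.linear_model import LinearRegression, LogisticRegression

    rng = np.random.default_rng(7)
    n = 2000
    base = rng.normal(size=n)                              # baseline (pretest) response
    T = rng.binomial(1, 0.5, size=n)                       # randomized treatment
    Y = 0.8 * T + 0.6 * base + rng.normal(size=n)          # follow-up response; effect = 0.8
    p_obs = 1.0 / (1.0 + np.exp(-(0.5 + 0.8 * base)))      # missing at random given baseline
    R = rng.binomial(1, p_obs)                             # 1 if follow-up observed

    def arm_mean(arm):
        Xa = base[T == arm].reshape(-1, 1)
        Ra, Ya = R[T == arm], Y[T == arm]
        pi = LogisticRegression().fit(Xa, Ra).predict_proba(Xa)[:, 1]     # P(observed | baseline)
        m = LinearRegression().fit(Xa[Ra == 1], Ya[Ra == 1]).predict(Xa)  # regression augmentation
        return np.mean(Ra * Ya / pi - (Ra - pi) / pi * m)  # augmented inverse-weighted mean

    print(f"estimated treatment effect: {arm_mean(1) - arm_mean(0):.3f} (truth 0.8)")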

Book Vetus latina


Download or read book Vetus latina written by and published by . This book was released on 1953 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Multivariate Analysis of Variance and Repeated Measures

Download or read book Multivariate Analysis of Variance and Repeated Measures written by David J. Hand and published by CRC Press. This book was released on 1987-05-01 with total page 284 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book describes a practical approach to univariate and multivariate analysis of variance. It starts with a general non-mathematical account of the fundamental theories, and this is followed by a discussion of a series of examples using real data sets from the authors' own work in clinical trials, psychology and industry. Included are discussions of factorial and nested designs, structures on the multiple dependent variables measured on each subject, repeated measures analyses, covariates, choice of test statistic and simultaneous test procedures.

Book A Unified Framework for Efficient Estimation of General Treatment Models

Download or read book A Unified Framework for Efficient Estimation of General Treatment Models written by Chunrong Ai and published by . This book was released on 2019 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: This paper presents a weighted optimization framework that unifies binary, multi-valued, continuous, and mixed discrete-continuous treatments under unconfounded treatment assignment. With a general loss function, the framework includes the average, quantile, and asymmetric least squares causal effects of treatment as special cases. For this general framework, we first derive the semiparametric efficiency bound for the causal effect of treatment, extending the existing bound results to a wider class of models. We then propose a generalized optimization estimator for the causal effect, with weights estimated by solving an expanding set of equations. Under some sufficient conditions, we establish the consistency and asymptotic normality of the proposed estimator and show that it attains the semiparametric efficiency bound, thereby extending the existing literature on efficient estimation of causal effects to a wider class of applications. Finally, we discuss estimation of some causal effect functionals, such as the treatment effect curve and the average outcome. To evaluate the finite sample performance of the proposed procedure, we conduct a small-scale simulation study and find that the proposed estimator has practical value. To illustrate the applicability of the procedure, we revisit the literature on campaign advertising and campaign contributions. Unlike existing procedures, which produce mixed results, we find no evidence of an effect of campaign advertising on campaign contributions.
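
The sketch below works out one special case of the weighted optimization idea: a median (quantile) causal effect of a binary treatment, obtained by minimizing an inverse-probability-weighted check loss in each arm. The logistic propensity model and the grid search are illustrative assumptions; the paper instead estimates the weights by solving an expanding set of equations and covers far more general treatments.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def check_loss(q, y, w, tau):
        r = y - q
        return np.sum(w * np.maximum(tau * r, (tau - 1.0) * r))   # weighted quantile loss

    rng = np.random.default_rng(8)
    n = 3000
    X = rng.normal(size=(n, 2))
    e = 1.0 / (1.0 + np.exp(-X[:, 0]))
    T = rng.binomial(1, e)
    Y = 1.0 * T + X[:, 0] + rng.normal(size=n)             # median causal effect = 1

    e_hat = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
    grid, tau = np.linspace(-3.0, 4.0, 701), 0.5
    q1 = grid[np.argmin([check_loss(q, Y[T == 1], 1.0 / e_hat[T == 1], tau) for q in grid])]
    q0 = grid[np.argmin([check_loss(q, Y[T == 0], 1.0 / (1.0 - e_hat[T == 0]), tau) for q in grid])]
    print(f"estimated median causal effect: {q1 - q0:.3f} (truth 1.0)")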

Book Econometric Evaluation of Labour Market Policies

Download or read book Econometric Evaluation of Labour Market Policies written by Michael Lechner and published by Springer Science & Business Media. This book was released on 2012-12-06 with total page 248 pages. Available in PDF, EPUB and Kindle. Book excerpt: Empirical measurement of impacts of active labour market programmes has started to become a central task of economic researchers. New improved econometric methods have been developed that will probably influence future empirical work in various other fields of economics as well. This volume contains a selection of original papers from leading experts, among them James J. Heckman, Nobel Prize Winner 2000 in economics, addressing these econometric issues at the theoretical and empirical level. The theoretical part contains papers on tight bounds of average treatment effects, instrumental variables estimators, impact measurement with multiple programme options and statistical profiling. The empirical part provides the reader with econometric evaluations of active labour market programmes in Canada, Germany, France, Italy, the Slovak Republic and Sweden.

Book An Introduction to Envelopes

Download or read book An Introduction to Envelopes written by R. Dennis Cook and published by John Wiley & Sons. This book was released on 2018-08-28 with total page 317 pages. Available in PDF, EPUB and Kindle. Book excerpt: Written by the leading expert in the field, this text reviews the major new developments in envelope models and methods. An Introduction to Envelopes provides an overview of the theory and methods of envelopes, a class of procedures for increasing efficiency in multivariate analyses without altering traditional objectives. The author offers a balance between foundations and methodology by integrating illustrative examples that show how envelopes can be used in practice. He discusses how to use envelopes to target selected coefficients and explores predictor envelopes and their connection with partial least squares regression. The book reveals the potential for envelope methodology to improve estimation of a multivariate mean. The text also includes information on how envelopes can be used in generalized linear models and regressions with a matrix-valued response, and reviews work on sparse and Bayesian response envelopes. In addition, the text explores relationships between envelopes and other dimension reduction methods, including canonical correlations, reduced-rank regression, supervised singular value decomposition, sufficient dimension reduction, principal components, and principal fitted components. This important resource: • Offers a text written by the leading expert in this field • Describes groundbreaking work that puts the focus on this burgeoning area of study • Covers the important new developments in the field and highlights the most important directions • Discusses the underlying mathematics and linear algebra • Includes an online companion site with both R and Matlab support Written for researchers and graduate students in multivariate analysis and dimension reduction, as well as practitioners interested in statistical methodology, An Introduction to Envelopes offers the first book on the theory and methods of envelopes.

Book Heterogeneous Treatment Effect Estimation in Observational Studies Using Tree based Methods

Download or read book Heterogeneous Treatment Effect Estimation in Observational Studies Using Tree based Methods written by Yuyang Zhang and published by . This book was released on 2020 with total page 167 pages. Available in PDF, EPUB and Kindle. Book excerpt: Observational studies provide a rich source of data for evaluating causal relationships. Appropriate statistical methods for causal inference should be developed to account for the non-randomized nature of observational studies. Matching designs are commonly used to deal with this non-randomization because they are robust to model misspecification. The goal of this work is to use the matching design to perform causal inference at the population and subpopulation levels. The propensity score is a powerful tool for adjusting observed confounding bias when there are a large number of confounders. Relatively few studies have focused on whether the post-matching analysis should adjust for the matching structure when estimating the population treatment effect. In the first part of the thesis, we compare results under different strategies, with and without the matching design, for both continuous and binary outcomes, and discuss whether the matching structure should be taken into account in the post-matching analysis when the treatment effect is homogeneous (Zhang, 2020). However, treatment effects are likely to differ across subpopulations, especially in real-world problems. We then propose a non-parametric matching tree (MT) to tackle both confounding adjustment and subgroup identification at the same time by combining machine learning methods with matching designs. We prove that it produces unbiased subpopulation treatment effect estimators. To evaluate the performance of the proposed method, we run extensive simulation studies to compare it with popular tree-based causal inference methods. We apply the proposed method to examine the impact of tobramycin on patients' first chronic Pseudomonas aeruginosa infection in cystic fibrosis in the U.S. We finally discuss limitations and potential future work.
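
A minimal sketch of the matching step that underlies the thesis: 1-to-1 nearest-neighbour matching on an estimated propensity score, followed by a simple matched-pair difference. Because the simulated effect is homogeneous, this matched estimate targets the same quantity as the population effect; the matching tree and subgroup identification methods of the thesis are not reproduced, and all modelling choices below are assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(9)
    n, p = 2000, 5
    X = rng.normal(size=(n, p))
    e = 1.0 / (1.0 + np.exp(-(X[:, 0] - 0.5 * X[:, 1])))
    T = rng.binomial(1, e)
    Y = 1.2 * T + X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)   # homogeneous effect = 1.2

    ps = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]   # estimated propensity score
    treated, control = np.where(T == 1)[0], np.where(T == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))           # match each treated unit
    matched = control[idx[:, 0]]

    effect = np.mean(Y[treated] - Y[matched])                    # matched-pair difference
    print(f"matched estimate of the treatment effect: {effect:.3f} (truth 1.2)")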