EBookClubs

Read Books & Download eBooks Full Online

Book Multiple Testing and Minimax Estimation in Sparse Linear Regression

Download or read book Multiple Testing and Minimax Estimation in Sparse Linear Regression written by Weijie Su. This book was released in 2016. Available in PDF, EPUB and Kindle. Book excerpt: In many real-world statistical problems, we observe a response variable of interest together with a large number of potentially explanatory variables, of which a majority may be irrelevant. For this type of problem, controlling the false discovery rate (FDR) guarantees that most of the selected variables, often termed discoveries in a scientific context, are truly explanatory and thus replicable. Inspired by ideas from the Benjamini-Hochberg procedure (BHq), this thesis proposes a new method named SLOPE to control the FDR in sparse high-dimensional linear regression. SLOPE is a computationally efficient procedure that works by regularizing the fitted coefficients according to their ranks: the higher the rank, the larger the penalty. This adaptive regularization is analogous to the BHq procedure, which compares more significant p-values with more stringent thresholds. Under orthogonal designs, SLOPE with the BHq critical values is proven to control the FDR at any given level. Moreover, we demonstrate empirically that this method also appears to have appreciable inferential properties under more general design matrices while offering substantial power. The thesis proceeds to explore the estimation properties of SLOPE. Although SLOPE was developed from a multiple testing viewpoint, we show the surprising result that it achieves optimal squared errors under Gaussian random designs. This optimality holds under a weak assumption on the l0-sparsity level of the underlying signals, and it is sharp in the sense that no estimator can achieve a better error. An appealing feature is that SLOPE does not require any knowledge of the degree of sparsity, yet it automatically adapts to yield optimal total squared errors over a wide range of l0-sparsity classes. Finally, we conclude the thesis by focusing on Nesterov's accelerated scheme, which is integral to a fast algorithmic implementation of SLOPE. Specifically, we prove that, as the step size vanishes, this scheme converges in a rigorous sense to a second-order ordinary differential equation (ODE). This continuous-time ODE allows for a better understanding of Nesterov's scheme and can therefore serve as a tool for analyzing and generalizing it. A fruitful application of this tool yields a family of schemes with similar convergence rates. The ODE interpretation also suggests restarting Nesterov's scheme, leading to a new algorithm that is proven to converge at a linear rate whenever the objective is strongly convex.
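The excerpt above describes the mechanics of SLOPE: coefficients are penalized according to their ranks with Benjamini-Hochberg-style critical values, so larger coefficients face larger penalties. The snippet below is a minimal illustrative sketch of that sorted-L1 penalty, not code from the thesis; the helper names (bhq_lambdas, slope_penalty), the FDR target q = 0.1, and the toy coefficient vector are assumptions made for illustration.

```python
# Illustrative sketch of the sorted-L1 (SLOPE-style) penalty with BHq-type critical values.
# Not the thesis implementation; names, q, and the example vector are assumptions.
import numpy as np
from scipy.stats import norm


def bhq_lambdas(p, q=0.1):
    """BHq-inspired critical values lambda_i = Phi^{-1}(1 - q*i/(2p)), decreasing in i."""
    i = np.arange(1, p + 1)
    return norm.ppf(1.0 - q * i / (2.0 * p))


def slope_penalty(beta, lambdas):
    """Sorted-L1 norm: sum_i lambda_i * |beta|_(i) with |beta|_(1) >= ... >= |beta|_(p).
    The largest coefficient is matched with the largest lambda."""
    abs_sorted = np.sort(np.abs(beta))[::-1]  # magnitudes in decreasing order
    return float(np.dot(lambdas, abs_sorted))


if __name__ == "__main__":
    p = 10
    lam = bhq_lambdas(p, q=0.1)
    beta = np.array([3.0, -1.5, 0.2] + [0.0] * (p - 3))
    print("SLOPE-style penalty:", slope_penalty(beta, lam))
```

For the last part of the excerpt, the second-order ODE obtained as the vanishing-step-size limit of Nesterov's scheme is, in the associated published account by Su, Boyd and Candès, X''(t) + (3/t) X'(t) + ∇f(X(t)) = 0 for a smooth objective f, with X(0) equal to the starting point and X'(0) = 0.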

Book Minimax Estimation in Linear Regression Under Restrictions

Download or read book Minimax Estimation in Linear Regression Under Restrictions written by Helge Blaker. This book was released in 1998 with a total of 26 pages. Available in PDF, EPUB and Kindle.

Book Information Theoretic Methods in Data Science

Download or read book Information Theoretic Methods in Data Science written by Miguel R. D. Rodrigues and published by Cambridge University Press. This book was released on 2021-04-08 with a total of 561 pages. Available in PDF, EPUB and Kindle. Book excerpt: The first unified treatment of the interface between information theory and emerging topics in data science, written in a clear, tutorial style. Covering topics such as data acquisition, representation, analysis, and communication, it is ideal for graduate students and researchers in information theory, signal processing, and machine learning.

Book Progress in Nonparametric Minimax Estimation and High Dimensional Hypothesis Testing

Download or read book Progress in Nonparametric Minimax Estimation and High Dimensional Hypothesis Testing written by Yandi Shen. This book was released in 2021. Available in PDF, EPUB and Kindle. Book excerpt: This dissertation is divided into two parts. In the first part, we study minimax estimation of functions and functionals in nonparametric regression models. The investigation of statistical limits in such models deepens theoretical understanding of related problems and leads to new probabilistic tools and methodologies of broader interest. In the second part, we study the asymptotics of some high-dimensional testing problems involving the Gaussian distribution, such as the Gaussian sequence model with a convex constraint and testing of covariance matrices. A general framework is developed to analyze the power behavior of test statistics via accurate non-asymptotic expansions.

Book Statistical Foundations of Data Science

Download or read book Statistical Foundations of Data Science written by Jianqing Fan and published by CRC Press. This book was released on 2020-09-21 with a total of 942 pages. Available in PDF, EPUB and Kindle. Book excerpt: Statistical Foundations of Data Science gives a thorough introduction to commonly used statistical models, contemporary statistical machine learning techniques and algorithms, and their mathematical insights and statistical theories. It aims to serve as a graduate-level textbook and a research monograph on high-dimensional statistics, sparsity and covariance learning, machine learning, and statistical inference. It includes ample exercises that involve both theoretical studies and empirical applications. The book begins with an introduction to the stylized features of big data and their impacts on statistical analysis. It then introduces multiple linear regression and expands the techniques of model building via nonparametric regression and kernel tricks. It provides a comprehensive account of sparsity exploration and model selection for multiple regression, generalized linear models, quantile regression, robust regression, and hazards regression, among others. High-dimensional inference is also thoroughly addressed, and so is feature screening. The book also provides a comprehensive account of high-dimensional covariance estimation, learning latent factors and hidden structures, and their applications to statistical estimation, inference, prediction, and machine learning problems. It also thoroughly introduces statistical machine learning theory and methods for classification, clustering, and prediction. These include CART, random forests, boosting, support vector machines, clustering algorithms, sparse PCA, and deep learning.

Book The Linear Regression Model Under Test

Download or read book The Linear Regression Model Under Test written by W. Kraemer and published by Springer Science & Business Media. This book was released on 2012-12-06 with a total of 195 pages. Available in PDF, EPUB and Kindle. Book excerpt: This monograph grew out of joint work with various dedicated colleagues and students at the Vienna Institute for Advanced Studies. We would probably never have begun without the impetus of Johann Maurer, who for some time was the spiritus rector behind the Institute's macromodel of the Austrian economy. Manfred Deistler provided sustained stimulation for our research through many discussions in his econometric research seminar. Similar credits are due to Adrian Pagan, Roberto Mariano and Garry Phillips, the econometrics guest professors at the Institute in the 1982-1984 period, who through their lectures and advice have contributed greatly to our effort. Hans Schneeweiß offered helpful comments on an earlier version of the manuscript, and Benedikt Poetscher was always willing to lend a helping hand when we had trouble with the mathematics of the tests. Needless to say, any errors are our own. Much of the programming for the tests and for the Monte Carlo experiments was done by Petr Havlik, Karl Kontrus and Raimund Alt. Without their assistance, our research project would have been impossible. Petr Havlik and Karl Kontrus in addition read and criticized portions of the manuscript, and were of great help in reducing our error rate. Many of the more theoretical results in this monograph would never have come to light without the mathematical expertise of Werner Ploberger, who provided most of the statistical background of the chapter on testing for structural change.

Book Linear Regression

    Book Details:
  • Author : Jürgen Groß
  • Publisher : Springer Science & Business Media
  • Release : 2012-12-06
  • ISBN : 364255864X
  • Pages : 400 pages

Download or read book Linear Regression written by Jürgen Groß and published by Springer Science & Business Media. This book was released on 2012-12-06 with a total of 400 pages. Available in PDF, EPUB and Kindle. Book excerpt: The book covers the basic theory of linear regression models and presents a comprehensive survey of different estimation techniques as alternatives and complements to least squares estimation. Proofs are given for the most relevant results, and the presented methods are illustrated with the help of numerical examples and graphics. Special emphasis is placed on practicability and possible applications. The book is rounded off by an introduction to the basics of decision theory and an appendix on matrix algebra.

Book Multivariate Reduced Rank Regression

Download or read book Multivariate Reduced Rank Regression written by Gregory C. Reinsel and published by Springer Nature. This book was released on 2022-11-30 with a total of 420 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book provides an account of multivariate reduced-rank regression, a tool of multivariate analysis that enjoys a broad array of applications. In addition to a historical review of the topic, its connection to other widely used statistical methods, such as multivariate analysis of variance (MANOVA), discriminant analysis, principal components, canonical correlation analysis, and errors-in-variables models, is also discussed. This new edition incorporates Big Data methodology and its applications, as well as high-dimensional reduced-rank regression, generalized reduced-rank regression with complex data, and sparse and low-rank regression methods. Each chapter contains developments of basic theoretical results, as well as details on computational procedures, illustrated with numerical examples drawn from disciplines such as biochemistry, genetics, marketing, and finance. This book is designed for advanced students, practitioners, and researchers who may deal with moderate- and high-dimensional multivariate data. Because regression is one of the most popular statistical methods, the multivariate regression analysis tools described should provide a natural way of looking at large (both cross-sectional and chronological) data sets. This book can be assigned in seminar-type courses taken by advanced graduate students in statistics, machine learning, econometrics, business, and engineering.

Book Minimax Estimation in Regression and Random Censorship Models

Download or read book Minimax Estimation in Regression and Random Censorship Models written by Eduard N. Belitser. This book was released in 2000 with a total of 148 pages. Available in PDF, EPUB and Kindle.

Book On minimax estimation in linear regression models with ellipsoidal constraints

Download or read book On minimax estimation in linear regression models with ellipsoidal constraints written by Norbert Christopeit. This book was released in 1991 with a total of 34 pages. Available in PDF, EPUB and Kindle.

Book Minimax estimation in linear regression with convex polyhedral constraints

Download or read book Minimax estimation in linear regression with convex polyhedral constraints written by Peter Stahlecker. This book was released in 1990 with a total of 32 pages. Available in PDF, EPUB and Kindle.

Book Minimax Ridge Regression

Download or read book Minimax Ridge Regression written by Lawrence C. Peele. This book was released in 1980 with a total of 27 pages. Available in PDF, EPUB and Kindle. Book excerpt: This work examined minimax linear estimation in multiple linear regression. The application of minimax estimation to regression led to the development of ridge regression estimators with stochastic ridge parameters. These estimators were seen to be invariant under linear transformations, a property that has not been established for other ridge estimators. These minimax-motivated estimators were examined in several simulation studies. In particular, flaws in other simulation studies of ridge estimators were identified, and consequently an improved simulation procedure was used. It was observed from these studies that, contrary to published statements, a ridge estimator can be considerably superior to the ordinary least squares estimator, especially when high pairwise correlations exist among the regression variables. Robustness considerations were used to suggest a requirement that a 'good' generalized ridge regression estimator should satisfy. (Author)
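As a rough illustration of the excerpt's claim that a ridge estimator can beat ordinary least squares under strong collinearity, the sketch below compares the two on a toy design. It uses a generic fixed ridge parameter rather than the report's minimax estimator with stochastic ridge parameters, and the design, true coefficients, and penalty k = 1.0 are assumptions chosen purely for illustration.

```python
# Toy comparison of OLS and a fixed-parameter ridge estimator under high collinearity.
# This is a generic illustration, not the report's minimax stochastic-ridge estimator.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5

# Highly correlated regressors: a shared latent factor plus small independent noise.
z = rng.normal(size=(n, 1))
X = z + 0.05 * rng.normal(size=(n, p))

beta = np.array([1.0, 0.5, -0.5, 0.0, 0.0])  # assumed true coefficients
y = X @ beta + rng.normal(size=n)

ols = np.linalg.solve(X.T @ X, X.T @ y)                    # (X'X)^{-1} X'y
k = 1.0                                                    # fixed ridge parameter (assumption)
ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)  # (X'X + kI)^{-1} X'y

# Exact numbers depend on the random seed; under this level of collinearity the
# ridge estimate typically has much smaller total squared error than OLS.
print("OLS squared error:  ", float(np.sum((ols - beta) ** 2)))
print("Ridge squared error:", float(np.sum((ridge - beta) ** 2)))
```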

Book A Note on Minimax Estimation in Linear Regression

Download or read book A Note on Minimax Estimation in Linear Regression written by Karsten Schmidt. This book was released in 1991 with a total of 28 pages. Available in PDF, EPUB and Kindle.

Book Principles and Methods for Data Science

Download or read book Principles and Methods for Data Science published by Elsevier. This book was released on 2020-05-28 with a total of 498 pages. Available in PDF, EPUB and Kindle. Book excerpt: Principles and Methods for Data Science, Volume 43 in the Handbook of Statistics series, highlights new advances in the field, with this updated volume presenting interesting and timely topics, including Competing risks, aims and methods; Data analysis and mining of microbial community dynamics; Support Vector Machines, a robust prediction method with applications in bioinformatics; Bayesian Model Selection for Data with High Dimension; High dimensional statistical inference: theoretical development to data analytics; Big data challenges in genomics; Analysis of microarray gene expression data using information theory and stochastic algorithm; Hybrid Models; Markov Chain Monte Carlo Methods: Theory and Practice; and more.
  • Provides the authority and expertise of leading contributors from an international board of authors
  • Presents the latest release in the Handbook of Statistics series
  • Updated release includes the latest information on Principles and Methods for Data Science

Book Minimax Estimation in Linear Regression with Convex Polyhedral Constraints

Download or read book Minimax Estimation in Linear Regression with Convex Polyhedral Constraints written by P. Stahlecker. This book was released in 1990 with a total of 16 pages. Available in PDF, EPUB and Kindle.

Book Quasi minimax estimation in the linear regression model

Download or read book Quasi minimax estimation in the linear regression model written by Peter Stahlecker. This book was released in 1984 with a total of 20 pages. Available in PDF, EPUB and Kindle.