EBookClubs

Read Books & Download eBooks Full Online

Book Minimax Estimation in the Linear Regression Model with Fuzzy Inequality Constraints

Download or read book Minimax Estimation in the Linear Regression Model with Fuzzy Inequality Constraints written by Henning Knautz and published by . This book was released on 2000 with total page 12 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book On minimax estimation in linear regression models with ellipsoidal constraints

Download or read book On minimax estimation in linear regression models with ellipsoidal constraints written by Norbert Christopeit and published by . This book was released on 1991 with total page 34 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Minimax Estimation with Ellipsoid and Linear Inequality Constraints

Download or read book Minimax Estimation with Ellipsoid and Linear Inequality Constraints written by Jürgen Pilz and published by . This book was released on 1988 with total page 22 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Minimax estimation in linear regression with convex polyhedral constraints

Download or read book Minimax estimation in linear regression with convex polyhedral constraints written by Peter Stahlecker and published by . This book was released on 1990 with total page 32 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Minimax Estimation in Linear Regression with Convex Polyhedral Constraints

Download or read book Minimax Estimation in Linear Regression with Convex Polyhedral Constraints written by P. Stahlecker and published by . This book was released on 1990 with total page 16 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Minimax Estimation in Regression and Random Censorship Models

Download or read book Minimax Estimation in Regression and Random Censorship Models written by Eduard N. Belitser and published by . This book was released on 2000 with total page 148 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Minimax Estimation in Linear Regression Under Restrictions

Download or read book Minimax Estimation in Linear Regression Under Restrictions written by Helge Blaker and published by . This book was released on 1998 with total page 26 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Quasi minimax estimation in the linear regression model

Download or read book Quasi minimax estimation in the linear regression model written by Peter Stahlecker and published by . This book was released on 1984 with total page 20 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book A Note on Minimax estimation in Regression Models with Affine Restrictions

Download or read book A Note on Minimax estimation in Regression Models with Affine Restrictions written by Hilmar Drygas and published by . This book was released on 1988 with total page 72 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Approximate linear minimax estimation in regression analysis with ellipsoidal constraints

Download or read book Approximate linear minimax estimation in regression analysis with ellipsoidal constraints written by Peter Stahlecker and published by . This book was released on 1987 with total page 27 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Approximative Minimax Estimators in the Linear Regression Model

Download or read book Approximative Minimax Estimators in the Linear Regression Model written by Peter Stahlecker and published by . This book was released on 1985 with total page 40 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Least Squares Estimation Subject to Inequality Constraints

Download or read book Least Squares Estimation Subject to Inequality Constraints written by Michael E. Thomson and published by . This book was released on 1980 with total page 264 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Multiple Testing and Minimax Estimation in Sparse Linear Regression

Download or read book Multiple Testing and Minimax Estimation in Sparse Linear Regression written by Weijie Su and published by . This book was released on 2016 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: In many real-world statistical problems, we observe a response variable of interest together with a large number of potentially explanatory variables of which a majority may be irrelevant. For this type of problem, controlling the false discovery rate (FDR) guarantees that most of the selected variables, often termed discoveries in a scientific context, are truly explanatory and thus replicable. Inspired by ideas from the Benjamini-Hochberg procedure (BHq), this thesis proposes a new method named SLOPE to control the FDR in sparse high-dimensional linear regression. SLOPE is a computationally efficient procedure that works by regularizing the fitted coefficients according to their ranks: the higher the rank, the larger the penalty. This adaptive regularization is analogous to the BHq, which compares more significant p-values with more stringent thresholds. Under orthogonal designs, SLOPE with the BHq critical values is proven to control FDR at any given level. Moreover, we demonstrate empirically that this method also appears to have appreciable inferential properties under more general design matrices while offering substantial power. The thesis proceeds to explore the estimation properties of SLOPE. Although SLOPE was developed from a multiple testing viewpoint, we show the surprising result that it achieves optimal squared errors under Gaussian random designs. This optimality holds under a weak assumption on the l0-sparsity level of the underlying signals, and is sharp in the sense that this is the best possible error any estimator can achieve. An appealing feature is that SLOPE does not require any knowledge of the degree of sparsity, and yet automatically adapts to yield optimal total squared errors over a wide range of l0-sparsity classes. Finally, we conclude this thesis by focusing on Nesterov's accelerated scheme, which is integral to a fast algorithmic implementation of SLOPE. Specifically, we prove that, as the step size vanishes, this scheme converges in a rigorous sense to a second-order ordinary differential equation (ODE). This continuous time ODE allows for a better understanding of Nesterov's scheme, and thus it can serve as a tool for analyzing and generalizing this scheme. A fruitful application of this tool yields a family of schemes with similar convergence rates. The ODE interpretation also suggests restarting Nesterov's scheme leading to a new algorithm, which is proven to converge at a linear rate whenever the objective is strongly convex.
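The excerpt above describes SLOPE as a rank-based regularizer: the largest coefficients in absolute value are paired with the largest penalty weights, mirroring how the Benjamini-Hochberg procedure compares the most significant p-values with the most stringent thresholds. The minimal sketch below, in Python with numpy, illustrates such a sorted-L1 penalty; the BHq-style weight formula lambda_i = Phi^{-1}(1 - iq/(2p)) and the toy coefficients are illustrative assumptions, not details taken from the thesis itself.

```python
# A minimal sketch of a SLOPE-style sorted-L1 penalty with BHq-style weights.
# The weight formula and the toy data are illustrative assumptions.
import numpy as np
from statistics import NormalDist

def bhq_weights(p, q=0.05):
    """Decreasing critical values lambda_i = Phi^{-1}(1 - i*q/(2p)), i = 1..p."""
    return np.array([NormalDist().inv_cdf(1 - i * q / (2 * p)) for i in range(1, p + 1)])

def slope_penalty(beta, lam):
    """Sorted-L1 penalty: the largest |beta| is paired with the largest weight."""
    beta_desc = np.sort(np.abs(beta))[::-1]   # |beta|_(1) >= |beta|_(2) >= ...
    return float(np.dot(lam, beta_desc))      # sum_i lambda_i * |beta|_(i)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    beta = rng.normal(size=10)
    lam = bhq_weights(p=10, q=0.05)
    print("weights:", np.round(lam, 3))       # decreasing, like BHq thresholds
    print("penalty:", round(slope_penalty(beta, lam), 3))
```

Unlike the lasso's single tuning parameter, the whole decreasing weight sequence adapts the amount of shrinkage to a coefficient's rank, which is the adaptivity the excerpt credits for FDR control under orthogonal designs.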

Book A Note on Minimax Estimation in Linear Regression


Download or read book A Note on Minimax Estimation in Linear Regression written by Karsten Schmidt and published by . This book was released on 1991 with total page 28 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Minimax Estimation with Structured Data

Download or read book Minimax Estimation with Structured Data written by Jan-Christian Klaus Hütter and published by . This book was released on 2019 with total page 299 pages. Available in PDF, EPUB and Kindle. Book excerpt: Modern statistics often deals with high-dimensional problems that suffer from poor performance guarantees and from the curse of dimensionality. In this thesis, we study how structural assumptions can be used to overcome these difficulties in several estimation problems, spanning three different areas of statistics: shape-constrained estimation, causal discovery, and optimal transport. In the area of shape-constrained estimation, we study the estimation of matrices, first under the assumption of bounded total-variation (TV) and second under the assumption that the underlying matrix is Monge, or supermodular. While the first problem has a long history in image denoising, the latter structure has so far been mainly investigated in the context of computer science and optimization. For TV denoising, we provide fast rates that are adaptive to the underlying edge sparsity of the image, as well as generalizations to other graph structures, including higher-dimensional grid-graphs. For the estimation of Monge matrices, we give near minimax rates for their estimation, including the case where latent permutations act on the rows and columns of the matrix. In the latter case, we also give two computationally efficient and consistent estimators. Moreover, we show how to obtain estimation rates in the related problem of estimating continuous totally positive distributions in 2D. In the area of causal discovery, we investigate a linear cyclic causal model and give an estimator that is near minimax optimal for causal graphs of bounded in-degree. In the area of optimal transport, we introduce the notion of the transport rank of a coupling and provide empirical and theoretical evidence that it can be used to significantly improve rates of estimation of Wasserstein distances and optimal transport plans. Finally, we give near minimax optimal rates for the estimation of smooth optimal transport maps based on a wavelet regularization of the semi-dual objective.
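Of the structured estimation problems the excerpt surveys, total-variation (TV) denoising of a matrix over a grid graph is the most concrete, and a small example may clarify what the penalty measures. The sketch below, in Python with numpy, uses the anisotropic form of TV (sum of absolute differences across neighboring entries); the piecewise-constant toy image is an illustrative assumption, and no claim is made that this matches the specific estimators analyzed in the thesis.

```python
# A minimal sketch of the anisotropic total-variation (TV) penalty on a matrix
# viewed as a 2D grid graph. TV denoising trades data fit against this penalty,
# e.g. minimize 0.5*||y - x||^2 + lam * tv_penalty(x).
# The toy image below is an illustrative assumption.
import numpy as np

def tv_penalty(x):
    """Sum of |differences| along grid edges: vertical plus horizontal."""
    vertical = np.abs(np.diff(x, axis=0)).sum()     # edges between rows
    horizontal = np.abs(np.diff(x, axis=1)).sum()   # edges between columns
    return float(vertical + horizontal)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    clean = np.zeros((8, 8))
    clean[2:6, 2:6] = 1.0                           # piecewise-constant "image"
    noisy = clean + 0.1 * rng.normal(size=clean.shape)
    print("TV of clean image:", tv_penalty(clean))  # small: few edges differ
    print("TV of noisy image:", round(tv_penalty(noisy), 2))  # inflated by noise
```

A piecewise-constant image has small TV because only the few edges crossing region boundaries contribute, which is why penalizing TV favors the edge-sparse reconstructions the excerpt refers to.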