EBookClubs

Read Books & Download eBooks Full Online

Book Advances in Data Science

Download or read book Advances in Data Science written by Edwin Diday and published by John Wiley & Sons. This book was released on 2020-01-09 with total page 225 pages. Available in PDF, EPUB and Kindle. Book excerpt: Data science unifies statistics, data analysis and machine learning to achieve a better understanding of the masses of data produced today and to improve prediction. Special kinds of data (symbolic, network, complex, compositional) are increasingly frequent in data science and require specific methodologies, but reference works in this field are lacking. Advances in Data Science fills this gap. It presents a collection of up-to-date contributions by eminent scholars following two international workshops held in Beijing and Paris. The 10 chapters are organized into four parts: Symbolic Data, Complex Data, Network Data and Clustering. They include fundamental contributions as well as applications to several domains, including business and the social sciences.

Book Dimension Reduction with Inverse Regression

Download or read book Dimension Reduction with Inverse Regression written by Liqiang Ni and published by . This book was released on 2003 with total page 312 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Dimension Reduction Via Inverse Regression

Download or read book Dimension Reduction Via Inverse Regression written by Efstathia Bura and published by . This book was released on 1996 with total page 270 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Dimension Reduction Through Inverse Regression

Download or read book Dimension Reduction Through Inverse Regression written by Pawel Stryszak and published by . This book was released on 1995 with total page 376 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Sufficient Dimension Reduction

Download or read book Sufficient Dimension Reduction written by Bing Li and published by CRC Press. This book was released on 2018-04-27 with total page 362 pages. Available in PDF, EPUB and Kindle. Book excerpt: Sufficient dimension reduction is a rapidly developing research field with wide applications in regression diagnostics, data visualization, machine learning, genomics, image processing, pattern recognition, and medicine, all fields that produce large datasets with many variables. Sufficient Dimension Reduction: Methods and Applications with R introduces the basic theories and main methodologies, provides practical and easy-to-use algorithms and computer code to implement them, and surveys recent advances at the frontiers of the field. Features:
  • Provides comprehensive coverage of this emerging research field.
  • Synthesizes a wide variety of dimension reduction methods under a few unifying principles, such as projection in Hilbert spaces, kernel mapping, and von Mises expansion.
  • Reflects the most recent advances, such as nonlinear sufficient dimension reduction, dimension folding for tensorial data, and sufficient dimension reduction for functional data.
  • Includes a set of computer codes written in R that readers can easily run.
  • Uses real data sets available online to illustrate the usage and power of the described methods.
Sufficient dimension reduction has undergone momentous development in recent years, partly due to the increased demand for techniques to process high-dimensional data, a hallmark of our age of Big Data. This book will serve as a perfect entry into the field for beginning researchers and a handy reference for advanced ones. The author, Bing Li, obtained his Ph.D. from the University of Chicago. He is currently a Professor of Statistics at the Pennsylvania State University. His research interests cover sufficient dimension reduction, statistical graphical models, functional data analysis, machine learning, estimating equations and quasilikelihood, and robust statistics. He is a fellow of the Institute of Mathematical Statistics and the American Statistical Association, and an Associate Editor for The Annals of Statistics and the Journal of the American Statistical Association.

Book Dimension Reduction

    Book Details:
  • Author : Christopher J. C. Burges
  • Publisher : Now Publishers Inc
  • Release : 2010
  • ISBN : 1601983786
  • Pages : 104 pages

Download or read book Dimension Reduction written by Christopher J. C. Burges and published by Now Publishers Inc. This book was released on 2010 with total page 104 pages. Available in PDF, EPUB and Kindle. Book excerpt: We give a tutorial overview of several foundational methods for dimension reduction. We divide the methods into projective methods and methods that model the manifold on which the data lies. For projective methods, we review projection pursuit, principal component analysis (PCA), kernel PCA, probabilistic PCA, canonical correlation analysis (CCA), kernel CCA, Fisher discriminant analysis, oriented PCA, and several techniques for sufficient dimension reduction. For the manifold methods, we review multidimensional scaling (MDS), landmark MDS, Isomap, locally linear embedding, Laplacian eigenmaps, and spectral clustering. Although the review focuses on foundations, we also provide pointers to some more modern techniques. We also describe the correlation dimension as one method for estimating the intrinsic dimension, and we point out that the notion of dimension can be a scale-dependent quantity. The Nyström method, which links several of the manifold algorithms, is also reviewed. We use a publicly available dataset to illustrate some of the methods. The goal is to provide a self-contained overview of key concepts underlying many of these algorithms, and to give pointers for further reading.
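
As a concrete taste of the projective methods surveyed in this tutorial, below is a minimal PCA sketch in Python using only NumPy; the function name, the toy data, and the two-component target dimension are illustrative assumptions, not material from the book.

```python
import numpy as np

def pca_reduce(X, k):
    """Project an n x p data matrix X onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                  # center each column
    # SVD of the centered data eigen-decomposes the sample covariance
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                      # top-k loading vectors (k x p)
    return Xc @ components.T                 # n x k scores

# Toy usage: 200 points in R^5 that mostly vary along two directions
rng = np.random.default_rng(0)
Z = rng.normal(size=(200, 2))
A = rng.normal(size=(2, 5))
X = Z @ A + 0.05 * rng.normal(size=(200, 5))
scores = pca_reduce(X, k=2)
print(scores.shape)  # (200, 2)
```

The same center-then-project pattern underlies kernel PCA and oriented PCA, with the covariance eigenproblem replaced by a kernel eigenproblem or a generalized eigenproblem, respectively.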

Book Sufficient Dimension Reduction

Download or read book Sufficient Dimension Reduction written by Jingyue Lu and published by . This book was released on 2017 with total page 0 pages. Available in PDF, EPUB and Kindle. Book excerpt: In regression analysis, it is difficult to uncover the dependence relationship between a response variable and a covariate vector when the dimension of the covariate vector is high. One approach to reducing the dimension of the covariate vector is sufficient dimension reduction, which is based on the assumption that the response variable depends on only a few linear combinations of the covariate vector. By replacing the covariate vector with these linear combinations, sufficient dimension reduction achieves dimension reduction; its goal is to estimate the space spanned by these linear combinations, which we denote by S. In this thesis, we give an introductory review of three important sufficient dimension reduction methods: Sliced Inverse Regression (SIR), Sliced Average Variance Estimation (SAVE) and Principal Hessian Directions (pHd). Li proposed SIR in 1991. SIR exploits the simplicity of inverse regression: given a univariate response variable and a high-dimensional covariate, it is much easier to regress the covariate against the response variable than the other way around. Motivated by a theorem that connects forward and inverse regression, SIR estimates S using inverse regression lines. Since SIR uses first moments only, it fails when there is symmetric dependence between the response variable and the covariate. To make up for this defect, Cook proposed SAVE in a 1991 comment on SIR. SAVE follows the general lines of SIR but uses second as well as first moments to estimate S. pHd is also a second-moment method. Li developed pHd in 1992 based on the observation that the eigenvectors of the Hessian matrices of the regression function are closely related to the basis vectors of S, so pHd estimates S using these eigenvectors. To compare these methods, a simulation study is presented at the end. In the simulations, SIR is the most efficient method and SAVE the most time-consuming. Since SIR fails under symmetric dependence, we recommend pHd when symmetric dependence is present and SIR otherwise.
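
To make the SIR recipe described above concrete, here is a minimal sketch in Python (NumPy only): standardize the covariate, slice the observations by the response, average the standardized covariate within each slice, and take the leading eigenvectors of the weighted between-slice covariance of those means. The data-generating model, the slice count, and all names are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Sliced Inverse Regression: estimate d basis vectors of the SDR space S."""
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    # Standardize: Z = (X - mu) @ cov^{-1/2}
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice observations by the order of y; average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)            # inverse-regression mean E[Z | y in slice]
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :d]    # columns estimate a basis of S

# Toy usage: y depends on one linear combination of a 6-dim covariate
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))
beta = np.array([1.0, -1.0, 0.0, 0.0, 0.0, 0.0])
y = np.exp(X @ beta) + 0.1 * rng.normal(size=500)
print(sir_directions(X, y, d=1).ravel())  # roughly proportional to beta (up to sign)
```

SAVE follows the same slicing scheme but accumulates the matrices (I - Cov(Z | slice)) squared instead of the outer products of slice means, which is what lets it pick up the second-moment, symmetric structure that SIR misses.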

Book Elements of Dimensionality Reduction and Manifold Learning

Download or read book Elements of Dimensionality Reduction and Manifold Learning written by Benyamin Ghojogh and published by Springer Nature. This book was released on 2023-02-02 with total page 617 pages. Available in PDF, EPUB and Kindle. Book excerpt: Dimensionality reduction, also known as manifold learning, is an area of machine learning used for extracting informative features from data, for better representation of the data or for separation between classes. This book presents a cohesive review of linear and nonlinear dimensionality reduction and manifold learning. Three main aspects of dimensionality reduction are covered: spectral, probabilistic, and neural network-based dimensionality reduction, which take geometric, probabilistic, and information-theoretic points of view, respectively. The necessary background and preliminaries on linear algebra, optimization, and kernels are also explained to ensure a comprehensive understanding of the algorithms. The tools introduced in this book can be applied to various applications involving feature extraction, image processing, computer vision, and signal processing. The book is intended for a wide audience that wants to acquire a deep understanding of the various ways to extract, transform, and understand the structure of data: academics, students, and industry professionals. Academic researchers and students can use it as a textbook for machine learning and dimensionality reduction; data scientists, machine learning scientists, computer vision scientists, and computer scientists can use it as a reference. It can also be helpful to statisticians in the field of statistical learning and to applied mathematicians in the fields of manifolds and subspace analysis. Industry professionals, including applied engineers, data engineers, and engineers in various fields of science dealing with machine learning, can use it as a guidebook for feature extraction, since raw data in industry often require preprocessing. The book is grounded in theory but provides thorough explanations and diverse examples to improve the reader's comprehension of the advanced topics. Advanced methods are explained step by step so that readers at all levels can follow the reasoning and come to a deep understanding of the concepts. The book does not assume an advanced theoretical background in machine learning and provides the necessary background, although an undergraduate-level background in linear algebra and calculus is recommended.

Book Regression Graphics

    Book Details:
  • Author : R. Dennis Cook
  • Publisher : John Wiley & Sons
  • Release : 2009-09-25
  • ISBN : 0470317779
  • Pages : 378 pages

Download or read book Regression Graphics written by R. Dennis Cook and published by John Wiley & Sons. This book was released on 2009-09-25 with total page 378 pages. Available in PDF, EPUB and Kindle. Book excerpt: An exploration of regression graphics through computer graphics. Recent developments in computer technology have stimulated new and exciting uses for graphics in statistical analyses. Regression Graphics, one of the first graduate-level textbooks on the subject, demonstrates how statisticians, both theoretical and applied, can use these exciting innovations. After developing a relatively new regression context that requires few scope-limiting conditions, Regression Graphics guides readers through the process of analyzing regressions graphically and assessing and selecting models. This innovative reference makes use of a wide range of graphical tools, including 2D and 3D scatterplots, 3D binary response plots, and scatterplot matrices. Supplemented by a companion FTP site, it features numerous data sets and applied examples that are used to elucidate the theory. Other important features of this book include:
  • Extensive coverage of a relatively new regression context based on dimension-reduction subspaces and sufficient summary plots
  • Graphical regression, an iterative visualization process for constructing sufficient regression views
  • Graphics for regressions with a binary response
  • Graphics for model assessment, including residual plots
  • Net-effects plots for assessing predictor contributions
  • Graphics for predictor and response transformations
  • Inverse regression methods
  • Access to a Web site of supplemental plots, data sets, and 3D color displays
An ideal text for students in graduate-level courses on statistical analysis, Regression Graphics is also an excellent reference for professional statisticians.
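
A minimal sketch of the book's central device, the 2D sufficient summary plot: once a direction b spanning the dimension-reduction subspace is in hand, plotting the response against b'x displays the full regression relationship. In the Python sketch below, b is a known toy direction rather than one found by graphical regression, and all names and data are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 4))
b = np.array([1.0, 2.0, 0.0, 0.0])             # toy direction spanning the subspace
y = (X @ b) ** 2 + 0.2 * rng.normal(size=300)  # y depends on x only through b'x

# Sufficient summary plot: response versus the single sufficient linear combination
plt.scatter(X @ b, y, s=8)
plt.xlabel("b'x (sufficient linear combination)")
plt.ylabel("y")
plt.title("2D sufficient summary plot")
plt.show()
```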

Book Application of Influence Function in Sufficient Dimension Reduction Models

Download or read book Application of Influence Function in Sufficient Dimension Reduction Models written by Prabha Shrestha and published by . This book was released on 2020 with total page 133 pages. Available in PDF, EPUB and Kindle. Book excerpt: In regression analysis, sufficient dimension reduction (SDR) models have gained significant popularity in the past three decades. While many methods have been proposed in the literature for the analysis of SDR models, the vast majority are inverse regression methods, pioneered by the sliced inverse regression method (Li, 1991). Most of these inverse regression methods rely on a matrix commonly known as the central matrix, and one of the main goals of the analysis of SDR models is the estimation of the central space. An influence function (IF) is a tool for analyzing the performance of a statistical estimator. In this dissertation, we focus on applications of the IF to the analysis of SDR models. Various inverse regression methods exist, but none of them stands out in all cases, and it is not clear which central matrix one should use out of the numerous options in the literature. We propose an IF-based approach for selecting the best-performing central matrix from a class of inverse regression methods, and we extend this approach to the situation where the data are partially contaminated. Asymptotic results are established, and an extensive simulation study is conducted to examine the performance of the proposed algorithm. Another issue in an SDR model is the estimation of the dimension of its central space. Based on the IF, we propose a measure that combines the eigenvalues of the central matrix with an IF measure to estimate the dimension of the central space. In addition, we analyze the IF of the functional of Benasseni's measure for a specific inverse regression method, the k-th moment method.
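
For reference, the influence function underlying this dissertation's approach is the standard Gâteaux-derivative definition (a textbook formula; the notation is not taken from the dissertation):

```latex
\operatorname{IF}(x;\, T, F) \;=\; \lim_{\varepsilon \downarrow 0}
\frac{T\bigl((1-\varepsilon)F + \varepsilon\,\delta_{x}\bigr) - T(F)}{\varepsilon}
```

Here T is a statistical functional (for instance, a central-matrix estimator viewed as a function of the underlying distribution), F is that distribution, and δ_x is a point mass at x; a bounded influence function indicates that the estimator is insensitive to contamination at x.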

Book Dimension Reduction in Regression Analysis

Download or read book Dimension Reduction in Regression Analysis written by Zhishen Ye and published by . This book was released on 2001 with total page 282 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Robust Statistics

    Book Details:
  • Author : Ricardo A. Maronna
  • Publisher : John Wiley & Sons
  • Release : 2019-01-04
  • ISBN : 1119214688
  • Pages : 466 pages

Download or read book Robust Statistics written by Ricardo A. Maronna and published by John Wiley & Sons. This book was released on 2019-01-04 with total page 466 pages. Available in PDF, EPUB and Kindle. Book excerpt: A new edition of this popular text on robust statistics, thoroughly updated to include new and improved methods and a focus on implementing the methodology using the increasingly popular open-source software R. Classical statistics fails to cope well with outliers associated with deviations from standard distributions. Robust statistical methods take these deviations into account when estimating the parameters of parametric models, thus increasing the reliability of fitted models and associated inference. This new, second edition of Robust Statistics: Theory and Methods (with R) presents broad coverage of the theory of robust statistics, integrated with computing methods and applications. Updated to include important new research results of the last decade and focused on the use of the popular software package R, it features in-depth coverage of the key methodology, including regression, multivariate analysis, and time series modeling. The book is illustrated throughout by a range of examples and applications that are supported by a companion website featuring data sets and R code that allow the reader to reproduce the examples given in the book. Unlike other books on the market, Robust Statistics: Theory and Methods (with R) offers the most comprehensive, definitive, and up-to-date treatment of the subject. It features chapters on estimating location and scale; measuring robustness; linear regression with fixed and with random predictors; multivariate analysis; generalized linear models; time series; numerical algorithms; and asymptotic theory of M-estimates. The book:
  • Explains both the use and theoretical justification of robust methods
  • Guides readers in selecting and using the most appropriate robust methods for their problems
  • Features computational algorithms for the core methods
Robust statistics research results of the last decade included in this second edition include: fast deterministic robust regression, finite-sample robustness, robust regularized regression, robust location and scatter estimation with missing data, robust estimation with independent outliers in variables, and robust mixed linear models. Robust Statistics aims to stimulate the use of robust methods as a powerful tool to increase the reliability and accuracy of statistical modelling and data analysis. It is an ideal resource for researchers, practitioners, and graduate students in statistics, engineering, computer science, and the physical and social sciences.
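
As a small illustration of the M-estimation machinery covered in the location-and-scale chapters, here is a sketch of a Huber M-estimate of location computed by iteratively reweighted least squares in Python. It is a generic textbook construction, not code from the book or its companion website; the tuning constant 1.345 is the usual choice giving roughly 95% efficiency at the normal model.

```python
import numpy as np

def huber_location(x, c=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted least squares."""
    mu = np.median(x)                             # robust starting value
    scale = np.median(np.abs(x - mu)) / 0.6745    # MAD, rescaled for normal consistency
    for _ in range(max_iter):
        r = (x - mu) / scale
        # Huber weights: 1 for small residuals, downweight |r| > c
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

# Toy usage: the sample mean is pulled by outliers, the M-estimate is not
x = np.concatenate([np.random.default_rng(3).normal(size=100), [50.0, 60.0]])
print(np.mean(x), huber_location(x))
```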

Book Sliced Regression for Dimension Reduction

Download or read book Sliced Regression for Dimension Reduction written by Hansheng Wang and published by . This book was released on 2008 with total page pages. Available in PDF, EPUB and Kindle. Book excerpt: By slicing the region of the response (Li, 1991, SIR) and applying local kernel regression (Xia et al., 2002, MAVE) to each slice, a new dimension reduction method is proposed. Compared with traditional inverse regression methods, e.g. sliced inverse regression (Li, 1991), the new method is free of the linearity condition (Li, 1991) and enjoys much improved estimation accuracy. Compared with direct estimation methods (e.g., MAVE), the new method is much more robust against extreme values and can capture the entire central subspace (Cook, 1998b, CS) exhaustively. To determine the CS dimension, a consistent cross-validation (CV) criterion is developed. Extensive numerical studies, including one real example, confirm our theoretical findings.
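
The local kernel regression step that this abstract borrows from MAVE can be illustrated with its simpler relative, the outer-product-of-gradients (OPG) estimator: fit a weighted local linear regression around each point, and the principal directions of the averaged gradient outer products span the central subspace. The Python sketch below (NumPy only) shows this building block on the whole sample rather than slice by slice, so it illustrates the idea, not the paper's sliced-regression estimator; the bandwidth rule and all names are assumptions.

```python
import numpy as np

def opg_directions(X, y, d=1, h=None):
    """OPG sketch: local linear fits give gradient estimates whose
    principal directions span the central subspace."""
    n, p = X.shape
    if h is None:
        h = 2.0 * n ** (-1.0 / (p + 4))   # rough rule-of-thumb bandwidth
    Xs = (X - X.mean(0)) / X.std(0)       # work on standardized predictors
    M = np.zeros((p, p))
    for i in range(n):
        diff = Xs - Xs[i]
        w = np.exp(-0.5 * np.sum(diff ** 2, axis=1) / h ** 2)  # Gaussian weights
        D = np.hstack([np.ones((n, 1)), diff])                 # intercept + slopes
        WD = D * w[:, None]
        # Weighted least squares: solve (D' W D) beta = D' W y
        beta = np.linalg.lstsq(WD.T @ D, WD.T @ y, rcond=None)[0]
        g = beta[1:]                                           # local gradient estimate
        M += np.outer(g, g)
    w_, v = np.linalg.eigh(M)
    return v[:, ::-1][:, :d]   # directions on the standardized-predictor scale

# Toy usage: y depends on one direction of a 4-dim covariate
rng = np.random.default_rng(4)
X = rng.normal(size=(300, 4))
y = np.sin(X[:, 0] - X[:, 1]) + 0.1 * rng.normal(size=300)
print(opg_directions(X, y, d=1).ravel())  # roughly proportional to (1, -1, 0, 0)
```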

Book Statistical Methods in Molecular Biology

Download or read book Statistical Methods in Molecular Biology written by Heejung Bang and published by Humana. This book was released on 2016-08-23 with total page 636 pages. Available in PDF, EPUB and Kindle. Book excerpt: This book presents the basic principles of proper statistical analysis and progresses to more advanced statistical methods, in response to rapidly developing technologies and methodologies in the field of molecular biology.

Book Analysis of Sparse Sufficient Dimension Reduction Models

Download or read book Analysis of Sparse Sufficient Dimension Reduction Models written by Yeshan Withanage and published by . This book was released on 2022 with total page 0 pages. Available in PDF, EPUB and Kindle. Book excerpt: Sufficient dimension reduction (SDR) in regression analysis with response variable y and predictor vector x focuses on reducing the dimension of x to a small number of linear combinations of its components. Since the introduction of the inverse regression method, SDR has become a very active topic in the literature. When the dimension p of x increases with the number of observations n, traditional SDR methods may not perform well. The purpose of this study is twofold, theoretical and empirical. In the theoretical analysis, I provide a proof of the consistency of a variable selection procedure in sparse single-index models (a special SDR model) through an inverse regression method called CUME. For the case of multiple linear regression, I obtain the influence functions for estimators of the parameter vector with SCAD and MCP penalties by extending the idea of the LASSO influence function. On the empirical side, I combine the LASSO-SIR algorithm with the influence function of the LASSO to construct a new metric for choosing the penalty parameter in variable selection, as an alternative to the usual cross-validation method. The empirical analysis shows that the newly proposed influence function-based measure outperforms traditional cross-validation in a wide range of settings. Finally, I also propose an algorithm to estimate the structural dimension d of SDR models with large dimension p.

Book On Sufficient Dimension Reduction Via Asymmetric Least Squares

Download or read book On Sufficient Dimension Reduction Via Asymmetric Least Squares written by Abdul-Nasah Soale and published by . This book was released on 2021 with total page 76 pages. Available in PDF, EPUB and Kindle. Book excerpt: Accompanying the advances in computer technology is an increasing collection of high-dimensional data in many scientific and social studies. Sufficient dimension reduction (SDR) is a statistical method that enables us to reduce the dimension of predictors without loss of regression information. In this dissertation, we introduce principal asymmetric least squares (PALS) as a unified framework for linear and nonlinear sufficient dimension reduction. Classical methods such as sliced inverse regression (Li, 1991) and principal support vector machines (Li, Artemiou and Li, 2011) often do not perform well in the presence of heteroscedastic error; our proposal addresses this limitation by synthesizing different expectile levels. Through extensive numerical studies, we demonstrate the superior performance of PALS in terms of both computation time and estimation accuracy. For the asymptotic analysis of PALS for linear sufficient dimension reduction, we develop new tools to compute the derivative of an expectation of a non-Lipschitz function. PALS is not designed to handle a symmetric link function between the response and the predictors. As a remedy, we develop expectile-assisted inverse regression estimation (EA-IRE) as a unified framework for moment-based inverse regression. We propose to first estimate the expectiles through kernel expectile regression, and then carry out dimension reduction based on random projections of the regression expectiles. Several popular inverse regression methods in the literature, including sliced inverse regression, sliced average variance estimation, and directional regression, are extended under this general framework. The proposed expectile-assisted methods outperform existing moment-based dimension reduction methods in both numerical studies and an analysis of the Big Mac data.
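
For readers unfamiliar with expectiles, the asymmetric least squares loss at level τ in (0, 1) that underlies PALS is the following standard definition (textbook notation, not copied from the dissertation):

```latex
\rho_{\tau}(r) \;=\; \bigl|\,\tau - \mathbb{1}\{r < 0\}\,\bigr|\; r^{2}
```

Minimizing the expected loss over a constant gives the τ-expectile; τ = 1/2 recovers ordinary least squares and the mean, while other levels weight positive and negative residuals asymmetrically, which is what lets a synthesis of several τ levels pick up heteroscedastic structure.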