EBookClubs

Read Books & Download eBooks Full Online

Book Handbook of Bayesian Variable Selection

Download or read book Handbook of Bayesian Variable Selection written by Mahlet G. Tadesse and published by CRC Press. This book was released on 2021-12-24 with total page 762 pages. Available in PDF, EPUB and Kindle. Book excerpt: Bayesian variable selection has experienced substantial developments over the past 30 years with the proliferation of large data sets. Identifying relevant variables to include in a model allows simpler interpretation, avoids overfitting and multicollinearity, and can provide insights into the mechanisms underlying an observed phenomenon. Variable selection is especially important when the number of potential predictors is substantially larger than the sample size and sparsity can reasonably be assumed. The Handbook of Bayesian Variable Selection provides a comprehensive review of theoretical, methodological and computational aspects of Bayesian methods for variable selection. The topics covered include spike-and-slab priors, continuous shrinkage priors, Bayes factors, Bayesian model averaging, partitioning methods, as well as variable selection in decision trees and edge selection in graphical models. The handbook targets graduate students and established researchers who seek to understand the latest developments in the field. It also provides a valuable reference for all interested in applying existing methods and/or pursuing methodological extensions. Features:
- Provides a comprehensive review of methods and applications of Bayesian variable selection.
- Divided into four parts: Spike-and-Slab Priors; Continuous Shrinkage Priors; Extensions to Various Modeling; Other Approaches to Bayesian Variable Selection.
- Covers theoretical and methodological aspects, as well as worked-out examples with R code provided in the online supplement.
- Includes contributions by experts in the field.
- Supported by a website with code, data, and other supplementary material.

Book Bayesian Variable Selection with Spike and slab Priors

Download or read book Bayesian Variable Selection with Spike and slab Priors written by Anjali Agarwal and published by . This book was released on 2016 with total page 90 pages. Available in PDF, EPUB and Kindle. Book excerpt: A major focus of intensive methodological research in recent times has been on knowledge extraction from high-dimensional datasets made available by advances in research technologies. Coupled with the growing popularity of Bayesian methods in statistical analysis, a range of new techniques have evolved that allow innovative model-building and inference in high-dimensional settings – an important one among these being Bayesian variable selection (BVS). The broad goal of this thesis is to explore different BVS methods and demonstrate their application in high-dimensional psychological data analysis. In particular, the focus will be on a class of sparsity-enforcing priors called 'spike-and-slab' priors, which are mixture priors on regression coefficients with density functions that are peaked at zero (the 'spike') and also have large probability mass for a wide range of non-zero values (the 'slab'). It is demonstrated that BVS with spike-and-slab priors achieved a reasonable degree of dimensionality reduction when applied to a psychiatric dataset in a logistic regression setup. BVS performance was also compared to that of LASSO (least absolute shrinkage and selection operator), a popular machine-learning technique, as reported in Ahn et al. (2016). The findings indicate that BVS with a spike-and-slab prior provides a competitive alternative to machine-learning methods, with the additional advantages of ease of interpretation and potential to handle more complex models. In conclusion, this thesis serves to add a new cutting-edge technique to the lab’s tool-shed and helps introduce Bayesian variable selection to researchers in Cognitive Psychology, where it still remains relatively unexplored as a dimensionality-reduction tool.
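The 'spike-and-slab' construction described in this excerpt can be made concrete in a few lines of code. The sketch below is illustrative only (the mixture weight and the spike/slab scales are invented, not taken from the thesis): under a conjugate normal model it computes the posterior probability that a coefficient estimate came from the slab rather than the spike, which is the basic quantity BVS uses to include or drop a variable.

```python
import numpy as np

# Illustrative continuous spike-and-slab prior on a coefficient beta:
#   beta ~ w * N(0, slab_sd^2) + (1 - w) * N(0, spike_sd^2)
# (w, spike_sd, slab_sd are made-up hyperparameters.)
w, spike_sd, slab_sd = 0.5, 0.01, 2.0

def normal_pdf(x, sd):
    return np.exp(-0.5 * (x / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def inclusion_prob(beta_hat, se):
    """Posterior probability that beta came from the slab, given an
    estimate beta_hat with standard error se. Marginally under each
    component, beta_hat ~ N(0, component_var + se^2)."""
    m_slab = normal_pdf(beta_hat, np.sqrt(slab_sd**2 + se**2))
    m_spike = normal_pdf(beta_hat, np.sqrt(spike_sd**2 + se**2))
    return w * m_slab / (w * m_slab + (1 - w) * m_spike)

# A clearly nonzero estimate gets a high inclusion probability,
# a near-zero one gets a low probability.
print(inclusion_prob(1.5, 0.1))    # close to 1: keep the variable
print(inclusion_prob(0.005, 0.1))  # close to 0: drop the variable
```

In a full BVS run these inclusion probabilities come out of an MCMC sampler rather than a closed form, but the mixture-of-marginals calculation is the same idea.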

Book Handbook of Bayesian Variable Selection

Download or read book Handbook of Bayesian Variable Selection written by Mahlet Tadesse and published by . This book was released on 2021-12. Available in PDF, EPUB and Kindle. Book excerpt: "Bayesian variable selection has experienced substantial developments over the past 30 years with the proliferation of large data sets. Identifying relevant variables to include in a model allows simpler interpretation, avoids overfitting and multicollinearity, and can provide insights into the mechanisms underlying an observed phenomenon. Variable selection is especially important when the number of potential predictors is substantially larger than the sample size and sparsity can reasonably be assumed. The Handbook of Bayesian Variable Selection provides a comprehensive review of theoretical, methodological and computational aspects of Bayesian methods for variable selection. The topics covered include spike-and-slab priors, continuous shrinkage priors, Bayes factors, Bayesian model averaging, partitioning methods, as well as variable selection in decision trees and edge selection in graphical models. The handbook targets graduate students and established researchers who seek to understand the latest developments in the field. It also provides a valuable reference for all interested in applying existing methods and/or pursuing methodological extensions"--

Book Bayesian Variable Selection and Estimation

Download or read book Bayesian Variable Selection and Estimation written by Xiaofan Xu and published by . This book was released on 2014 with total page 76 pages. Available in PDF, EPUB and Kindle. Book excerpt: The paper considers the classical Bayesian variable selection problem and an important subproblem in which grouping information of predictors is available. We propose the Half Thresholding (HT) estimator for simultaneous variable selection and estimation with shrinkage priors. Under an orthogonal design matrix, variable selection consistency and the asymptotic distribution of HT estimators are investigated, and the oracle property is established with Three Parameter Beta Mixture of Normals (TPBN) priors. We then revisit the Bayesian group lasso and use spike-and-slab priors for variable selection at the group level. In the process, the connection of our model with penalized regression is demonstrated, and the role of the posterior median for thresholding is pointed out. We show that the posterior median estimator has the oracle property for group variable selection and estimation under an orthogonal design, while the group lasso has a suboptimal asymptotic estimation rate when variable selection consistency is achieved. Next we consider the Bayesian sparse group lasso, again with spike-and-slab priors, to select variables both at the group level and within groups, and develop the necessary algorithm for its implementation. We demonstrate via simulation that the posterior median estimator of our spike-and-slab models has excellent performance for both variable selection and estimation.
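The "role of the posterior median for thresholding" mentioned above can be illustrated numerically: under a point-mass spike-and-slab prior, the posterior of a coefficient is a mixture of a point mass at zero and a Gaussian, so the posterior median is exactly zero whenever the point mass straddles the 0.5 quantile. The sketch below uses a single-observation normal model with invented hyperparameters (not taken from the thesis):

```python
from math import sqrt
from statistics import NormalDist

N01 = NormalDist()

# Point-mass spike-and-slab posterior for one coefficient, observed via
#   y = beta + noise, noise ~ N(0, s^2),
#   beta ~ (1 - w) * delta_0 + w * N(0, tau^2).
# (w, tau, s are illustrative hyperparameters.)
w, tau, s = 0.5, 3.0, 1.0

def posterior_median(y):
    # Marginal likelihoods of y under the slab and spike components.
    m_slab = N01.pdf(y / sqrt(tau**2 + s**2)) / sqrt(tau**2 + s**2)
    m_spike = N01.pdf(y / s) / s
    p = w * m_slab / (w * m_slab + (1 - w) * m_spike)  # inclusion prob.
    v = 1.0 / (1.0 / tau**2 + 1.0 / s**2)              # slab posterior var.
    m = v * y / s**2                                   # slab posterior mean.
    lo = p * N01.cdf((0.0 - m) / sqrt(v))  # posterior mass strictly below 0
    # The point mass at zero makes the median EXACTLY zero whenever it
    # straddles the 0.5 quantile: this is the thresholding rule.
    if lo < 0.5 <= lo + (1 - p):
        return 0.0
    if lo >= 0.5:  # median falls in the negative slab region
        return m + sqrt(v) * N01.inv_cdf(0.5 / p)
    # median falls in the positive slab region
    return m + sqrt(v) * N01.inv_cdf((0.5 - (1 - p)) / p)

print(posterior_median(0.3))  # weak signal: selected out (exact zero)
print(posterior_median(4.0))  # strong signal: shrunken nonzero estimate
```

The same mechanism, applied blockwise, is what gives the group-level posterior median estimator its selection property.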

Book Economic Analysis of the Digital Economy

Download or read book Economic Analysis of the Digital Economy written by Avi Goldfarb and published by University of Chicago Press. This book was released on 2015-05-08 with total page 510 pages. Available in PDF, EPUB and Kindle. Book excerpt: There is a small and growing literature that explores the impact of digitization in a variety of contexts, but its economic consequences, surprisingly, remain poorly understood. This volume aims to set the agenda for research in the economics of digitization, with each chapter identifying a promising area of research. "Economics of Digitization" identifies urgent topics with research already underway that warrant further exploration from economists. In addition to the growing importance of digitization itself, digital technologies have some features that suggest that many well-studied economic models may not apply and, indeed, that many aspects of the digital economy throw normal economics for a loop. "Economics of Digitization" will be one of the first volumes to focus on the economic implications of digitization and to bring together leading scholars in the economics of digitization to explore emerging research.

Book Bayesian Variable Selection and Functional Data Analysis

Download or read book Bayesian Variable Selection and Functional Data Analysis written by Asish Kumar Banik and published by . This book was released on 2019 with total page 157 pages. Available in PDF, EPUB and Kindle. Book excerpt: High-dimensional statistics is one of the most studied topics in the field of statistics. The most interesting problem to arise in the last 15 years is variable selection or subset selection. Variable selection is a strong statistical tool that can be explored in functional data analysis. In the first part of this thesis, we implement a Bayesian variable selection method for automatic knot selection. We propose a spike-and-slab prior on knots and formulate a conjugate stochastic search variable selection for significant knots. The computation is substantially faster than existing knot selection methods, as we use Metropolis-Hastings algorithms and a Gibbs sampler for estimation. This work focuses on a single nonlinear covariate, modeled as regression splines. In the next stage, we study Bayesian variable selection in additive models with high-dimensional predictors. The selection of nonlinear functions in models is highly important in recent research, and the Bayesian method of selection has more advantages than contemporary frequentist methods. Chapter 2 examines Bayesian sparse group lasso theory based on spike-and-slab priors to determine its applicability for variable selection and function estimation in nonparametric additive models. The primary objective of Chapter 3 is to build a classification method using longitudinal volumetric magnetic resonance imaging (MRI) data from five regions of interest (ROIs). A functional data analysis method is used to handle the longitudinal measurement of ROIs, and the functional coefficients are later used in the classification models. We propose a Pólya-gamma augmentation method to classify normal controls and diseased patients based on functional MRI measurements.
We obtain fast posterior sampling by avoiding the slow and complicated Metropolis-Hastings algorithm. Our main motivation is to determine the important ROIs that have the highest separating power to classify our dichotomous response. We compare the sensitivity, specificity, and accuracy of the classification based on single ROIs and on various combinations of them. We obtain a sensitivity of over 85% and a specificity of around 90% for most of the combinations. Next, we work with Bayesian classification and selection methodology. The main goal of Chapter 4 is to employ longitudinal trajectories in a significant number of sub-regional brain volumetric MRI data as statistical predictors for Alzheimer's disease (AD) classification. We use logistic regression in a Bayesian framework that includes many functional predictors. The direct sampling of regression coefficients from the Bayesian logistic model is difficult due to its complicated likelihood function. In high-dimensional scenarios, the selection of predictors is paramount with the introduction of either spike-and-slab priors, non-local priors, or horseshoe priors. We seek to avoid the complicated Metropolis-Hastings approach and to develop an easily implementable Gibbs sampler. In addition, the Bayesian estimation provides proper estimates of the model parameters, which are also useful for building inference. Another advantage of working with logistic regression is that it calculates the log odds of relative risk for AD compared to normal control based on the selected longitudinal predictors, rather than simply classifying patients based on cross-sectional estimates. Ultimately, however, we combine approaches and use a probability threshold to classify individual patients. We employ 49 functional predictors consisting of volumetric estimates of brain sub-regions, chosen for their established clinical significance.
Moreover, the use of spike-and-slab priors ensures that many redundant predictors are dropped from the model. Finally, we present a new approach of Bayesian model-based clustering for spatiotemporal data in Chapter 5. A simple linear mixed-effects model (LME) derived from a functional model is used to model spatiotemporal cerebral white matter data extracted from healthy aging individuals. The LME provides us with prior information for the spatial covariance structure and brain segmentation based on white matter intensity. This motivates us to build stochastic model-based clustering to group voxels considering their longitudinal and location information. The cluster-specific random effect causes correlation among repeated measures. The problem of finding partitions is dealt with by imposing a prior structure on cluster partitions in order to derive a stochastic objective function.
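The Pólya-gamma augmentation this thesis relies on replaces the intractable logistic likelihood with conditionally Gaussian updates, which is what makes a pure Gibbs sampler possible. The sketch below is a minimal version on synthetic data; the truncated-series Pólya-gamma draw is an approximation used only for illustration (production samplers use Devroye's exact method), and all data and hyperparameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_pg(c, trunc=100):
    """Approximate draw from PG(1, c) via its infinite Gamma convolution,
    truncated at `trunc` terms (a sketch, not an exact sampler)."""
    c = np.atleast_1d(c)
    k = np.arange(1, trunc + 1)
    g = rng.gamma(shape=1.0, scale=1.0, size=(len(c), trunc))
    denom = (k - 0.5) ** 2 + (c[:, None] / (2 * np.pi)) ** 2
    return (g / denom).sum(axis=1) / (2 * np.pi ** 2)

def gibbs_logistic(X, y, n_iter=500, prior_var=100.0):
    """Gibbs sampler for Bayesian logistic regression via Polya-gamma
    augmentation: omega_i ~ PG(1, x_i' beta), then beta | omega is Gaussian."""
    n, p = X.shape
    beta = np.zeros(p)
    kappa = y - 0.5                      # y_i - 1/2 enters the Gaussian mean
    B_inv = np.eye(p) / prior_var        # Gaussian prior precision on beta
    draws = []
    for _ in range(n_iter):
        omega = sample_pg(X @ beta)
        V = np.linalg.inv(X.T * omega @ X + B_inv)
        m = V @ (X.T @ kappa)
        beta = rng.multivariate_normal(m, V)
        draws.append(beta)
    return np.array(draws)

# Synthetic check: recover a known coefficient vector.
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_beta = np.array([-0.5, 1.5])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)
draws = gibbs_logistic(X, y)
print(draws[250:].mean(axis=0))  # posterior means near (-0.5, 1.5)
```

Every conditional here is a standard draw, so no Metropolis-Hastings accept/reject step is needed, which is the point the excerpt makes.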

Book Bayesian Variable Selection

Download or read book Bayesian Variable Selection written by Guiling Shi and published by . This book was released on 2017 with total page 100 pages. Available in PDF, EPUB and Kindle. Book excerpt:

Book Bayesian Selection Model with Shrinking Priors for Nonignorable Missingness

Download or read book Bayesian Selection Model with Shrinking Priors for Nonignorable Missingness written by Juan Diego Vera and published by . This book was released on 2023 with total page 0 pages. Available in PDF, EPUB and Kindle. Book excerpt: This study investigates the effectiveness of Bayesian variable selection (BVS) procedures in dealing with missing not at random (MNAR) data for identification in selection models. Three BVS-adapted selection models, namely the Bayesian LASSO, the horseshoe prior, and the spike-and-slab prior, were compared, along with established missing data methods such as a model that assumes a missing at random (MAR) process and a full-selection model. The results indicate that the spike-and-slab prior consistently outperformed the other BVS methods in terms of accuracy and bias for various parameters, including slope estimates, residual variance, and intercept. When compared with the full-selection model, the spike-and-slab model exhibited superior performance across all parameters based on mean squared error (MSE) results. Although the MAR and spike-and-slab models showed comparable performance for slope estimates, the spike-and-slab model consistently outperformed the MAR model in estimating residual variance and intercept. This comparable performance is attributed to the bias-variance tradeoff. The MAR model, while biased, demonstrated efficiency by estimating fewer parameters than selection models and obtaining robust support from the observed data. On the other hand, the spike-and-slab model outperformed the full-selection model, even when the full-selection model aligned with the true data-generating model. The adaptation of BVS to selection models, particularly through the spike-and-slab method, yielded promising results with unbiased estimates under various conditions. However, it is important to acknowledge that this study represents an initial exploration of this subject, and its scope was inherently limited.
Finally, the BVS adaptations to the selection model were illustrated with data from a clinical-trial study.

Book Bayesian Variable Selection with Applications to Neuroimaging Data

Download or read book Bayesian Variable Selection with Applications to Neuroimaging Data written by Shariq Mohammed and published by . This book was released on 2018 with total page 135 pages. Available in PDF, EPUB and Kindle. Book excerpt: In this dissertation, we discuss Bayesian modeling approaches for identifying brain regions that respond to a certain stimulus and use them to classify subjects. We specifically deal with multi-subject electroencephalography (EEG) data where the responses are binary, and the covariates are matrices, with measurements taken for each subject at different locations across multiple time points. EEG data has a complex structure with both spatial and temporal attributes to it. We use a divide-and-conquer strategy to build multiple local models, that is, one model at each time point, both to avoid the curse of dimensionality and to achieve computational feasibility. Within each local model, we use Bayesian variable selection approaches to identify the locations which respond to a stimulus. We use a continuous spike-and-slab prior, which has inherent variable selection properties. We initially demonstrate the local Bayesian modeling approach, which is computationally inexpensive, as the estimation for each local model can be conducted in parallel. We use MCMC sampling procedures for parameter estimation. We also discuss a two-stage variable selection approach based on thresholding using the complexity parameter built within the model. A prediction strategy is built utilizing the temporal structure between local models. The spatial correlation is incorporated within the local Bayesian modeling to improve the inference. The temporal characteristic of the data is incorporated through the prior structure by learning from the local models estimated at previous time points. Variable selection is done via clustering of the locations based on their activation time.
We then use a weighted prediction strategy to pool information from the local spatial models to make a final prediction. Since the EEG data has both spatial and temporal correlations acting simultaneously, we enrich our local Bayesian modeling by incorporating both correlations through a Kronecker product of the spatial and temporal correlation structures. We develop a highly scalable estimation approach to deal with the extremely large number of parameters in the model. We demonstrate the efficiency of estimation using the scalable algorithm by performing simulation studies. We also study the performance of these models through a case study on multi-subject EEG data.
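The Kronecker-product correlation structure mentioned above is what makes the estimation scalable: the identity (A kron B) vec(V) = vec(B V A^T) lets computation run on the small spatial and temporal factors instead of the huge joint matrix. A small numerical check of that identity, with illustrative AR(1) correlation matrices standing in for the spatial and temporal structures:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative spatial (S x S) and temporal (T x T) correlation matrices;
# AR(1) structure is used only to have something concrete.
S, T = 6, 5
def ar1_corr(n, rho):
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

A = ar1_corr(S, 0.7)  # "spatial" correlation
B = ar1_corr(T, 0.9)  # "temporal" correlation

# Kronecker identity: (A kron B) @ vec(V) == vec(B @ V @ A.T),
# so the (S*T) x (S*T) joint matrix never has to be formed.
V = rng.normal(size=(T, S))
v = V.flatten(order="F")  # vec() stacks columns

slow = np.kron(A, B) @ v
fast = (B @ V @ A.T).flatten(order="F")
print(np.allclose(slow, fast))  # True
```

The slow path costs O(S^2 T^2) memory; the fast path only ever touches the S x S and T x T factors, which is the kind of saving a "highly scalable estimation approach" exploits.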

Book Bayesian Variable Selection for Non Gaussian Data Using Global Local Shrinkage Priors and the Multivariate Logit Beta Distribution

Download or read book Bayesian Variable Selection for Non Gaussian Data Using Global Local Shrinkage Priors and the Multivariate Logit Beta Distribution written by Hongyu Wu and published by . This book was released on 2022 with total page 0 pages. Available in PDF, EPUB and Kindle. Book excerpt: Variable selection has become an important and growing problem in Bayesian analysis. The literature on Bayesian variable selection methods tends to be applied to a single response-type, and more typically, a continuous response-type, where it is assumed that the data is Gaussian/symmetric. In this dissertation, we develop a novel global-local shrinkage prior in non-symmetric settings and multiple response-types settings by combining the perspectives of global-local shrinkage and the conjugate multivariate distribution. In Chapter 2, we focus on the problem of variable selection when the data is possibly non-symmetric continuous-valued. We propose modeling continuous-valued data and the coefficient vector with the multivariate logit-beta (MLB) distribution. To perform variable selection in a Bayesian context we make use of global-local shrinkage priors to enforce sparsity. Specifically, they can be defined as a Gaussian scale mixture of a global shrinkage parameter and a local shrinkage parameter for a regression coefficient. We provide a technical discussion that illustrates that our use of the multivariate logit-beta distribution under a Pólya-Gamma augmentation scheme has an explicit connection to a well-known global-local shrinkage method (i.e., the horseshoe prior) and extends it to possibly non-symmetric data. Moreover, our method can be implemented using an efficient block Gibbs sampler. Evidence of improvements in terms of mean squared error and variable selection as compared to the standard implementation of the horseshoe prior for skewed data settings is provided in simulated and real data examples.
In Chapter 3, we direct our attention to the canonical variable selection problem in multiple response-types settings, where the observed dataset consists of multiple response-types (e.g., continuous, count-valued, Bernoulli trials, etc.). We propose the same global-local shrinkage prior as in Chapter 2, but for multiple response-types datasets. The implementation of our Bayesian variable selection method for such data types is straightforward given that the multivariate logit-beta prior is the conjugate prior for several members of the natural exponential family of distributions, which leads to the binomial/beta and negative binomial/beta hierarchical models. Our proposed model allows not only the estimation and selection of independent regression coefficients, but also of shared regression coefficients across response-types, which can be used to explicitly model dependence in spatial and time-series settings. An efficient block Gibbs sampler is developed, which is found to be effective in obtaining accurate estimates and variable selection results in simulation studies and an analysis of public health and financial costs from natural disasters in the U.S.
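The "Gaussian scale mixture of a global shrinkage parameter and a local shrinkage parameter" described in this excerpt can be sampled directly. The sketch below draws from the horseshoe form of that construction (the fixed global scale tau is an illustrative choice, not a value from the dissertation) and shows the two properties the mixture is designed for: heavy mass near zero and heavy tails.

```python
import numpy as np

rng = np.random.default_rng(3)

# Global-local scale-mixture prior draw (horseshoe form):
#   beta_j | lambda_j, tau ~ N(0, tau^2 * lambda_j^2)
#   lambda_j ~ half-Cauchy(0, 1)      (local scales)
#   tau fixed here purely for illustration (global scale)
def horseshoe_prior_draws(p, tau, size):
    lam = np.abs(rng.standard_cauchy(size=(size, p)))  # local scales
    return rng.normal(scale=tau * lam)                 # Gaussian scale mixture

draws = horseshoe_prior_draws(p=1, tau=0.1, size=100_000).ravel()

# The mixture concentrates near zero (noise coefficients are shrunk hard)
# while keeping heavy tails (large signals escape shrinkage).
print(np.mean(np.abs(draws) < 0.05))  # large mass in a tiny neighborhood of 0
print(np.mean(np.abs(draws) > 5.0))   # yet non-negligible far-tail mass
```

In a full model the global scale tau and every local scale lambda_j get their own conditional updates inside the Gibbs sampler; fixing tau here just isolates the shape of the prior.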

Book Gaussian Markov Random Fields

Download or read book Gaussian Markov Random Fields written by Havard Rue and published by CRC Press. This book was released on 2005-02-18 with total page 280 pages. Available in PDF, EPUB and Kindle. Book excerpt: Gaussian Markov Random Field (GMRF) models are most widely used in spatial statistics - a very active area of research in which few up-to-date reference works are available. This is the first book on the subject that provides a unified framework of GMRFs with particular emphasis on the computational aspects. This book includes extensive case studies.
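The computational emphasis of the GMRF framework rests on one fact: the Markov property makes the precision (inverse covariance) matrix sparse even when the covariance itself is dense, so sparse linear algebra applies. A small check for a stationary AR(1) process, whose dense covariance has a tridiagonal inverse (the size and correlation below are invented for illustration):

```python
import numpy as np

# For a stationary AR(1) process with unit innovation variance, the dense
# covariance Sigma_ij = rho^|i-j| / (1 - rho^2) has a TRIDIAGONAL inverse,
# which is exactly the sparsity that GMRF computation exploits.
n, rho = 8, 0.6
idx = np.arange(n)
Sigma = rho ** np.abs(idx[:, None] - idx[None, :]) / (1 - rho**2)

Q = np.zeros((n, n))                  # precision built directly from the
np.fill_diagonal(Q, 1 + rho**2)       # Markov (neighbor) structure
Q[0, 0] = Q[-1, -1] = 1.0             # boundary corrections
Q[idx[:-1], idx[1:]] = Q[idx[1:], idx[:-1]] = -rho

print(np.allclose(np.linalg.inv(Sigma), Q))  # True: the inverse is tridiagonal
```

With the sparse Q in hand, sampling and likelihood evaluation reduce to sparse Cholesky factorizations, which is the unifying computational theme of the book.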

Book Consistent Bayesian Learning for Neural Network Models

Download or read book Consistent Bayesian Learning for Neural Network Models written by Sanket Rajendra Jantre and published by . This book was released on 2022 with total page 0 pages. Available in PDF, EPUB and Kindle. Book excerpt: Bayesian neural networks, the Bayesian framework adapted to neural network learning, have received widespread attention and been successfully applied to various applications. Bayesian inference for neural networks promises improved predictions with reliable uncertainty estimates, robustness, principled model comparison, and decision-making under uncertainty. In this dissertation, we propose novel, theoretically consistent Bayesian neural network models and provide computationally efficient posterior inference algorithms for them. In Chapter 2, we introduce a Bayesian quantile regression neural network assuming an asymmetric Laplace distribution for the response variable. The normal-exponential mixture representation of the asymmetric Laplace density is utilized to derive a Gibbs sampler coupled with a Metropolis-Hastings algorithm for the posterior inference. We establish posterior consistency under a misspecified asymmetric Laplace density model. We illustrate the proposed method with simulation studies and real data examples. Traditional Bayesian learning methods are limited in their scalability to large data and feature spaces due to expensive inference approaches; however, recent developments in variational inference techniques and sparse learning have brought renewed interest to this area. Sparse deep neural networks have proven to be efficient for predictive model building in large-scale studies. Although several works have studied theoretical and numerical properties of sparse neural architectures, they have primarily focused on edge selection. In Chapter 3, we propose a sparse Bayesian technique using a spike-and-slab Gaussian prior to allow automatic node selection.
The spike-and-slab prior alleviates the need for an ad-hoc thresholding rule for pruning. In addition, we adopt a variational Bayes approach to circumvent the computational challenges of a traditional Markov chain Monte Carlo implementation. In the context of node selection, we establish the variational posterior consistency together with a layer-wise characterization of prior inclusion probabilities. We empirically demonstrate that our proposed approach outperforms the edge selection method in computational complexity with similar or better predictive performance. The structured sparsity (e.g., node sparsity) in deep neural networks provides low-latency inference, higher data throughput, and reduced energy consumption. There is also a vast and growing literature demonstrating the shrinkage efficiency and theoretical optimality in linear models of two sparse parameter estimation techniques: the lasso and the horseshoe. In Chapter 4, we propose structurally sparse Bayesian neural networks which systematically prune excessive nodes with (i) Spike-and-Slab Group Lasso and (ii) Spike-and-Slab Group Horseshoe priors, and develop computationally tractable variational inference. We demonstrate the competitive performance of our proposed models compared to the Bayesian baseline models in prediction accuracy, model compression, and inference latency. Deep neural network ensembles that appeal to model diversity have been used successfully to improve predictive performance and model robustness in several applications. However, most ensembling techniques require multiple parallel and costly evaluations and have been proposed primarily with deterministic models. In Chapter 5, we propose sequential ensembling of dynamic Bayesian neural subnetworks to generate a diverse ensemble in a single forward pass.
The ensembling strategy consists of an exploration phase that finds high-performing regions of the parameter space and multiple exploitation phases that effectively exploit the compactness of the sparse model to quickly converge to different minima in the energy landscape corresponding to high-performing subnetworks yielding diverse ensembles. We empirically demonstrate that our proposed approach surpasses the baselines of the dense frequentist and Bayesian ensemble models in prediction accuracy, uncertainty estimation, and out-of-distribution robustness. Furthermore, we found that our approach produced the most diverse ensembles compared to the approaches with a single forward pass and even compared to the approaches with multiple forward passes in some cases.
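Chapter 2's Gibbs sampler hinges on the normal-exponential mixture representation of the asymmetric Laplace distribution. The sketch below simulates that representation in the Kozumi-Kobayashi parameterization (an assumption on my part, since the excerpt does not name one) and checks the property that makes it useful for quantile regression: mu is the p-th quantile of the response.

```python
import numpy as np

rng = np.random.default_rng(4)

# Normal-exponential mixture representation of the asymmetric Laplace
# distribution (Kozumi-Kobayashi form, assumed here for illustration):
#   y = mu + theta * z + tau * sqrt(z) * u,  z ~ Exp(1), u ~ N(0, 1),
# with theta = (1 - 2p) / (p (1 - p)) and tau^2 = 2 / (p (1 - p)),
# which makes mu the p-th quantile of y.
def asymmetric_laplace_draws(mu, p, size):
    theta = (1 - 2 * p) / (p * (1 - p))
    tau = np.sqrt(2 / (p * (1 - p)))
    z = rng.exponential(size=size)   # exponential mixing variable
    u = rng.normal(size=size)        # conditionally Gaussian part
    return mu + theta * z + tau * np.sqrt(z) * u

p = 0.25
y = asymmetric_laplace_draws(mu=2.0, p=p, size=200_000)
print(np.mean(y <= 2.0))  # close to p = 0.25: mu is the 0.25-quantile
```

Because y is conditionally Gaussian given z, the regression coefficients get a Gaussian full conditional, which is what makes the Gibbs step in the dissertation tractable.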

Book Flexible Bayesian Regression Modelling

Download or read book Flexible Bayesian Regression Modelling written by Yanan Fan and published by Academic Press. This book was released on 2019-10-30 with total page 302 pages. Available in PDF, EPUB and Kindle. Book excerpt: Flexible Bayesian Regression Modeling is a step-by-step guide to the Bayesian revolution in regression modeling, for use in advanced econometric and statistical analysis where datasets are characterized by complexity, multiplicity, and large sample sizes, necessitating the need for considerable flexibility in modeling techniques. It reviews three forms of flexibility: methods which provide flexibility in their error distribution; methods which model non-central parts of the distribution (such as quantile regression); and finally models that allow the mean function to be flexible (such as spline models). Each chapter discusses the key aspects of fitting a regression model. R programs accompany the methods. This book is particularly relevant to non-specialist practitioners with intermediate mathematical training seeking to apply Bayesian approaches in economics, biology, finance, engineering and medicine.
- Introduces powerful new nonparametric Bayesian regression techniques to classically trained practitioners
- Focuses on approaches offering both superior power and methodological flexibility
- Supplemented with instructive and relevant R programs within the text
- Covers linear regression, nonlinear regression and quantile regression techniques
- Provides diverse disciplinary case studies for correlation and optimization problems drawn from Bayesian analysis ‘in the wild’