Chia Chye Yee

Wednesday, October 30, 2013
4:00 AM
438 West Hall

Sparse Bayesian Learning: Regression and Hyperspectral Application

In high-dimensional statistics, frequentist approaches such as the lasso and its derivatives are very successful and well established, both in practice and in terms of asymptotic theory. Meanwhile, Bayesian methods implement different penalization structures through the choice of prior. Each approach has advantages and shortcomings: although computation for the lasso is highly efficient, computing standard errors remains a problem, while Bayesian methods are highly flexible thanks to the diverse set of available priors but typically require MCMC to compute the posterior. The main topic of this talk is sparse Bayesian learning (SBL), an empirical Bayes method introduced by Tipping (2001) that combines elements of both approaches to tackle high-dimensional problems. The presentation covers an analysis of the method's estimation error and its application to hyperspectral unmixing.
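As a minimal sketch of the contrast described above, the snippet below fits both a lasso and a sparse-Bayesian-learning-style estimator to a synthetic sparse regression problem; scikit-learn's ARDRegression is used here as one available empirical Bayes implementation in the spirit of Tipping (2001). The data, dimensions, and regularization settings are illustrative assumptions, not details from the talk.

```python
# Illustrative sketch (not from the talk): lasso vs. a sparse-Bayesian-learning-
# style estimator on synthetic high-dimensional data with a sparse true signal.
import numpy as np
from sklearn.linear_model import Lasso, ARDRegression

rng = np.random.default_rng(0)
n, p, k = 100, 200, 5               # p > n, only k nonzero coefficients (assumed setup)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 3.0                      # true sparse signal
y = X @ beta + 0.5 * rng.standard_normal(n)

lasso = Lasso(alpha=0.1).fit(X, y)  # frequentist penalized estimate
sbl = ARDRegression().fit(X, y)     # empirical Bayes sparse estimate

print("lasso estimation error:", np.linalg.norm(lasso.coef_ - beta))
print("SBL   estimation error:", np.linalg.norm(sbl.coef_ - beta))
# Unlike the lasso, the Bayesian fit also yields an approximate posterior
# covariance of the weights (sbl.sigma_), which can be used for uncertainty
# statements that plain lasso point estimates do not directly provide.
```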