Degree Name

Master of Science


Mathematical Sciences


Sangjin Kim


Variable selection in high-dimensional data has become a challenging problem. We investigate a popular but classical variable screening method, Backward Elimination (BE), in a high-dimensional setup (small n, large P). As a screening method, BE reduces small-n-large-P data to a lower-dimensional data set, to which established shrinkage methods such as the LASSO, SCAD, and MCP can then be applied directly. To select the best candidate model from the models generated by the proposed BE method, we use the Extended Bayesian Information Criterion (EBIC), a family of criteria developed by Chen and Chen (2008) to overcome the problems of high-dimensional data, and shown by them to be selection-consistent with good finite-sample properties. We compare BE with other screening methods, namely Sure Independence Screening (SIS), Iterative Sure Independence Screening (ISIS), and Forward Regression (FR), in simulation studies and real-data analyses to illustrate the selection consistency of the proposed BE method. Our numerical analysis reveals that BE with EBIC identifies all important variables with high coverage probability, a low false discovery rate, and a good model size when the signal-to-noise ratio is high.
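The procedure the abstract describes, backward elimination scored by EBIC, can be illustrated with a minimal sketch. This is not the thesis's exact algorithm: it assumes the candidate set is small enough for ordinary least squares to be fit (the thesis's screening step handles the P > n case), and the function names and the choice gamma = 0.5 are illustrative assumptions. The EBIC formula used is the one from Chen and Chen (2008): n log(RSS/n) + k log(n) + 2 gamma log C(P, k).

```python
import numpy as np
from math import lgamma, log

def log_binom(p, k):
    # log of the binomial coefficient C(p, k), via log-gamma for stability
    return lgamma(p + 1) - lgamma(k + 1) - lgamma(p - k + 1)

def ebic(X, y, subset, p, gamma=0.5):
    # EBIC of Chen and Chen (2008) for a linear model on the given subset:
    #   n*log(RSS/n) + k*log(n) + 2*gamma*log C(p, k)
    n = len(y)
    k = len(subset)
    if k == 0:
        rss = float(np.sum((y - y.mean()) ** 2))
    else:
        Xs = X[:, subset]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = float(np.sum((y - Xs @ beta) ** 2))
    return n * log(rss / n) + k * log(n) + 2 * gamma * log_binom(p, k)

def backward_eliminate(X, y, gamma=0.5):
    # Generic backward elimination: starting from all candidate columns
    # (assumed fittable, i.e. fewer active variables than observations),
    # repeatedly drop the variable whose removal yields the smallest EBIC,
    # and return the visited model with minimum EBIC overall.
    p = X.shape[1]
    active = list(range(p))
    visited = [(list(active), ebic(X, y, active, p, gamma))]
    while len(active) > 1:
        score, worst = min(
            (ebic(X, y, [j for j in active if j != v], p, gamma), v)
            for v in active
        )
        active.remove(worst)
        visited.append((list(active), score))
    return min(visited, key=lambda m: m[1])[0]
```

On simulated data with a few strong true predictors and a high signal-to-noise ratio, the minimum-EBIC model along the elimination path recovers the true support, which mirrors the behavior the abstract reports.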




Received from ProQuest

File Size

82 pages

Rights Holder

Sophia Korkor Foli