Research article · DOI: 10.1145/2808719.2808743

Identification of significant genetic variants via SLOPE, and its extension to group SLOPE

Published: 09 September 2015

Abstract

The method of Sorted L-One Penalized Estimation, abbreviated as SLOPE, is a novel sparse regression method for model selection, introduced in a sequence of recent papers, [4], [3] and [7], by Bogdan, van den Berg, Sabatti, Su and Candes. It estimates the coefficients of a linear model that may have more unknown parameters than observations. In many settings SLOPE has been shown to control the false discovery rate (the proportion of irrelevant predictors among all selected predictors) at a user-specified level. In this paper we evaluate its performance on genetic data and show its superiority over the related and popular LASSO method. In genetic data sets, group structures among the predictor variables are often available as prior knowledge, such as SNPs within a gene or genes within a pathway. Motivated by this, we extend SLOPE in the spirit of Group LASSO to Group SLOPE, a method that can handle the group structures among predictor variables that are ubiquitous in real genetic data. Our simulation results show that the proposed Group SLOPE method is capable of controlling the false discovery rate at a specified level. Moreover, compared to Group LASSO, Group SLOPE in general achieves both higher power and a lower false discovery rate.
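To make the penalties concrete, the sorted-L1 penalty of SLOPE and its group analogue can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' implementation; the function names and the unweighted group-norm variant are our own, and the lambda sequence follows the Benjamini-Hochberg-inspired choice described in [4].

```python
from statistics import NormalDist
import math

def bh_lambda(p, q=0.1):
    """BH-inspired regularizing sequence used by SLOPE [4]:
    lambda_i = Phi^{-1}(1 - i*q/(2p)), which is decreasing in i."""
    ndist = NormalDist()
    return [ndist.inv_cdf(1 - i * q / (2 * p)) for i in range(1, p + 1)]

def slope_penalty(beta, lam):
    """Sorted L-One penalty: the largest |beta_j| is paired with the
    largest lambda_i, the second largest with the second largest, etc."""
    abs_sorted = sorted((abs(b) for b in beta), reverse=True)
    return sum(l * a for l, a in zip(lam, abs_sorted))

def group_slope_penalty(beta, groups, lam):
    """Illustrative Group SLOPE analogue: sort the Euclidean norms of the
    coefficient groups and pair them with the decreasing lambda sequence."""
    norms = sorted((math.sqrt(sum(beta[j] ** 2 for j in g)) for g in groups),
                   reverse=True)
    return sum(l * n for l, n in zip(lam, norms))
```

Setting all lambda_i equal recovers the ordinary LASSO penalty, which is why SLOPE can be viewed as its generalization; the decreasing sequence is what connects SLOPE to Benjamini-Hochberg-style false discovery rate control.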

References

[1] A. Beck and M. Teboulle. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1):183--202, 2009.
[2] Y. Benjamini and Y. Hochberg. Controlling the false discovery rate: a practical and powerful approach to multiple testing. Journal of the Royal Statistical Society, Series B (Methodological), 57(1):289--300, 1995.
[3] M. Bogdan, E. van den Berg, C. Sabatti, W. Su, and E. J. Candes. SLOPE -- adaptive variable selection via convex optimization. ArXiv e-prints, July 2014.
[4] M. Bogdan, E. van den Berg, W. Su, and E. Candes. Statistical estimation and testing via the sorted L1 norm. ArXiv e-prints, Oct. 2013.
[5] P. Breheny and J. Huang. Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors. Statistics and Computing, 25(2):173--187, 2015.
[6] D. Brzyski, M. Bogdan, and W. Su. Group SLOPE -- adaptive selection of groups of predictors. Preprint, Aug. 2015.
[7] E. Candes and W. Su. SLOPE is adaptive to unknown sparsity and asymptotically minimax. ArXiv e-prints, Mar. 2015.
[8] S. Cao, H. Qin, H.-W. Deng, and Y.-P. Wang. A unified sparse representation for sequence variant identification for complex traits. Genetic Epidemiology, 38(8):671--679, 2014.
[9] M. C. Cetin and A. Erar. Variable selection with Akaike information criteria: a comparative study. Hacettepe Journal of Mathematics and Statistics, 31:89--97, 2002.
[10] R.-H. Chung, W.-Y. Tsai, C.-H. Hsieh, K.-Y. Hung, C. A. Hsiung, and E. R. Hauser. SeqSIMLA2: simulating correlated quantitative traits accounting for shared environmental effects in user-specified pedigree structure. Genetic Epidemiology, 39(1):20--24, 2015.
[11] P. L. Combettes and J.-C. Pesquet. Proximal splitting methods in signal processing. In Fixed-Point Algorithms for Inverse Problems in Science and Engineering, pages 185--212. Springer, New York, 2011.
[12] J. Friedman, T. Hastie, and R. Tibshirani. A note on the group lasso and a sparse group lasso. ArXiv e-prints, Jan. 2010.
[13] J. Friedman, T. Hastie, and R. Tibshirani. Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software, 33(1):1--22, 2010.
[14] T. J. Hastie, R. J. Tibshirani, and J. H. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer Series in Statistics. Springer, New York, 2009.
[15] J. Huang, P. Breheny, and S. Ma. A selective review of group selection in high-dimensional models. Statistical Science, 27(4):481--499, 2012.
[16] S. F. Schaffner, C. Foo, S. Gabriel, D. Reich, M. J. Daly, and D. Altshuler. Calibrating a coalescent simulation of human genome sequence variation. Genome Research, 15(11):1576--1583, 2005.
[17] T. Sun and C.-H. Zhang. Scaled sparse linear regression. Biometrika, 99(4):879--898, 2012.
[18] R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58(1):267--288, 1996.
[19] M. Yuan and Y. Lin. Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society, Series B (Statistical Methodology), 68(1):49--67, 2006.

Cited By

  • (2024) Estimating a signal subspace in the presence of impulsive noise. Statistics and Computing, 35:1. DOI: 10.1007/s11222-024-10528-z. Online: 25-Nov-2024.
  • (2018) A sparse regression method for group-wise feature selection with false discovery rate control. IEEE/ACM Transactions on Computational Biology and Bioinformatics, 15(4):1066--1078. DOI: 10.1109/TCBB.2017.2780106. Online: 1-Jul-2018.
  • (2018) Group SLOPE -- adaptive selection of groups of predictors. Journal of the American Statistical Association, 114(525):419--433. DOI: 10.1080/01621459.2017.1411269. Online: 6-Aug-2018.
  • (2016) grpSLOPE: Group Sorted L1 Penalized Estimation. CRAN: Contributed Packages. DOI: 10.32614/CRAN.package.grpSLOPE. Online: 24-Apr-2016.


Published In

BCB '15: Proceedings of the 6th ACM Conference on Bioinformatics, Computational Biology and Health Informatics
September 2015, 683 pages
ISBN: 9781450338530
DOI: 10.1145/2808719
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. LASSO
  2. SLOPE
  3. false discovery rate
  4. group LASSO
  5. sparse regression


Acceptance Rates

BCB '15: 48 of 141 submissions accepted, 34%
Overall: 254 of 885 submissions accepted, 29%
