Differential Leave-One-Out Cross-Validation for Feature Selection in Generalized Linear Dependence Models
Abstract
Recommendations
Feature selection via dependence maximization
We introduce a framework for feature selection based on dependence maximization between the selected features and the labels of an estimation problem, using the Hilbert-Schmidt Independence Criterion. The key idea is that good features should be highly ...
Feature Selection with a Linear Dependence Measure
Features which are linearly dependent on other features do not contribute toward pattern classification by linear techniques. In order to detect the linearly dependent features, a measure of linear dependence is proposed. This measure is used as an aid ...
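For intuition, the dependence-maximization idea described in the first recommendation above can be sketched as a small greedy HSIC-based selector. This is a generic illustration rather than that article's exact algorithm: the Gaussian RBF kernel, the biased HSIC estimator, and the forward-selection loop are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian RBF kernel matrix for the rows of X (assumed kernel choice)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    """Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    where H = I - (1/n) 11^T centers the kernel matrices."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def greedy_hsic_selection(X, y, num_features):
    """Forward selection: repeatedly add the feature whose inclusion
    maximizes HSIC between the selected columns and the labels."""
    L = rbf_kernel(y.reshape(-1, 1))
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(num_features):
        scores = [(hsic(rbf_kernel(X[:, selected + [j]]), L), j) for j in remaining]
        _, best_j = max(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10))
    y = X[:, 3] + 0.5 * X[:, 7] + 0.1 * rng.normal(size=100)
    print(greedy_hsic_selection(X, y, 2))  # indices of the two selected features
```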
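Similarly, a linear dependence measure in the spirit of the second recommendation can be approximated by checking how well each feature is predicted by a linear combination of the remaining features. The R²-based score below is an assumed stand-in for illustration, not the measure proposed in that article.

```python
import numpy as np

def linear_dependence_score(X, j):
    """R^2 of regressing feature j on all other features.
    A score near 1 means column j is (nearly) a linear combination of the rest."""
    y = X[:, j]
    Z = np.delete(X, j, axis=1)
    Z = np.column_stack([np.ones(len(y)), Z])  # add an intercept term
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    ss_res = np.sum((y - Z @ coef) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    X[:, 4] = 2.0 * X[:, 0] - X[:, 2]  # make column 4 linearly redundant
    print([round(linear_dependence_score(X, j), 3) for j in range(5)])
```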
Information
Publisher
Association for Computing Machinery
New York, NY, United States
Qualifiers
- Research-article
- Research
- Refereed limited