KR 2024: Proceedings of the 21st International Conference on Principles of Knowledge Representation and Reasoning

Hanoi, Vietnam. November 2-8, 2024.

ISSN: 2334-1033
ISBN: 978-1-956792-05-8


Copyright © 2024 International Joint Conferences on Artificial Intelligence Organization

LAD-based Feature Selection for Optimal Decision Trees and Other Classifiers

  1. David Ing(CRIL - CNRS, Artois University, Lens)
  2. Said Jabbour(CRIL - CNRS, Artois University, Lens)
  3. Lakhdar Sais(CRIL - CNRS, Artois University, Lens)
  4. Fabien Delorme(CRIL - CNRS, Artois University, Lens)

Keywords

  1. ML for Reasoning-General

Abstract

The curse of dimensionality presents a significant challenge in data mining, pattern recognition, computer vision, and machine learning applications. Feature selection is a primary approach to address this challenge. It aims to eliminate irrelevant and redundant features while preserving the relevant ones, in order to reduce computation time, improve prediction performance, and enhance the understanding of data. In this paper, we introduce a new feature selection (FS) technique based on the Logical Analysis of Data (LAD), a pattern learning framework that combines optimization, Boolean functions, and combinatorial theory. One of its main objectives is to generate minimal support sets (subsets of features) that discriminate between different groups of data. To generate such subsets, we first reduce the complexity of the LAD optimization task by transforming it into the problem of enumerating minimal hitting sets in a hypergraph, for which efficient implementations exist. These feature subsets are then ranked based on a scoring method before selecting the highest-quality one. Moreover, we explore the relationship between optimal Decision Trees (DTs) and LAD-based FS, introducing a new optimality criterion, namely DTs involving a minimum number of features. Finally, we conduct comparative evaluations of the LAD-based approach against several state-of-the-art (SOTA) FS methods on benchmark datasets, including two-class binary datasets and numerical datasets with two and multiple classes. Experiments reveal that our approach is competitive with SOTA methods, selecting high-quality feature subsets that maintain or enhance the performance of DTs and other classifiers such as SVM, KNN, and Naive Bayes.
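The reduction mentioned in the abstract can be illustrated on a toy instance. In LAD, a support set is a feature subset on which every positive/negative example pair disagrees; such subsets are exactly the hitting sets of the hypergraph whose hyperedges are the per-pair difference sets. The following is a minimal sketch of that correspondence, assuming a tiny binary dataset; the function names and the brute-force enumeration are illustrative only and are not the authors' implementation (which relies on efficient hitting-set enumerators):

```python
from itertools import combinations

def difference_hypergraph(pos, neg):
    """One hyperedge per (positive, negative) pair: the set of
    feature indices on which the two binary examples disagree."""
    return {frozenset(i for i in range(len(p)) if p[i] != n[i])
            for p in pos for n in neg}

def minimal_hitting_sets(edges, n_features):
    """Brute-force enumeration of inclusion-minimal hitting sets.
    Iterating by increasing size guarantees minimality; this is
    only viable for tiny illustrative instances."""
    hits = []
    for k in range(1, n_features + 1):
        for cand in combinations(range(n_features), k):
            s = set(cand)
            if all(s & e for e in edges) and \
               not any(set(h) <= s for h in hits):
                hits.append(cand)
    return hits

# Toy binary dataset: 3 features, two classes.
pos = [(1, 0, 1), (1, 1, 0)]
neg = [(0, 0, 1), (0, 1, 0)]
edges = difference_hypergraph(pos, neg)
print(minimal_hitting_sets(edges, 3))  # -> [(0,)]: feature 0 alone separates the classes
```

Here the difference hypergraph has edges {0} and {0, 1, 2}, so the unique minimal hitting set is {0}: a single-feature support set that already discriminates the two classes, which is the kind of small, high-quality subset the LAD-based FS procedure then ranks and selects among.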