Importance of feature selection stability in the classifier evaluation on high-dimensional genetic data
Classifiers trained on high-dimensional data, such as genetic datasets, often encounter situations where the number of features exceeds the number of objects. In these cases, classifiers typically rely on a small subset of features. For a robust algorithm, this subset should remain relatively stable...
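A common way to quantify the stability the abstract refers to is to rerun feature selection on resampled versions of the data and compare the selected subsets. The sketch below is not the authors' code; it is a minimal illustration, assuming an L1-penalized logistic regression as the selector and mean pairwise Jaccard similarity as the stability score, on synthetic "wide" data where features outnumber objects.

```python
# Minimal sketch (not the authors' method): estimate feature selection
# stability as the mean pairwise Jaccard similarity of the feature subsets
# an L1-penalized classifier selects across bootstrap resamples.
from itertools import combinations

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

# Synthetic high-dimensional data: far more features than objects,
# as in typical genetic datasets.
X, y = make_classification(n_samples=60, n_features=500, n_informative=10,
                           random_state=0)

subsets = []
for seed in range(20):
    Xb, yb = resample(X, y, random_state=seed)           # bootstrap resample
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(Xb, yb)
    selected = frozenset(np.flatnonzero(clf.coef_[0]))   # indices of nonzero weights
    subsets.append(selected)

def jaccard(a, b):
    """Jaccard similarity of two feature index sets."""
    return len(a & b) / len(a | b) if (a or b) else 1.0

stability = np.mean([jaccard(a, b) for a, b in combinations(subsets, 2)])
print(f"mean pairwise Jaccard stability: {stability:.3f}")
```

A score near 1 means the classifier keeps picking essentially the same features across resamples; a score near 0 means the selected subset changes almost entirely from run to run, which is the instability the article warns about when evaluating classifiers on such data.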
| Main Authors: | Tomasz Łukaszuk, Jerzy Krawczuk |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | PeerJ Inc., 2024-11-01 |
| Series: | PeerJ |
| Subjects: | |
| Online Access: | https://peerj.com/articles/18405.pdf |
Similar Items
- Stability of Feature Selection in Multi-Omics Data Analysis
  by: Tomasz Łukaszuk, et al.
  Published: (2024-11-01)
- A Feature Selection Approach Based on Archimedes’ Optimization Algorithm for Optimal Data Classification
  by: Lahbib Khrissi, et al.
  Published: (2025-01-01)
- Applying a New Feature Selection Method for Accurate Prediction of Earthquakes Using a Soft Voting Classifier
  by: Oqbah Salim Atiyah, et al.
  Published: (2024-12-01)
- Feature selection based on Mahalanobis distance for early Parkinson disease classification
  by: Mustafa Noaman Kadhim, et al.
  Published: (2025-01-01)
- A feature selection method based on instance learning and cooperative subset search
  by: Xiaoyuan XU, et al.
  Published: (2017-06-01)