Automatic feature discovery

A comparative study between filter and wrapper feature selection techniques

Abstract

The curse of dimensionality is a common challenge in machine learning, and feature selection techniques are commonly employed to address it by selecting a subset of relevant features. However, no single approach is consistently superior at choosing the most significant subset of features. We conducted a comprehensive analysis comparing filter and wrapper techniques to guide future work in selecting the most appropriate method for specific circumstances. We quantified the performance of these techniques on a diverse collection of datasets, using simple decision trees, linear machine learning algorithms, and support vector machines to assess performance with varying percentages of features selected by the filter and wrapper techniques. The findings demonstrate that filter methods (Chi-Squared and ANOVA) outperform wrapper methods (Forward Selection and Backward Elimination) in terms of classification accuracy, regression root mean squared error, and runtime.
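
The sketch below illustrates the kind of comparison described in the abstract, not the study's actual pipeline: it contrasts the two filter techniques (Chi-Squared and ANOVA) with the two wrapper techniques (Forward Selection and Backward Elimination) on scikit-learn's built-in breast-cancer dataset, scoring each selected subset with a simple decision tree. The dataset, the 50% feature budget, and the evaluation function are illustrative assumptions.

```python
# Minimal sketch comparing filter and wrapper feature selection with scikit-learn.
# The dataset and the 50% feature budget are illustrative assumptions, not the
# study's actual experimental setup.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.feature_selection import (SelectPercentile, SequentialFeatureSelector,
                                        chi2, f_classif)
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def evaluate(selector, name):
    """Fit a selector, keep the chosen features, and score a simple decision tree."""
    selector.fit(X_train, y_train)
    clf = DecisionTreeClassifier(random_state=0)
    clf.fit(selector.transform(X_train), y_train)
    acc = accuracy_score(y_test, clf.predict(selector.transform(X_test)))
    print(f"{name:30s} accuracy = {acc:.3f}")

# Filter techniques: score each feature independently of any model, keep the top 50%.
evaluate(SelectPercentile(chi2, percentile=50), "Chi-Squared (filter)")
evaluate(SelectPercentile(f_classif, percentile=50), "ANOVA (filter)")

# Wrapper techniques: grow or shrink the subset by repeatedly refitting the model.
base = DecisionTreeClassifier(random_state=0)
evaluate(SequentialFeatureSelector(base, n_features_to_select=0.5, direction="forward"),
         "Forward Selection (wrapper)")
evaluate(SequentialFeatureSelector(base, n_features_to_select=0.5, direction="backward"),
         "Backward Elimination (wrapper)")
```

Because the wrapper selectors refit the estimator for every candidate feature at every step, their runtime grows much faster with the number of features than the single-pass filter scores, which is consistent with the runtime gap reported in the abstract.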
