Publication Date

5-1-2023

Comments

Technical Report: UTEP-CS-23-21

Abstract

In machine learning -- and in data processing in general -- it is very important to select the proper number of features. If we select too few, we miss important information and do not get good results; if we select too many, we include many irrelevant features that only bring noise and thus also worsen the results. The usual method of selecting the proper number of features is to add features one by one until the quality stops improving and starts deteriorating. This method works, but it often takes too much time. In this paper, we propose faster -- even asymptotically optimal -- methods for solving the problem; a rough illustration of the general idea is sketched below.
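
The abstract does not spell out the paper's actual algorithms, so the following Python sketch is only an illustration under stated assumptions: we assume a black-box function quality(k) that trains a model on the k best-ranked features and returns a validation score, and we assume -- as the "improves, then deteriorates" description suggests -- that quality is a unimodal function of k. The first routine is the usual one-by-one method; the second is a ternary-search variant that, under the unimodality assumption, needs only O(log n) quality evaluations instead of up to n.

def select_sequentially(quality, n_features):
    """Usual method: add features one by one until quality starts to drop.
    Needs roughly k* evaluations, where k* is the optimal number of features."""
    best_k, best_q = 1, quality(1)
    for k in range(2, n_features + 1):
        q = quality(k)
        if q < best_q:          # quality started deteriorating: stop
            break
        best_k, best_q = k, q
    return best_k

def select_by_ternary_search(quality, n_features):
    """Faster alternative, assuming quality(k) is unimodal in k:
    ternary search needs only O(log n_features) evaluations."""
    lo, hi = 1, n_features
    while hi - lo > 2:
        m1 = lo + (hi - lo) // 3
        m2 = hi - (hi - lo) // 3
        if quality(m1) < quality(m2):
            lo = m1 + 1         # the maximum cannot lie at or below m1
        else:
            hi = m2 - 1         # the maximum cannot lie at or above m2
    # only a handful of candidates remain; check them directly
    return max(range(lo, hi + 1), key=quality)

Since the ternary-search loop may ask for the same k more than once, in practice one would cache the results of quality(k) (e.g., with functools.lru_cache), as each evaluation typically involves retraining a model.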
