Sequential floating search
Hey,
Pure greedy approaches to feature selection have the disadvantage that once a feature is chosen or eliminated, that decision cannot be revoked, even if revoking it would improve the learner. Floating methods address this: for example, one could first pick a good starting subset via random search (or a GA-based search) over, say, 100 iterations, and then alternate sequential forward selection and backward elimination, repeating both until there is no further improvement.
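The alternating forward/backward idea described above is essentially sequential floating forward selection (SFFS). A minimal sketch, assuming a user-supplied `score` function (e.g. cross-validated accuracy of the learner on a feature subset) — the names `sffs`, `score`, and the toy scoring function below are placeholders, not RapidMiner operators:

```python
def sffs(features, score, max_size):
    """Sequential floating forward selection (sketch).

    `features` is an iterable of feature identifiers; `score` maps a
    feature subset to a quality estimate (higher is better), e.g. a
    cross-validated accuracy. Both are assumptions for illustration.
    """
    selected = set()
    # Best subset found so far for each subset size.
    best_of_size = {}
    while len(selected) < max_size:
        # Forward step: greedily add the single best remaining feature.
        cand = max((f for f in features if f not in selected),
                   key=lambda f: score(selected | {f}))
        selected.add(cand)
        s = score(selected)
        if s > best_of_size.get(len(selected), (None, float("-inf")))[1]:
            best_of_size[len(selected)] = (frozenset(selected), s)
        # Floating backward steps: revoke earlier choices as long as
        # removing a feature strictly improves the best known score
        # for the smaller subset size.
        while len(selected) > 1:
            worst = max(selected, key=lambda f: score(selected - {f}))
            s_minus = score(selected - {worst})
            if s_minus > best_of_size.get(len(selected) - 1,
                                          (None, float("-inf")))[1]:
                selected.remove(worst)
                best_of_size[len(selected)] = (frozenset(selected), s_minus)
            else:
                break
    # Return the best subset over all sizes visited.
    return max(best_of_size.values(), key=lambda t: t[1])[0]
```

On a toy score that rewards features 1 and 3 and penalizes the rest, the search recovers the subset {1, 3} even when allowed up to three features, because the final answer is the best subset over all sizes visited, not just the last one grown.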
Would be a great tool for RM.
Daniel
Answers
Also, isn't there an entire Feature Selection extension? And the P-Rules extension with different selection operators?