[SOLVED] attribute selection
I am doing attribute selection by backward elimination as described in some RapidMiner tutorials, e.g. on YouTube. It works fine, but I would like to know whether I can improve it further with some changes to my process.
Question: Would it be better to nest an optimization of the learner's parameters in this procedure, or is it correct to just set the learner inside (I am using LibSVM) to some default values (or values which I think could be quite good)?
I mean the optimization of the learner that is nested in the Validation operator (and the Validation is nested in the Backward Elimination).
Thanks,
Milan
Answers
Of course it is valid to optimize the parameters of the learner inside the feature selection. That way you get the most out of your data. Instead of setting some manually chosen "good" parameters, you might even consider placing a complete parameter optimization operator inside the feature selection, since the optimal parameters may change for different feature sets. Of course, that will drastically increase the running time of your process.
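As a concrete illustration of that nested setup, here is a minimal sketch in Python with scikit-learn (not RapidMiner itself, where you would build this in the GUI): a cross-validated grid search over the SVM parameters sits inside backward feature elimination, so the parameters are re-tuned for every candidate feature set. The dataset, parameter grid, and target feature count are arbitrary placeholders, not values from this thread.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Placeholder dataset; substitute your own data.
X, y = load_breast_cancer(return_X_y=True)

# Inner loop: cross-validated grid search over the SVM parameters
# (the "parameter optimization operator" nested inside the selection).
inner = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.001]},
    cv=3,
)

# Outer loop: backward elimination; every candidate feature subset is
# scored with a freshly re-optimized SVM from the inner grid search.
selector = SequentialFeatureSelector(
    inner, direction="backward", n_features_to_select=10, cv=3
)
selector.fit(X, y)
print("Selected feature indices:", selector.get_support(indices=True))
```

Setting direction="forward" in the selector gives the forward-selection variant mentioned below.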
Btw, you could also try Forward Selection - it is usually faster, since it starts with an empty feature set.
Best regards,
Marius