Parameter Optimizer Problem
Hi There,
I am using an evolutionary parameter optimizer to determine the best parameters for my neural network. The neural network is embedded in a sliding-window X-Validation with cumulative learning. After the process is finished I get the parameter set and a performance of 67.33%. When I actually apply those parameters I only get 65.85%. How is that possible? Isn't that supposed to be the same?
Regards
Hagen
Answers
It is hard to tell anything without knowing your process (please see http://rapid-i.com/rapidforum/index.php/topic,4654.0.html ). However, yes, it is possible to get different results, since the experimental performance is always only an approximation of the true performance. So depending on your setup, it is perfectly possible to see (usually small) differences in the performance.
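Just to illustrate the effect outside of RapidMiner: here is a minimal Python sketch (scikit-learn, not your actual sliding-window setup) showing that the score a parameter search reports and the score you get when re-evaluating the winning parameters on different folds are two slightly different estimates of the same true performance. The dataset, model and parameter grid are hypothetical stand-ins; also note that the search reports the maximum over several noisy estimates, which tends to be a little optimistic.

```python
# Sketch only: why the optimizer's reported score can differ from the score
# obtained when the winning parameters are applied again. All names below
# (dataset, grid, fold seeds) are illustrative, not from the original process.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
param_grid = {"hidden_layer_sizes": [(5,), (10,), (20,)]}

# The parameter search evaluates each candidate on one particular set of folds...
search_cv = KFold(n_splits=5, shuffle=True, random_state=1)
search = GridSearchCV(MLPClassifier(max_iter=500, random_state=0),
                      param_grid, cv=search_cv)
search.fit(X, y)
print("score reported by the optimizer:", search.best_score_)

# ...but re-evaluating the winning parameters on different folds gives a
# slightly different estimate of the same underlying performance.
apply_cv = KFold(n_splits=5, shuffle=True, random_state=2)
rerun = cross_val_score(
    MLPClassifier(max_iter=500, random_state=0, **search.best_params_),
    X, y, cv=apply_cv)
print("score when applying the parameters:", rerun.mean())
```

Running this typically prints two close but not identical numbers, which is the same kind of gap you are seeing between 67.33% and 65.85%.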
Best, Marius