Cannot reset network to smaller learning rate - Optimization parameter
olafansau55
Member Posts: 5 Learner I
Hello Guys
I'm working on a final project using RapidMiner on COVID-19 data. I ran a parameter optimization process with the learning rate in the range 0.00 - 0.99 with a step count of 10, and training_cycles in the range 1 - 200, also with a step count of 10. Initially there was no problem with those values, but when I changed the number of steps for training_cycles to 200, I got the error "Cannot reset the network to a smaller learning rate". I want to ask why this happened and how to solve it. If anyone can help out, I would really appreciate it.
Thank you~
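For reference, here is a minimal sketch (plain Python, not a RapidMiner process) of the learning-rate grid that the settings described above would likely produce, assuming Optimize Parameters (Grid) spaces the candidate values linearly between the minimum and the maximum. Note that the very first candidate value is 0.0.

```python
# Hypothetical illustration of a linear parameter grid like the one described above:
# learning rate from 0.00 to 0.99 with a step count of 10.
def linear_grid(minimum, maximum, steps):
    """Return steps + 1 evenly spaced values from minimum to maximum (inclusive)."""
    return [minimum + i * (maximum - minimum) / steps for i in range(steps + 1)]

learning_rates = linear_grid(0.00, 0.99, 10)
print(learning_rates)  # [0.0, 0.099, 0.198, ..., 0.891, 0.99] -- the first candidate is 0.0
```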
Best Answer
SabaRG
Member Posts: 13 Contributor II
Dear @olafansau55
Unfortunately, I don't know why the "Try" operator, which I would normally suggest for ignoring this kind of error, does not work with the Deep Learning operator, and the ignore-error option of the optimization operator cannot handle it either. It is a bug with the Deep Learning (H2O) operator.
But for your case, I suggest starting the learning rate from 0.1 or 0.2 and checking that the error no longer occurs. Your problem is with the learning rate.
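A minimal sketch of the suggested adjustment, again plain Python rather than a RapidMiner process: raise the lower bound of the learning-rate range to 0.1 so that no candidate value is 0.0.

```python
# Hypothetical illustration of the suggested fix: start the learning-rate range at 0.1
# (or 0.2) instead of 0.00, so the grid never contains a learning rate of 0.0.
def linear_grid(minimum, maximum, steps):
    """Return steps + 1 evenly spaced values from minimum to maximum (inclusive)."""
    return [minimum + i * (maximum - minimum) / steps for i in range(steps + 1)]

safe_learning_rates = linear_grid(0.1, 0.99, 10)
print(safe_learning_rates)  # begins at 0.1, so the network is never asked to train with a 0.0 learning rate
```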
Sincerely
#BugReport1
Answers