Neural Net (SOLVED)
Hi,
I have a question concerning the Neural Net in RapidMiner. Which weight initialization technique is used? Usually there is a certain randomness involved (e.g. "hard" range randomization or Nguyen-Widrow initialization), so the results aren't replicable. How does it work in RapidMiner? Here one gets the same results in every trial. I am asking because I need replicable results. Which technique is most recommendable for a complex error surface?
Thanks a lot in advance
Danyo
Answers
The Neural Net in RapidMiner is also randomized, but the same random numbers are used in each try, which makes it possible to create replicable results.
The set of random numbers used is defined by the random seed, which by default is initialized once per process execution. If you have only one Neural Net in the process (or more generally, only one operator that uses random numbers), it will always create the same results. If you have two, they will use different random numbers, which are however the same if you execute the process more than once.
To define the random seed for each Neural Net on its own, you can activate the option "use local random seed" in the Neural Net operator.
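The effect described above can be illustrated with a minimal sketch (in Python, purely for illustration; RapidMiner itself is implemented in Java, and the function name and seed value here are hypothetical): a fixed seed makes the "random" initial weights identical on every run.

```python
import random

def init_weights(n_weights, seed=42):
    # A seeded generator yields the same sequence of "random"
    # values on every run, so the initial weights -- and hence
    # the training results -- are replicable.
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(n_weights)]

# Two runs with the same seed produce identical initial weights.
first = init_weights(5)
second = init_weights(5)
assert first == second
```

Using a different seed (analogous to changing the "use local random seed" value) produces a different but equally reproducible set of weights.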
Best regards,
Marius
Thanks for the reply. But what is the underlying system of the "random seed", and is there a distribution assumption? The Nguyen-Widrow initialization is also based on randomness, but it still follows a kind of distribution.
Regards,
Daniel
unfortunately I can't tell you off-hand how exactly the Neural Net is implemented in RapidMiner. However, since RapidMiner is open source, you are free to have a look at the source code of the Neural Net. That's also what I would have to do to find out the implementation details.
The random generator configured by the random seed simply delivers uniformly distributed values between 0 and 1, but I don't know for sure how the Neural Net uses them.
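For comparison with the Nguyen-Widrow initialization mentioned in the question, here is a minimal sketch of that technique (not RapidMiner's implementation; the function name is illustrative): it also starts from uniform random values, but then rescales each hidden unit's weight vector to a norm derived from the layer sizes.

```python
import math
import random

def nguyen_widrow(n_inputs, n_hidden, seed=42):
    # Sketch of Nguyen-Widrow initialization for one hidden layer:
    # 1. draw uniform random weights,
    # 2. scale each hidden unit's weight vector so its Euclidean
    #    norm equals beta = 0.7 * n_hidden ** (1 / n_inputs).
    rng = random.Random(seed)
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)
    weights = []
    for _ in range(n_hidden):
        w = [rng.uniform(-0.5, 0.5) for _ in range(n_inputs)]
        norm = math.sqrt(sum(x * x for x in w))
        weights.append([beta * x / norm for x in w])
    return weights
```

So even though the draws are uniform, the rescaling imposes structure on the resulting weights, which is the "kind of distribution" the question refers to.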
Best regards,
Marius