
RapidMiner Neural Network

HarumpelHarumpel Member Posts: 10 Contributor II
edited November 2018 in Help
Hello,

which learning enhancements does the RapidMiner Neural Network operator implement to speed up learning and improve generalization?

Kind Regards
Theo

Answers

  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi Theo,
    what do you mean by learning enhancements? Sorry, but I'm not too familiar with the slang around Neural Networks.

    Greetings,
      Sebastian
  • HarumpelHarumpel Member Posts: 10 Contributor II
    Hi Sebastian,

    As far as I know, the basic learning algorithm in neural networks is backpropagation of the squared error, adjusting each weight according to the gradient dE/dw. This so-called "naive backpropagation" is quite inefficient, since in many cases it takes very long to converge to an optimal configuration.

    Now my question is how exactly (backpropagation) learning is implemented in RapidMiner. I guess I have to take a look into the Joone documentation...

    Kind Regards
    Theo
  • landland RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi Theo,
    either that, or I could ask Ingo, who implemented the operator. But he's currently at CeBIT, so that's not an option for now.

    Greetings,
      Sebastian
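For reference, the "naive backpropagation" Theo describes, extended with a momentum term (one of the most common learning enhancements for speeding up convergence), can be sketched as below. This is only an illustration of the general technique, not RapidMiner's or Joone's actual implementation; the network size, learning rate, and momentum value are arbitrary choices for the example.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(data, epochs=2000, lr=0.3, momentum=0.5):
    """Online backpropagation for a 2-2-1 sigmoid network with a momentum term."""
    random.seed(42)
    # weight rows include a trailing bias weight
    w_h = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
    w_o = [random.uniform(-0.5, 0.5) for _ in range(3)]
    v_h = [[0.0] * 3 for _ in range(2)]  # previous weight changes (momentum)
    v_o = [0.0] * 3
    for _ in range(epochs):
        for x, t in data:
            xi = x + [1.0]  # append bias input
            h = [sigmoid(sum(w * a for w, a in zip(row, xi))) for row in w_h]
            hi = h + [1.0]
            y = sigmoid(sum(w * a for w, a in zip(w_o, hi)))
            # deltas: dE/dnet for the squared error E = (t - y)^2 / 2
            d_o = (y - t) * y * (1.0 - y)
            d_h = [d_o * w_o[j] * h[j] * (1.0 - h[j]) for j in range(2)]
            # naive rule is w -= lr * dE/dw; momentum adds a fraction
            # of the previous step to smooth and accelerate convergence
            for j in range(3):
                v_o[j] = -lr * d_o * hi[j] + momentum * v_o[j]
                w_o[j] += v_o[j]
            for j in range(2):
                for k in range(3):
                    v_h[j][k] = -lr * d_h[j] * xi[k] + momentum * v_h[j][k]
                    w_h[j][k] += v_h[j][k]
    def predict(x):
        xi = x + [1.0]
        hi = [sigmoid(sum(w * a for w, a in zip(row, xi))) for row in w_h] + [1.0]
        return sigmoid(sum(w * a for w, a in zip(w_o, hi)))
    return predict

# learn logical AND as a toy target
predict = train([([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)])
```

With momentum set to 0, the loop reduces to exactly the naive dE/dw update from the discussion; the momentum factor reuses the previous weight change so that consistent gradients accumulate speed while oscillating ones cancel out.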