
Which values can I use for the Optimize Parameters operator in a deep learning prediction model?

JunoSital Member Posts: 11 Learner II
edited December 2019 in Help
Hello,

I am currently implementing a performance test with the gradient boosted trees and the deep learning algorithm for a prediction model.
For the GBT I used the number of trees, the maximal depth, and the learning rate in the Optimize Parameters operator, but I have no idea which parameters I can use to optimize my deep learning model.

Thanks for your help, and a happy new year  :)
Alex

Best Answer

  • varunm1 Member Posts: 1,207 Unicorn
    Solution Accepted
    Hello @JunoSital

    Sorry for the delayed response. The option is available in "Expert Parameters" as "input dropout ratio". One thing I am not sure about is why the DL operator is forcing you to set a dropout ratio, as it is not mandatory. Try setting the input dropout ratios to 0.2 for each layer here.
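
    The RapidMiner Deep Learning operator is backed by H2O, so purely as an illustration, here is a rough sketch of the same dropout settings written against H2O's Python API rather than the operator dialog (the file name and column choices are hypothetical placeholders, not anything from this thread):

        import h2o
        from h2o.estimators import H2ODeepLearningEstimator

        h2o.init()
        # Hypothetical data set; replace the file and column choices with your own.
        train = h2o.import_file("my_training_data.csv")
        predictors = train.columns[:-1]
        response = train.columns[-1]

        dl = H2ODeepLearningEstimator(
            hidden=[50, 50],                    # two hidden layers of 50 nodes each
            activation="RectifierWithDropout",  # dropout only takes effect with a *WithDropout activation
            input_dropout_ratio=0.2,            # dropout applied to the input layer
            hidden_dropout_ratios=[0.2, 0.2],   # one ratio per hidden layer; length must match 'hidden'
            epochs=100,
        )
        dl.train(x=predictors, y=response, training_frame=train)

    The point to carry over to the operator is simply that every dropout-related value has to be consistent with the number of hidden layers you configured.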

    Regards,
    Varun
    https://www.varunmandalapu.com/

    Be Safe. Follow precautions and Maintain Social Distancing

Answers

  • varunm1 Member Posts: 1,207 Unicorn
    Hello @JunoSital

    There are many hyperparameters that can be tuned in a DL model; two important ones are the learning rate and the number of epochs, so you can tune these first. Then you can change the number of hidden nodes and hidden layers; unfortunately, Optimize Parameters doesn't support tuning these two, so you need to do it manually. Start with a simple network and then grow it to see how it performs.
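
    Since the operator is an H2O wrapper, a minimal sketch of what a grid over these two values looks like, written with H2O's Python grid-search API instead of the Optimize Parameters dialog (the file name and column choices are hypothetical placeholders), could be:

        import h2o
        from h2o.estimators import H2ODeepLearningEstimator
        from h2o.grid.grid_search import H2OGridSearch

        h2o.init()
        # Hypothetical data set; replace the file and column choices with your own.
        train = h2o.import_file("my_training_data.csv")
        predictors = train.columns[:-1]
        response = train.columns[-1]

        # Grid over learning rate and epochs; a fixed 'rate' only applies when adaptive_rate is off.
        hyper_params = {"rate": [0.01, 0.05, 0.1, 0.2], "epochs": [10, 100, 1000]}
        grid = H2OGridSearch(
            model=H2ODeepLearningEstimator(hidden=[50, 50], adaptive_rate=False),
            hyper_params=hyper_params,
        )
        grid.train(x=predictors, y=response, training_frame=train)

        # Rank the candidate models by mean squared error.
        print(grid.get_grid(sort_by="mse", decreasing=False))

    In RapidMiner this corresponds to selecting the matching Deep Learning parameters inside Optimize Parameters (Grid).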
    Regards,
    Varun
    https://www.varunmandalapu.com/

    Be Safe. Follow precautions and Maintain Social Distancing

  • JunoSital Member Posts: 11 Learner II
    Hello @varunm1,

    Thank you very much for your fast reply.
    I'll try this out right now and give you feedback.

    I wish you a happy new year.
    Alex
  • JunoSital Member Posts: 11 Learner II
    Hello @varunm1,

    I set the following parameters:

    learning rate: 0.01 to 0.2 in 19 Steps (optimize parameters)

    epochs: 10 to 1000 in 30 Steps (optimize parameters)

    hidden layer sizes: 50/50 (manually)

    After 20 minutes I received the following error:

    Model training error (H2O).
    Error while training the H2O model: Illegal argument(s) for DeepLearning model: ERRR on field: _hidden_dropout_ratios: Must have 2 hidden layer dropout ratios.

    Where can I set these hidden dropout ratios?

    Thank you for your help!
    Alex
  • varunm1 Member Posts: 1,207 Unicorn
    Hello @JunoSital

    I am away from my computer. If you have two layers, then you need to set a dropout value for each layer in the "hidden dropout ratios" option; you can select the same or different dropout ratios for each layer.

    I will check once I reach my PC. In the meantime, you can try setting two ratios in the "hidden dropout ratios" option in the Deep Learning operator's parameters.
    Regards,
    Varun
    https://www.varunmandalapu.com/

    Be Safe. Follow precautions and Maintain Social Distancing

  • JunoSital Member Posts: 11 Learner II
    Hello @varunm1,

    Thanks again for your help.
    In the Parameters section of the deep learning operator I can't find the "hidden dropout ratios" parameter.
    Here are the available settings:

    [screenshot of the operator's Parameters panel]

    The "hidden dropout ratios" parameter is also not found in the expert parameters.

    Kind regards
    Alex