"Neural Net normalization"
chaosbringer
Member Posts: 21 Contributor II
Hi,
I am doing a regression with a neural net. My label has values between 0 and 1.
When I train a neural net and apply the model to some data, the predictions range between -1 and 1, even though I have the normalization option of the Neural Net operator activated.
Values smaller than 0 are not valid for my model, because 0 already represents the lowest acceptable value.
Is it OK to "cut" all values below 0 to 0?
Would it not be beneficial if this could already be done somehow during the training process, since the cut would affect the performance measurement during validation and thus the trained model?
I hope I could make my point clear... Thank you very much
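For illustration, here is a minimal sketch of the "cut" idea outside RapidMiner, using scikit-learn's MLPRegressor and synthetic data (the data, network size, and split are assumptions, not the original setup). It shows predictions being clipped to the valid [0, 1] range after applying the model, and the performance measure being computed on the clipped values so it reflects what the model would actually output in use:

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Hypothetical example set: 4 attributes, label already lies in [0, 1]
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 4))
y = np.clip(X[:, 0] * 0.5 + 0.5, 0.0, 1.0)

# Small feed-forward regressor as a stand-in for the Neural Net operator
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])

raw_pred = model.predict(X[400:])          # may fall outside [0, 1]
clipped = np.clip(raw_pred, 0.0, 1.0)      # "cut" invalid values back into the valid range

# Evaluate on the clipped predictions so validation measures the deployed behaviour
print("RMSE raw:    ", mean_squared_error(y[400:], raw_pred) ** 0.5)
print("RMSE clipped:", mean_squared_error(y[400:], clipped) ** 0.5)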
Answers
Denis