
Confidence or Prediction Intervals

npapan69 Member Posts: 17 Maven
edited July 2019 in Help
Dear All,
When reporting a performance metric (e.g., AUC) for a model that was trained on a single data set and tested on a hold-out set, what is the proper way to assess its variance: calculating a confidence interval for the AUC, or a prediction interval?
Many thanks
Nikos

Answers

  • varunm1 Member Posts: 1,207 Unicorn
    Hello @npapan69

    This is a bit tricky, as both intervals are centered on the same value, but a prediction interval is wider than a confidence interval. So if you want a more conservative bound on how the model might perform on a new run or a new case, you can go with prediction intervals (see the sketch at the end of this post).
    Regards,
    Varun
    https://www.varunmandalapu.com/

    Be Safe. Follow precautions and Maintain Social Distancing
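
    To make the difference concrete, here is a minimal sketch in Python (the fold AUCs are made-up numbers, purely for illustration) showing that both intervals are centered on the same mean while the prediction interval is wider:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical AUCs from 10 cross-validation folds (illustrative values only)
    aucs = np.array([0.81, 0.84, 0.79, 0.86, 0.82, 0.80, 0.85, 0.83, 0.78, 0.84])

    n = len(aucs)
    mean, sd = aucs.mean(), aucs.std(ddof=1)
    t = stats.t.ppf(0.975, df=n - 1)  # two-sided 95% t critical value

    # 95% confidence interval for the mean AUC: mean +/- t * s / sqrt(n)
    ci = (mean - t * sd / np.sqrt(n), mean + t * sd / np.sqrt(n))

    # 95% prediction interval for the AUC of one new run: mean +/- t * s * sqrt(1 + 1/n)
    pi = (mean - t * sd * np.sqrt(1 + 1 / n), mean + t * sd * np.sqrt(1 + 1 / n))

    print(f"mean AUC = {mean:.3f}")
    print(f"95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")  # narrower
    print(f"95% PI = ({pi[0]:.3f}, {pi[1]:.3f})")  # wider, same center
    ```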

  • npapan69 Member Posts: 17 Maven
    Thanks @varunm1 for your answer. Is the prediction interval provided by any of the RapidMiner operators?
  • varunm1 Member Posts: 1,207 Unicorn
    Hello @npapan69

    I checked, and I see no built-in option in RM for that. I think it is a bit complicated to calculate inside RapidMiner, though it can be done with a bootstrap in a script (see the sketch at the end of this post).

    @mschmitz or @IngoRM, any comments on this?

    Thanks
    Regards,
    Varun
    https://www.varunmandalapu.com/

    Be Safe. Follow precautions and Maintain Social Distancing
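
    Since there is no built-in operator, a common workaround is a percentile bootstrap over the hold-out set, for example in Python (this could also be run via RapidMiner's Python scripting / Execute Python operator; the function name and the y_test / p_test variables below are hypothetical):

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    def bootstrap_auc_ci(y_true, y_score, n_boot=2000, alpha=0.05, seed=42):
        """Percentile-bootstrap confidence interval for AUC on a hold-out set."""
        rng = np.random.default_rng(seed)
        y_true, y_score = np.asarray(y_true), np.asarray(y_score)
        n = len(y_true)
        aucs = []
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)          # resample rows with replacement
            if len(np.unique(y_true[idx])) < 2:  # skip resamples with only one class
                continue
            aucs.append(roc_auc_score(y_true[idx], y_score[idx]))
        lo, hi = np.percentile(aucs, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        return roc_auc_score(y_true, y_score), (lo, hi)

    # Hypothetical usage: y_test = hold-out labels, p_test = predicted scores
    # auc, (lo, hi) = bootstrap_auc_ci(y_test, p_test)
    # print(f"AUC = {auc:.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
    ```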
