
Apply Model Problem?

srt19170 Member Posts: 44 Contributor II
edited November 2018 in Help
I've run into some curious behavior in RapidMiner.

I have a process that trains a model, saves the model, and then tests the model on some test data.  I have another process that is identical, except that it skips the training step and simply reads in the saved model before applying it to the test data.

The two processes produce different results, i.e., the performance on identical test data differs depending on whether I use the model straight out of the modeling operator or the saved copy.

The model I'm using is a Vote model with three sub-models (a Neural Net, an SVM, and a W-SMOreg), if that's relevant.

Any idea why the behavior should be different?  Is there a loss of accuracy when writing and then reading the model?  The differences I'm seeing seem too dramatic for that to be the cause, but I suppose it's possible.
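
For what it's worth, here is the kind of round-trip check I have in mind, written as a minimal scikit-learn/pickle sketch rather than the actual RapidMiner process (the learners below are just hypothetical stand-ins for the Vote sub-models).  If writing and then reading the model were lossless, the two prediction sets should match exactly:

    # Minimal sketch (scikit-learn + pickle, not RapidMiner): apply a model
    # directly, then round-trip it through serialization, apply the reloaded
    # copy, and compare the two prediction sets.
    import pickle

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import VotingRegressor
    from sklearn.neural_network import MLPRegressor
    from sklearn.svm import SVR

    X_train, y_train = make_regression(n_samples=200, n_features=10, random_state=0)
    X_test, _ = make_regression(n_samples=50, n_features=10, random_state=1)

    # Rough analogue of a Vote model built from a neural net and two SVM-style learners.
    model = VotingRegressor([
        ("nn", MLPRegressor(max_iter=2000, random_state=0)),
        ("svm_rbf", SVR(kernel="rbf")),
        ("svm_lin", SVR(kernel="linear")),
    ])
    model.fit(X_train, y_train)
    pred_direct = model.predict(X_test)

    # The "write model / read model" step, reduced to an in-memory round trip.
    reloaded = pickle.loads(pickle.dumps(model))
    pred_reloaded = reloaded.predict(X_test)

    # Serialization alone should be lossless, so this is expected to print True.
    print("identical predictions:", np.array_equal(pred_direct, pred_reloaded))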

Thanks for any help!

Scott Turner

Answers

  • srt19170 Member Posts: 44 Contributor II
    Never mind.  The train-and-test process is failing to write out the model for some reason.  Annoyingly, it's failing silently.  There's nothing in the log file to indicate a problem.
  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi Scott,
    that's really annoying. Can you show me the process? I would like to fix that behavior.

    Greetings,
    Sebastian
  • Grimace Member Posts: 1 Learner III
    Hey, I have been seeing the same thing happening in my process.  I have a separate process which trains the neural network.  In the process where I am experiencing the issue, I load two CSV files which, other than an extra flag attribute tacked onto the end of each sample, are binary equal.  I remove the flag attribute before applying the model.  The same model is applied to both example sets, yet the results of the two Apply Model operators do not match (the sketch at the end of the thread shows the kind of comparison I mean).  I can email the process, models and data to assist.

    Any insight into what's causing this would be appreciated.
  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    thanks, I will send you my email address by PM.

    Greetings,
    Sebastian
  • Danyo83 Member Posts: 41 Contributor II
    Hi,

    I have exactly the same problem.  Besides writing and applying the model, which failed, I have also tried to rebuild the process by recalling the previously saved attribute weights and then applying them with a classifier.  The performance differed from that of the initial Feature Selection process on the test set, although the data was identical.

    Can you help me or can I send you my files?

    Thanks in advance

    Daniel
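
For reference, the comparison described in the earlier reply about the two CSV files, reduced to a minimal script-level sketch (scikit-learn with made-up data and a hypothetical flag column, not the actual RapidMiner process): applying one model to two example sets that are identical once the extra flag attribute is removed should produce identical predictions.

    # Hypothetical sketch: one trained model, two example sets that differ only
    # by an extra "flag" attribute, and an equality check on the predictions
    # after the flag is removed.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    features = pd.DataFrame(rng.normal(size=(100, 5)),
                            columns=[f"f{i}" for i in range(5)])
    target = features.sum(axis=1)

    model = Ridge().fit(features, target)

    set_a = features.copy()
    set_b = features.copy()
    set_b["flag"] = 1  # the extra attribute tacked onto each sample in the second file

    pred_a = model.predict(set_a)
    pred_b = model.predict(set_b.drop(columns=["flag"]))  # remove the flag before applying

    # With the flag removed, the two example sets are identical, so this should print True.
    print("identical predictions:", np.array_equal(pred_a, pred_b))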
