"Out of Memory Error"
Hi,
My data set has around 40,000 attributes, with 1,000 positive and 1,000 negative examples.
I am using the following operators:
Root
TextInput (TFIDF vector creation)
StringTokenizer
TokenLengthFilter
W-GainRatioAttributeEval
AttributeWeightSelection
XValidation
LibSVMLearner
OperatorChain
ModelApplier
Performance
I am using the RapidMiner GUI. I have 2 GB of RAM, and my OS is Windows Vista.
I get a heap space error at the W-GainRatioAttributeEval operator.
Is there any way to solve this problem?
Answers
Yes, of course there is. In fact, there are several solutions. The simplest one: buy more RAM and switch to a 64-bit system. But I think you are aware of that already.
So, two ways out:
- Replace the Weka operator with the RapidMiner equivalent "InfoGainRatioWeighting". This should save you the memory that is otherwise needed to duplicate the data.
- Did you check the memory monitor? How much RAM does your RapidMiner installation actually consume? Depending on how you invoke RapidMiner, there may be reasons why Java does not reserve enough RAM; see the example below.
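For example, if you start RapidMiner from the command line, the standard JVM option -Xmx sets the maximum heap size. The memory value and the jar path below are only placeholders for a typical installation, so adjust them to your setup:

java -Xmx1500m -jar lib/rapidminer.jar

With 2 GB of physical RAM you cannot go much higher than about 1.5 GB for the heap anyway, which is why the 64-bit remark above still applies.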
Greetings,
Sebastian