"Optimization of SVM"
Hi,
I have spent (or maybe wasted) a while on SVM optimization. The question is more or less the same as http://rapid-i.com/rapidforum/index.php/topic,1573.0.html from the Rapid-I forum.
The idea is:
1. Generate data (binomial label).
2. Transform the data.
3. Optimize the parameters with cross-validation, using AUC as the main criterion.
4. Plot the ROC curve for the best parameters and show the AUC.
But when I run the process, the prediction is always negative (the model classifies everything as negative). This really confuses me... Thanks in advance.
<?xml version="1.0" encoding="UTF-8" standalone="no" ?>
- <process version="5.0">
- <context>
<input />
<output />
<macros />
</context>
- <operator activated="true" class="process" expanded="true" name="Root">
<description />
<parameter key="logverbosity" value="3" />
<parameter key="random_seed" value="2001" />
<parameter key="send_mail" value="1" />
<parameter key="process_duration_for_mail" value="30" />
<parameter key="encoding" value="SYSTEM" />
- <process expanded="true" height="440" width="560">
- <operator activated="true" class="generate_data" expanded="true" height="60" name="TrainData" width="90" x="45" y="30">
<parameter key="target_function" value="random classification" />
<parameter key="number_examples" value="100" />
<parameter key="number_of_attributes" value="5" />
<parameter key="attributes_lower_bound" value="-10.0" />
<parameter key="attributes_upper_bound" value="10.0" />
<parameter key="use_local_random_seed" value="false" />
<parameter key="local_random_seed" value="1992" />
<parameter key="datamanagement" value="0" />
</operator>
- <operator activated="true" class="normalize" expanded="true" height="94" name="Ztransformation" width="90" x="180" y="30">
<parameter key="return_preprocessing_model" value="true" />
<parameter key="create_view" value="false" />
<parameter key="attribute_filter_type" value="0" />
<parameter key="attribute" value="" />
<parameter key="use_except_expression" value="false" />
<parameter key="value_type" value="0" />
<parameter key="use_value_type_exception" value="false" />
<parameter key="except_value_type" value="2" />
<parameter key="block_type" value="0" />
<parameter key="use_block_type_exception" value="false" />
<parameter key="except_block_type" value="2" />
<parameter key="invert_selection" value="false" />
<parameter key="include_special_attributes" value="false" />
<parameter key="method" value="0" />
<parameter key="min" value="0.0" />
<parameter key="max" value="1.0" />
</operator>
- <operator activated="true" class="remember" expanded="true" height="60" name="IOStorer" width="90" x="313" y="30">
<parameter key="name" value="data" />
<parameter key="io_object" value="ExampleSet" />
<parameter key="store_which" value="1" />
<parameter key="remove_from_process" value="false" />
</operator>
- <operator activated="true" class="optimize_parameters_grid" expanded="true" height="148" name="ParameterOptimization" width="90" x="313" y="120">
- <list key="parameters">
<parameter key="Training.gamma" value="[0.0;1024;16;quadratic]" />
<parameter key="Training.C" value="[0.0;1000;10;quadratic]" />
</list>
- <process expanded="true" height="410" width="542">
- <operator activated="true" class="x_validation" expanded="true" height="112" name="Validation" width="90" x="112" y="30">
<parameter key="create_complete_model" value="false" />
<parameter key="average_performances_only" value="true" />
<parameter key="leave_one_out" value="false" />
<parameter key="number_of_validations" value="10" />
<parameter key="sampling_type" value="shuffled sampling" />
<parameter key="use_local_random_seed" value="false" />
<parameter key="local_random_seed" value="1992" />
- <process expanded="true" height="428" width="255">
- <operator activated="true" class="support_vector_machine_libsvm" expanded="true" height="76" name="Training" width="90" x="82" y="30">
<parameter key="svm_type" value="0" />
<parameter key="kernel_type" value="2" />
<parameter key="degree" value="5" />
<parameter key="gamma" value="1024.0" />
<parameter key="coef0" value="0.0" />
<parameter key="C" value="1000.0" />
<parameter key="nu" value="0.5" />
<parameter key="cache_size" value="80" />
<parameter key="epsilon" value="0.01" />
<parameter key="p" value="0.1" />
<list key="class_weights" />
<parameter key="shrinking" value="true" />
<parameter key="calculate_confidences" value="false" />
<parameter key="confidence_for_multiclass" value="true" />
</operator>
<connect from_port="training" to_op="Training" to_port="training set" />
<connect from_op="Training" from_port="model" to_port="model" />
<portSpacing port="source_training" spacing="0" />
<portSpacing port="sink_model" spacing="0" />
<portSpacing port="sink_through 1" spacing="0" />
</process>
- <process expanded="true" height="428" width="279">
- <operator activated="true" class="apply_model" expanded="true" height="76" name="Test" width="90" x="45" y="30">
<list key="application_parameters" />
<parameter key="create_view" value="false" />
</operator>
- <operator activated="true" class="performance_binominal_classification" expanded="true" height="76" name="Performance (2)" width="90" x="112" y="210">
<parameter key="main_criterion" value="AUC" />
<parameter key="accuracy" value="true" />
<parameter key="classification_error" value="false" />
<parameter key="kappa" value="false" />
<parameter key="AUC (optimistic)" value="false" />
<parameter key="AUC" value="true" />
<parameter key="AUC (pessimistic)" value="false" />
<parameter key="precision" value="false" />
<parameter key="recall" value="false" />
<parameter key="lift" value="false" />
<parameter key="fallout" value="false" />
<parameter key="f_measure" value="false" />
<parameter key="false_positive" value="false" />
<parameter key="false_negative" value="false" />
<parameter key="true_positive" value="false" />
<parameter key="true_negative" value="false" />
<parameter key="sensitivity" value="false" />
<parameter key="specificity" value="false" />
<parameter key="youden" value="false" />
<parameter key="positive_predictive_value" value="false" />
<parameter key="negative_predictive_value" value="false" />
<parameter key="psep" value="false" />
<parameter key="skip_undefined_labels" value="true" />
<parameter key="use_example_weights" value="true" />
</operator>
<connect from_port="model" to_op="Test" to_port="model" />
<connect from_port="test set" to_op="Test" to_port="unlabelled data" />
<connect from_op="Test" from_port="labelled data" to_op="Performance (2)" to_port="labelled data" />
<connect from_op="Performance (2)" from_port="performance" to_port="averagable 1" />
<portSpacing port="source_model" spacing="0" />
<portSpacing port="source_test set" spacing="0" />
<portSpacing port="source_through 1" spacing="0" />
<portSpacing port="sink_averagable 1" spacing="0" />
<portSpacing port="sink_averagable 2" spacing="0" />
</process>
</operator>
- <operator activated="true" class="log" expanded="true" height="112" name="Log" width="90" x="313" y="120">
<parameter key="filename" value="paraopt.log" />
- <list key="log">
<parameter key="C" value="operator.Training.parameter.C" />
<parameter key="Gamma" value="operator.Training.parameter.gamma" />
<parameter key="absolute" value="operator.BinominalClassificationPerformance.value.performance" />
<parameter key="auc" value="operator.Final Performance.value.AUC" />
</list>
<parameter key="sorting_type" value="0" />
<parameter key="sorting_k" value="100" />
<parameter key="persistent" value="false" />
</operator>
<connect from_port="input 1" to_op="Validation" to_port="training" />
<connect from_port="input 2" to_op="Log" to_port="through 3" />
<connect from_op="Validation" from_port="training" to_op="Log" to_port="through 1" />
<connect from_op="Validation" from_port="averagable 1" to_op="Log" to_port="through 2" />
<connect from_op="Log" from_port="through 1" to_port="result 1" />
<connect from_op="Log" from_port="through 2" to_port="performance" />
<connect from_op="Log" from_port="through 3" to_port="result 2" />
<portSpacing port="source_input 1" spacing="0" />
<portSpacing port="source_input 2" spacing="0" />
<portSpacing port="source_input 3" spacing="0" />
<portSpacing port="sink_performance" spacing="0" />
<portSpacing port="sink_result 1" spacing="0" />
<portSpacing port="sink_result 2" spacing="0" />
<portSpacing port="sink_result 3" spacing="0" />
<portSpacing port="sink_result 4" spacing="0" />
</process>
</operator>
<connect from_op="TrainData" from_port="output" to_op="Ztransformation" to_port="example set input" />
<connect from_op="Ztransformation" from_port="example set output" to_op="IOStorer" to_port="store" />
<connect from_op="Ztransformation" from_port="preprocessing model" to_op="ParameterOptimization" to_port="input 2" />
<connect from_op="IOStorer" from_port="stored" to_op="ParameterOptimization" to_port="input 1" />
<connect from_op="ParameterOptimization" from_port="performance" to_port="result 1" />
<connect from_op="ParameterOptimization" from_port="parameter" to_port="result 2" />
<connect from_op="ParameterOptimization" from_port="result 1" to_port="result 3" />
<connect from_op="ParameterOptimization" from_port="result 2" to_port="result 4" />
<connect from_op="ParameterOptimization" from_port="result 3" to_port="result 5" />
<portSpacing port="source_input 1" spacing="0" />
<portSpacing port="sink_result 1" spacing="0" />
<portSpacing port="sink_result 2" spacing="0" />
<portSpacing port="sink_result 3" spacing="0" />
<portSpacing port="sink_result 4" spacing="0" />
<portSpacing port="sink_result 5" spacing="0" />
<portSpacing port="sink_result 6" spacing="0" />
</process>
</operator>
</process>
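For readers who want to try the same idea outside RapidMiner, here is a minimal scikit-learn sketch of the pipeline described above: random two-class data, z-transformation, a grid search over C and gamma evaluated with 10-fold cross-validation scored by AUC, and an out-of-fold ROC/AUC estimate for the best parameter set. scikit-learn, the data generator call, and the (logarithmic) parameter ranges are illustrative stand-ins, not part of the original RapidMiner process.

import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_predict
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2001)

# 1. generate data: 100 examples, 5 attributes in [-10, 10], purely random binary label
X = rng.uniform(-10.0, 10.0, size=(100, 5))
y = rng.integers(0, 2, size=100)

# 2. z-transformation + SVM in one pipeline, so the scaling is refit inside every CV fold
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("svm", SVC(kernel="rbf", probability=True)),
])

# 3. grid search over C and gamma, 10-fold cross-validation, AUC as the main criterion
param_grid = {
    "svm__C": np.logspace(-2, 3, 11),
    "svm__gamma": np.logspace(-4, 3, 15),
}
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=2001)
search = GridSearchCV(pipe, param_grid, scoring="roc_auc", cv=cv)
search.fit(X, y)
print("best parameters:", search.best_params_, "cross-validated AUC:", search.best_score_)

# 4. ROC of the best parameter set, estimated from out-of-fold predictions
probs = cross_val_predict(search.best_estimator_, X, y, cv=cv, method="predict_proba")[:, 1]
fpr, tpr, _ = roc_curve(y, probs)
print("out-of-fold AUC:", roc_auc_score(y, probs))

On this random label the cross-validated AUC will hover around 0.5 regardless of the parameters, which is the behavior discussed in the answers below.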
Answers
Please switch to the XML view of RapidMiner to see the process as text, then copy it to the clipboard and paste it here, since the Internet Explorer representation contains these "-" signs, which are not allowed in XML. I will then try to reproduce the behavior and check whether there is a bug.
Greetings,
Sebastian
That seems to me to deliver the best performance anyway? Or what do you want the algorithm to do if you feed it completely random data? Learn the random seed? Possible, but more complex...
Greetings,
Sebastian
I cannot upload the result pictures here, but the accuracy table is:

accuracy: 58.00% +/- 16.61%

                  true negative   true positive
pred. negative         58              42
pred. positive          0               0
class recall        100.00%           0.00%

I am really confused... thanks
It's really simple: the data you are trying to learn from is completely random. There is no statistical dependency between the attribute values and the label. Without such a dependency you cannot predict the label from the attribute values, because they are completely independent of the label value. So the best thing you can do is to always predict the most frequent class.
And that is exactly what the SVM does.
To get sexier results, change the target function of the data generator to something that does not contain "random" in its name.
Greetings,
Sebastian
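Sebastian's point can be checked with a small experiment. The sketch below (again using scikit-learn as a stand-in, with an assumed dependent label x1 + x2 > 0 added for contrast) compares the cross-validated AUC of an RBF SVM on a purely random label against a label that actually depends on the attributes: the random label stays near AUC 0.5, the dependent one is learnable.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2001)
X = rng.uniform(-10.0, 10.0, size=(100, 5))

y_random = rng.integers(0, 2, size=100)            # no dependency on X at all
y_dependent = (X[:, 0] + X[:, 1] > 0).astype(int)  # label determined by the attributes

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
for name, y in [("random label", y_random), ("dependent label", y_dependent)]:
    auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(name, "mean cross-validated AUC:", round(auc.mean(), 2))
# expected: roughly 0.5 (chance) for the random label, close to 1.0 for the dependent one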
What exactly is the problem with the current 5.0.006 version?
Greetings,
Sebastian
Just quickly jumping in: I think it is exactly the strength of the SVM not to fit the model to random data - this reduces the risk of overfitting. A neural net, for example, can easily be tuned to learn the random data (or to "memorize" it...), but this is exactly the reason why I prefer SVM over NN ;D
Just my 2c. Cheers,
Ingo
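Ingo's remark about memorization can also be illustrated. In the sketch below (scikit-learn again, with an arbitrary MLP configuration chosen purely for illustration) a flexible neural network typically reaches very high training accuracy on purely random labels while its cross-validated accuracy stays at chance; the SVM with its default regularization usually fits the training labels less closely, and neither model generalizes, because there is nothing to generalize.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(-10.0, 10.0, size=(100, 5))
y = rng.integers(0, 2, size=100)   # purely random labels, nothing to learn

nn = MLPClassifier(hidden_layer_sizes=(200,), max_iter=5000, random_state=0)
svm = SVC(kernel="rbf", C=1.0, gamma="scale")

for name, clf in [("neural net", nn), ("SVM", svm)]:
    train_acc = clf.fit(X, y).score(X, y)                # accuracy on the data it was fit on
    cv_acc = cross_val_score(clf, X, y, cv=10).mean()    # accuracy on held-out folds
    print(name, "training accuracy:", round(train_acc, 2), "CV accuracy:", round(cv_acc, 2))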