View Full Version : SVM Support Vector Results rather than attribute weights

12-19-2007, 02:14 AM

I am currently trying to build an SVM model. However, the output I retrieve gives me attribute weights rather than the individual support vectors.

Please advise how I can change this setting in Weka. I have tried different kernels, yet the result is the same.

Another question: given that I have an unbalanced data set with two class labels, can anyone explain how I could adjust the weights of the two classes when using SVM?
For example, using weather.arff, 80% of my instances are biased towards Play=Yes and the rest are No. How could I adjust these weights?

Please help. Thanks.


Just edited: for the cost, I assume you mean misclassifying the minority class, as in:

0 5   Yes
5 0   No

where my No class has few instances. However, after I resized the matrix, the cost matrix I see in the Classifier window while executing is

0 5
1 0

Why does this happen?

Please advise. Thanks.


12-19-2007, 04:51 AM
You would need to alter the code to output the actual instances that are support vectors. As for your unbalanced class problem - you could, assuming you want to balance the classes, use cost-sensitive learning and increase the cost of misclassifying the minority class (use weka.classifiers.meta.CostSensitiveClassifier). Another approach would be to resample the data and bias the sample towards a uniform class distribution.
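[Editor's note] To illustrate the cost-sensitive idea above, here is a minimal, self-contained sketch (not Weka's actual code; names are illustrative) of how a cost-sensitive classifier picks a label: rather than predicting the most probable class, it predicts the class with the lowest expected cost, so raising the cost of misclassifying the minority class shifts predictions towards it.

```java
// Sketch of minimum-expected-cost prediction, the principle behind
// cost-sensitive classification. Not Weka code; for illustration only.
public class CostSensitiveSketch {
    // cost[i][j]: cost of predicting class j when the true class is i
    static int minExpectedCostClass(double[] probs, double[][] cost) {
        int best = 0;
        double bestCost = Double.MAX_VALUE;
        for (int j = 0; j < cost[0].length; j++) {   // candidate prediction
            double expected = 0.0;
            for (int i = 0; i < cost.length; i++) {  // possible true classes
                expected += probs[i] * cost[i][j];
            }
            if (expected < bestCost) { bestCost = expected; best = j; }
        }
        return best;
    }

    public static void main(String[] args) {
        // 80/20 skew as in the weather.arff example: P(Yes)=0.8, P(No)=0.2
        double[] probs = {0.8, 0.2};
        // Equal costs: the majority class (index 0, "Yes") is predicted
        double[][] equal  = {{0, 1}, {1, 0}};
        // Misclassifying the minority class costs 5x: "No" (index 1) wins
        double[][] skewed = {{0, 1}, {5, 0}};
        System.out.println(minExpectedCostClass(probs, equal));   // 0
        System.out.println(minExpectedCostClass(probs, skewed));  // 1
    }
}
```

With equal costs the expected cost of predicting Yes is 0.2 versus 0.8 for No, so the skew wins; with the 5x penalty the expected cost of predicting Yes rises to 1.0 and No (0.8) is chosen instead.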


12-19-2007, 07:45 PM
Hi Mark,

Thanks for the reply.

Do you mean there is no setting in Weka to output the support vectors for each instance? I was under the assumption that there would be an option to display the support vectors instead.
Do you have any hints on where to change this in Classifier.SMO?

And regarding the cost, I assume you mean More options -> Cost-sensitive evaluation. If so, I would like to know how to interpret the table after changing the Cost Matrix, and which values are the defaults.
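[Editor's note] On interpreting the table: a minimal sketch, assuming the usual convention that rows of the cost matrix index the actual class and columns the predicted class. Under that convention, cost-sensitive evaluation totals the cost by multiplying each confusion-matrix cell by the matching cost cell and summing (names below are illustrative, not Weka's API).

```java
// Sketch of how a cost matrix is combined with a confusion matrix:
// total cost = sum over cells of (count of that error) * (its cost).
// Illustrative only; not Weka's actual implementation.
public class CostMatrixSketch {
    static double totalCost(int[][] confusion, double[][] cost) {
        double total = 0.0;
        for (int i = 0; i < confusion.length; i++)       // actual class
            for (int j = 0; j < confusion[i].length; j++) // predicted class
                total += confusion[i][j] * cost[i][j];
        return total;
    }

    public static void main(String[] args) {
        // Rows/columns = {Yes, No}. Hypothetical results:
        // 1 "Yes" instance predicted as "No", 2 "No" predicted as "Yes".
        int[][] confusion = {{9, 1}, {2, 2}};
        // Misclassifying the minority "No" class costs 5, the reverse costs 1
        double[][] cost = {{0, 1}, {5, 0}};
        System.out.println(totalCost(confusion, cost)); // 1*1 + 2*5 = 11.0
    }
}
```

Correct predictions sit on the diagonal, where the cost is 0, so only the off-diagonal errors contribute to the total.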

Hope to hear from you soon. Thanks once again for your help.