Hitachi Vantara Pentaho Community Forums

Thread: I'm New to Weka, How SMO works?

  1. #1
    Join Date
    Aug 2016

    Question I'm New to Weka, How SMO works?

    Hello Everyone.

    I'm NettoJM, I'm Brazilian and new to data mining, and I don't have much mathematics or programming knowledge, so I'm having a hard time trying to understand how it all works.

    I'm doing some tests on a small educational dataset, trying to clarify how, and how much, the attributes influence the class attribute (IDEB).

    Here is a pic of the data (I already converted to arff files):

    [Attachment: sample.jpg]

    This is just simulated data; we are still collecting the real data, which will have three or four times as many attributes and many more instances. We are planning to use only numeric attributes. Which algorithm should we try?

    I got good accuracy with SMO (SVM) in some tests, but I can't understand the SMO output. Can you guys help me with that too? I just want to understand how SMO works and what its output means. I ran a test with one of the datasets provided with Weka, weather.numeric.arff, and got results like this:

    [Attachment: sample 2.jpg]

    Can someone explain to me in detail what this output means? Pardon any grammatical errors; I am Brazilian, so my English is not perfect.

    Thank you.

  2. #2
    Join Date
    Aug 2006


    By default (unless the kernel option is altered), SMO learns a linear model. In the case of classification with multiple input variables, this is actually a separating hyperplane - that is, it attempts to find a dividing plane such that examples of one class fall on one side and examples of the other class fall on the other side. With support vector machines, this hyperplane is something called the maximum margin hyperplane, which means it sits equally distant from something called the support vectors of each class :-) Anyhow, that is what the model is showing, expressed in terms of coefficients (weights) applied to the original input variables (albeit normalised to the 0-1 range).
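    To make that concrete, here is a minimal plain-Java sketch (not the Weka API) of how such a linear model classifies an instance. The weights, offset, and class names are made-up numbers standing in for the coefficients SMO prints; the sign of the weighted sum tells you which side of the hyperplane an instance falls on:

    ```java
    public class LinearSvmSketch {
        // Hypothetical coefficients, one per (normalised) input attribute,
        // plus a constant offset, as SMO would print them.
        static final double[] WEIGHTS = {0.8, -0.5, 0.3};
        static final double OFFSET = -0.2;

        // Normalise a raw attribute value to the 0-1 range using the
        // minimum and maximum seen in the training data.
        static double normalise(double x, double min, double max) {
            return (x - min) / (max - min);
        }

        // The decision function: a weighted sum of the inputs plus the
        // offset. Its sign says which side of the hyperplane we are on.
        static double decision(double[] x) {
            double sum = OFFSET;
            for (int i = 0; i < WEIGHTS.length; i++) {
                sum += WEIGHTS[i] * x[i];
            }
            return sum;
        }

        static String classify(double[] x) {
            return decision(x) >= 0 ? "classA" : "classB";
        }

        public static void main(String[] args) {
            double[] instance = {0.9, 0.1, 0.5}; // already normalised to 0-1
            System.out.println(classify(instance)); // prints classA
        }
    }
    ```

    The number next to each attribute in SMO's output plays the role of one entry in WEIGHTS here, and the trailing constant plays the role of OFFSET: a large positive or negative weight means that attribute pushes the decision strongly toward one class.
    
    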

    You could also try another popular linear classifier - logistic regression (weka.classifiers.functions.Logistic).
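    Logistic regression builds the same kind of weighted sum but passes it through the sigmoid function to get a class probability instead of a raw margin. A minimal sketch, again with made-up weights rather than anything a trained model produced:

    ```java
    public class LogisticSketch {
        // Hypothetical coefficients and intercept for illustration only.
        static final double[] WEIGHTS = {0.8, -0.5, 0.3};
        static final double INTERCEPT = -0.2;

        // The sigmoid 1 / (1 + e^-z) squashes the weighted sum into
        // a probability between 0 and 1 for the first class.
        static double probability(double[] x) {
            double z = INTERCEPT;
            for (int i = 0; i < WEIGHTS.length; i++) {
                z += WEIGHTS[i] * x[i];
            }
            return 1.0 / (1.0 + Math.exp(-z));
        }

        public static void main(String[] args) {
            double p = probability(new double[]{0.9, 0.1, 0.5});
            System.out.println(p > 0.5 ? "classA" : "classB");
        }
    }
    ```

    This is why the two often give similar accuracy on the same data: both draw a linear boundary, but logistic regression additionally tells you how confident it is in each prediction.
    
    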



Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.