Tags: java, machine-learning, weka

Using Weights in Weka


In Weka you can assign a different weight to each instance. The weight of an instance affects any classifier that takes weights into account. If I set the weight of an instance to 0, that instance is effectively ignored. If I set the weight of an instance to 2, it is like oversampling it (including the same instance twice).
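
For reference, here is a minimal sketch of setting per-instance weights and training a weight-aware classifier in Weka. It assumes Weka 3.7+ (where Instance.setWeight is available); the class name, attributes, and toy values are made up for illustration:

```java
import java.util.ArrayList;

import weka.classifiers.bayes.NaiveBayes;
import weka.core.Attribute;
import weka.core.DenseInstance;
import weka.core.Instances;

public class WeightedInstancesDemo {
    public static void main(String[] args) throws Exception {
        // Two numeric attributes plus a nominal class {yes, no}.
        ArrayList<Attribute> attrs = new ArrayList<>();
        attrs.add(new Attribute("x1"));
        attrs.add(new Attribute("x2"));
        ArrayList<String> classValues = new ArrayList<>();
        classValues.add("yes");
        classValues.add("no");
        attrs.add(new Attribute("class", classValues));

        Instances data = new Instances("demo", attrs, 0);
        data.setClassIndex(data.numAttributes() - 1);

        // Weight 1.0: normal instance; 2.0: counts like a duplicate; 0.0: effectively ignored.
        data.add(makeInstance(data, 1.0, 2.0, "yes", 1.0));
        data.add(makeInstance(data, 1.1, 1.9, "yes", 2.0));
        data.add(makeInstance(data, 5.0, 5.5, "no", 0.0));
        data.add(makeInstance(data, 4.8, 5.2, "no", 1.0));

        // NaiveBayes implements WeightedInstancesHandler, so it uses these weights.
        NaiveBayes nb = new NaiveBayes();
        nb.buildClassifier(data);
        System.out.println(nb);
    }

    private static DenseInstance makeInstance(Instances header, double x1, double x2,
                                              String cls, double weight) {
        DenseInstance inst = new DenseInstance(header.numAttributes());
        inst.setDataset(header);
        inst.setValue(0, x1);
        inst.setValue(1, x2);
        inst.setValue(2, cls);
        inst.setWeight(weight);
        return inst;
    }
}
```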

My question is: what happens if I set a negative weight? As far as I checked, Java does not crash (I have only tried Naive Bayes). But what happens internally? Is the instance ignored, or does something else happen? How does this affect a classifier? And if it does have an effect, does it apply to all classifiers in the same way, or do different classifiers behave differently?


Solution

  • The impact of that instance on the classification will be negative: instead of contributing to the classifier assigning the instance's class, it contributes to the classifier not learning that class. Think of applying a negative weight to a value in any weighted operation, as in the toy calculation below. As nekomatic commented, each classifier that supports weights uses them in its own way.
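
To make the sign flip concrete, here is a toy calculation (plain Java, not Weka code) of the kind of weighted class count a weight-aware classifier accumulates; the weights and class are made up for illustration:

```java
public class NegativeWeightSketch {
    public static void main(String[] args) {
        // Hypothetical weights of three training instances that all belong to class "yes":
        // two normal instances and one instance given weight -1.
        double[] weights = {1.0, 1.0, -1.0};

        double countYes = 0.0;
        for (double w : weights) {
            countYes += w; // a negative weight subtracts from the class count
        }

        // Effective count is 1.0: the negatively weighted instance cancels out
        // one normal instance instead of being ignored (which weight 0 would do).
        System.out.println("effective weighted count for class yes: " + countYes);
    }
}
```

How such a negative contribution propagates into the final model (probability estimates, splits, coefficients, and so on) depends on the individual classifier, which is why the behavior is not guaranteed to be the same across classifiers.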