weka.classifiers.functions
Java Source File Name | Type | Comment |
GaussianProcesses.java | Class |
Implements Gaussian Processes for regression without hyperparameter tuning. |
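For example, a minimal regression run with GaussianProcesses might look as follows; this is a sketch, and the ARFF path and the noise value are illustrative placeholders, not part of the class documentation:

import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.GaussianProcesses;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class GaussianProcessesSketch {
    public static void main(String[] args) throws Exception {
        // Load a numeric-class dataset (placeholder path).
        Instances data = DataSource.read("cpu.arff");
        data.setClassIndex(data.numAttributes() - 1);

        GaussianProcesses gp = new GaussianProcesses();
        gp.setNoise(1.0); // level of Gaussian noise; an illustrative value

        // 10-fold cross-validation of the regression model
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(gp, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}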
IsotonicRegression.java | Class |
Learns an isotonic regression model. |
LeastMedSq.java | Class |
Implements a least median squared linear regression that utilises the existing Weka LinearRegression class to form predictions. |
LibSVM.java | Class |
A wrapper class for the libsvm tools (the libsvm classes, typically the jar file, need to be on the classpath to use this classifier).
LibSVM runs faster than SMO since it uses libsvm to build the SVM classifier.
LibSVM allows users to experiment with the one-class SVM, regression SVM (epsilon-SVR and nu-SVR), and nu-SVM supported by the libsvm tool. |
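A sketch of driving the wrapper through its option string; the option letters mirror the libsvm command line, the dataset path is a placeholder, and libsvm.jar must be on the classpath alongside weka.jar:

import weka.classifiers.functions.LibSVM;
import weka.core.Instances;
import weka.core.Utils;
import weka.core.converters.ConverterUtils.DataSource;

public class LibSVMSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("iris.arff"); // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        LibSVM svm = new LibSVM();
        // -S 1: nu-SVC, -K 2: RBF kernel, -N 0.5: the nu parameter
        // (the letters follow the libsvm command-line options)
        svm.setOptions(Utils.splitOptions("-S 1 -K 2 -N 0.5"));
        svm.buildClassifier(data);
        System.out.println(svm);
    }
}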
LinearRegression.java | Class |
Class for using linear regression for prediction. |
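A minimal sketch of fitting and applying LinearRegression; the dataset path is an assumed placeholder for any numeric-class ARFF file:

import weka.classifiers.functions.LinearRegression;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class LinearRegressionSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("housing.arff"); // placeholder numeric-class dataset
        data.setClassIndex(data.numAttributes() - 1);

        LinearRegression lr = new LinearRegression();
        lr.buildClassifier(data);

        System.out.println(lr); // prints the fitted linear model
        double prediction = lr.classifyInstance(data.instance(0));
        System.out.println("Prediction for first instance: " + prediction);
    }
}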
Logistic.java | Class |
Class for building and using a multinomial logistic regression model with a ridge estimator.
There are some modifications, however, compared to the paper of le Cessie and van Houwelingen (1992):
If there are k classes for n instances with m attributes, the parameter matrix B to be calculated will be an m*(k-1) matrix.
The probability for class j (with the exception of the last class) is
Pj(Xi) = exp(Xi*Bj) / ((sum[j=1..(k-1)] exp(Xi*Bj)) + 1)
The last class has probability
1 - (sum[j=1..(k-1)] Pj(Xi)) = 1 / ((sum[j=1..(k-1)] exp(Xi*Bj)) + 1)
The (negative) multinomial log-likelihood is thus:
L = -sum[i=1..n]{ sum[j=1..(k-1)] (Yij * ln(Pj(Xi))) + (1 - sum[j=1..(k-1)] Yij) * ln(1 - sum[j=1..(k-1)] Pj(Xi)) } + ridge * (B^2)
In order to find the matrix B for which L is minimised, a quasi-Newton method is used to search for the optimised values of the m*(k-1) variables. |
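A sketch of using the class, tying the ridge setter to the ridge term in L above; the dataset path and ridge value are illustrative:

import weka.classifiers.functions.Logistic;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class LogisticSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("iris.arff"); // placeholder nominal-class dataset
        data.setClassIndex(data.numAttributes() - 1);

        Logistic logistic = new Logistic();
        logistic.setRidge(1e-8); // ridge factor applied to B in the log-likelihood L above
        logistic.buildClassifier(data);

        // distributionForInstance returns the k class probabilities Pj(Xi)
        double[] probs = logistic.distributionForInstance(data.instance(0));
        for (int j = 0; j < probs.length; j++) {
            System.out.println("P(class " + j + ") = " + probs[j]);
        }
    }
}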
MultilayerPerceptron.java | Class |
A classifier that uses backpropagation to classify instances.
This network can be built by hand, created by an algorithm, or both. |
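A sketch of building the network from an algorithm rather than by hand; the layer size, learning rate, momentum, and epoch count are illustrative values, and the path is a placeholder:

import weka.classifiers.functions.MultilayerPerceptron;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class MLPSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("iris.arff"); // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        MultilayerPerceptron mlp = new MultilayerPerceptron();
        mlp.setHiddenLayers("4");  // one hidden layer of four units; "a" would derive the size automatically
        mlp.setLearningRate(0.3);  // backpropagation step size
        mlp.setMomentum(0.2);
        mlp.setTrainingTime(500);  // number of training epochs
        mlp.buildClassifier(data);
        System.out.println(mlp);
    }
}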
PaceRegression.java | Class |
Class for building pace regression linear models and using them for prediction. |
PLSClassifier.java | Class |
A wrapper classifier for the PLSFilter, utilizing the PLSFilter's ability to perform predictions.
Valid options are:
-filter <filter specification>
The PLS filter to use. |
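A sketch of setting the -filter option programmatically; PLSFilter lives in weka.filters.supervised.attribute, and the component count and dataset path are illustrative assumptions:

import weka.classifiers.functions.PLSClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.supervised.attribute.PLSFilter;

public class PLSSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("gasoline.arff"); // placeholder numeric-class dataset
        data.setClassIndex(data.numAttributes() - 1);

        PLSFilter filter = new PLSFilter();
        filter.setNumComponents(5); // illustrative number of PLS components

        PLSClassifier pls = new PLSClassifier();
        pls.setFilter(filter); // programmatic equivalent of the -filter option above
        pls.buildClassifier(data);
        System.out.println(pls);
    }
}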
RBFNetwork.java | Class |
Class that implements a normalized Gaussian radial basis function network.
It uses the k-means clustering algorithm to provide the basis functions and learns either a logistic regression (discrete class problems) or linear regression (numeric class problems) on top of that. |
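A sketch showing the two stages the description names: the cluster count feeds the k-means step, and the ridge value governs the regression fitted on top. The specific values and path are illustrative:

import weka.classifiers.functions.RBFNetwork;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class RBFNetworkSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("iris.arff"); // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        RBFNetwork rbf = new RBFNetwork();
        rbf.setNumClusters(3); // k for the k-means step that places the Gaussian basis functions
        rbf.setMinStdDev(0.1); // minimum width of each basis function
        rbf.setRidge(1e-8);    // ridge value for the logistic/linear model fitted on top
        rbf.buildClassifier(data);
        System.out.println(rbf);
    }
}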
SimpleLinearRegression.java | Class |
Learns a simple linear regression model. |
SimpleLogistic.java | Class |
Classifier for building linear logistic regression models. |
SMO.java | Class |
Implements John Platt's sequential minimal optimization algorithm for training a support vector classifier.
This implementation globally replaces all missing values and transforms nominal attributes into binary ones. |
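A sketch with an explicit polynomial kernel; setKernel is the kernel hook in recent Weka versions, and the complexity constant and exponent are illustrative choices:

import weka.classifiers.functions.SMO;
import weka.classifiers.functions.supportVector.PolyKernel;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class SMOSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("iris.arff"); // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        SMO smo = new SMO();
        smo.setC(1.0); // complexity constant
        PolyKernel kernel = new PolyKernel();
        kernel.setExponent(2.0); // quadratic kernel, illustrative choice
        smo.setKernel(kernel);

        // missing-value replacement and nominal-to-binary conversion happen internally
        smo.buildClassifier(data);
        System.out.println(smo);
    }
}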
SMOreg.java | Class |
Implements Alex Smola and Bernhard Schölkopf's sequential minimal optimization algorithm for training a support vector regression model. |
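A minimal sketch of training the regression model; the complexity constant and dataset path are illustrative assumptions:

import weka.classifiers.functions.SMOreg;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class SMOregSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("cpu.arff"); // placeholder numeric-class dataset
        data.setClassIndex(data.numAttributes() - 1);

        SMOreg reg = new SMOreg();
        reg.setC(1.0); // complexity constant; illustrative value
        reg.buildClassifier(data);
        System.out.println(reg);
    }
}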
SVMreg.java | Class |
SVMreg implements the support vector machine for regression. |
VotedPerceptron.java | Class |
Implementation of the voted perceptron algorithm by Freund and Schapire. |
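A sketch of the voted perceptron on a two-class problem; the iteration count, kernel exponent, and path are illustrative assumptions:

import weka.classifiers.functions.VotedPerceptron;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class VotedPerceptronSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("ionosphere.arff"); // placeholder two-class dataset
        data.setClassIndex(data.numAttributes() - 1);

        VotedPerceptron vp = new VotedPerceptron();
        vp.setNumIterations(3); // passes over the training data; illustrative value
        vp.setExponent(1.0);    // exponent of the polynomial kernel
        vp.buildClassifier(data);
        System.out.println(vp);
    }
}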
Winnow.java | Class |
Implements Winnow and Balanced Winnow algorithms by Littlestone.
For more information, see
N. Littlestone (1988). Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Machine Learning. 2(4):285-318. |