Tags: matlab, octave, logistic-regression, multiclass-classification

Why is my multi-class logistic regression accuracy so low?


I am trying to solve a problem with 3 features and 6 classes (labels). The training dataset is 700 rows × 3 columns, and the feature values are continuous, ranging from 0 to 100. I use the one-vs-all method, but I don't know why the prediction accuracy is so low, only 24%. Could anyone tell me why, please? Thank you! This is how I do the prediction:

function p = predictOneVsAll(all_theta, X)
m = size(X, 1);                  % number of examples
num_labels = size(all_theta, 1); % number of classes
% You need to return the following variables correctly
p = zeros(size(X, 1), 1);
% Add ones to the X data matrix
X = [ones(m, 1) X];
% for each row, pick the class whose hypothesis gives the largest value
[m, p] = max(sigmoid(X * all_theta'), [], 2);
end

And here is the one-vs-all training code:

% You need to return the following variables correctly 
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

initial_theta = zeros(n+1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 20);
for c = 1:num_labels
    [theta] = ...
        fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
               initial_theta, options);
    all_theta(c,:) = theta';
end
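
(lrCostFunction is referenced above but not shown. For reference, here is a minimal sketch of the regularized logistic regression cost and gradient it is typically assumed to compute; the sigmoid helper is assumed to be defined.)

function [J, grad] = lrCostFunction(theta, X, y, lambda)
% Minimal sketch: regularized logistic regression cost and gradient.
m = length(y);              % number of training examples
h = sigmoid(X * theta);     % predicted probabilities, m x 1
% Cross-entropy cost; the bias term theta(1) is not regularized
J = (1/m) * sum(-y .* log(h) - (1 - y) .* log(1 - h)) ...
    + (lambda / (2*m)) * sum(theta(2:end) .^ 2);
grad = (1/m) * (X' * (h - y));
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end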

Solution

  • In predictOneVsAll, you don't need to use the sigmoid function; you only need it when calculating the cost. (The sigmoid is monotonically increasing, so it doesn't change which class attains the maximum.) So the correct code is:

    [m, p] = max((X * all_theta'), [], 2);
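
    A quick check with hypothetical scores confirms that dropping the sigmoid leaves the predicted class unchanged:

    % sigmoid is monotonically increasing, so the winning column index
    % is the same with or without it; only the score values change
    scores = [0.5 -1.2 2.0];   % hypothetical raw scores, one row of X * all_theta'
    [~, i1] = max(scores);
    [~, i2] = max(1 ./ (1 + exp(-scores)));
    assert(i1 == i2);          % both pick column 3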
    

    In oneVsAll, the loop should look like this:

    for c = 1:num_labels
        all_theta(c,:) = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
                                initial_theta, options);
    endfor
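
    After retraining, you can sanity-check the training accuracy like this (assuming X and y here are the training matrix and label vector from the question):

    pred = predictOneVsAll(all_theta, X);
    fprintf('Training set accuracy: %f\n', mean(double(pred == y)) * 100);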
    

    It would be better to ask these questions in the discussion forum for Andrew Ng's ML course; the people there would be more familiar with the code and the problem.