Tags: python, scikit-learn, logistic-regression

Why is the result different between sklearn's logistic regression model and a hand-written sigmoid function?


Here are the details:

Let's say I have a model with the following coefficients and intercept:

# Coef
w1 = 0.018056353337078567
w2 = 0.000646433629145055
w3 = 0.11595942738379618
w4 = 0.021109268199259484
w5 = 0.05204164353607967
w6 = -0.11317012710348132
w7 = -0.05215587577473489
w8 = -2.0132721508721287
intercept = -2.0132721508721287

and my sample:

# Sample
x1 = 10
x2 = 70.05
x3 = 15
x4 = 24
x5 = 1
x6 = 2
x7 = 17
x8 = 1

When I load the model in sklearn and call model.predict_proba, I get

[[0.21018339 0.78981661]]

But when I plug these parameters into a sigmoid function, I get

0.0681390750219555

Clearly 0.78981661 != 0.0681390750219555, and I wonder why that happens.

Here is the code for sigmoid function:

import numpy as np

# Coef
w1 = 0.018056353337078567
w2 = 0.000646433629145055
w3 = 0.11595942738379618
w4 = 0.021109268199259484
w5 = 0.05204164353607967
w6 = -0.11317012710348132
w7 = -0.05215587577473489
w8 = -2.0132721508721287
intercept = -2.0132721508721287
# Sample
x1 = 10
x2 = 70.05
x3 = 15
x4 = 24
x5 = 1
x6 = 2
x7 = 17
x8 = 1

# Weighted sum of features plus the intercept
z = \
    w1 * x1 + \
    w2 * x2 + \
    w3 * x3 + \
    w4 * x4 + \
    w5 * x5 + \
    w6 * x6 + \
    w7 * x7 + \
    w8 * x8 + \
    intercept


y = 1/(1+np.exp(-z))

print(y)
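Writing the dot product out term by term invites exactly this kind of slip. As a sketch, the same sum can be written in one vectorized expression with NumPy, using the coefficients and sample from the question:

```python
import numpy as np

# Coefficients and intercept from the question
w = np.array([0.018056353337078567, 0.000646433629145055,
              0.11595942738379618, 0.021109268199259484,
              0.05204164353607967, -0.11317012710348132,
              -0.05215587577473489, -2.0132721508721287])
intercept = -2.0132721508721287

# Sample from the question
x = np.array([10, 70.05, 15, 24, 1, 2, 17, 1])

# z = w . x + b, then the sigmoid
z = np.dot(w, x) + intercept
p = 1 / (1 + np.exp(-z))
print(p)  # same value as the term-by-term sum above
```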

Solution

  • It is the same computation internally. The discrepancy came from a mistake I made when producing the first result.
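To confirm that sklearn's predict_proba is exactly sigmoid(w · x + b) for binary logistic regression, here is a minimal sketch on hypothetical synthetic data (the asker's actual model and training set are not shown):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical synthetic data: 200 samples, 8 features, like the question
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 2] > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Manual computation: z = w . x + b, then the sigmoid
z = X @ model.coef_.ravel() + model.intercept_[0]
p_manual = 1 / (1 + np.exp(-z))

# Matches the second column of predict_proba, i.e. P(class 1)
p_sklearn = model.predict_proba(X)[:, 1]
print(np.allclose(p_manual, p_sklearn))  # True
```

Note that predict_proba returns one column per class, [P(class 0), P(class 1)], so the sigmoid output corresponds to the second column.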