Regression model
Classification of Regression models
| Dependent variable (outcome) | Univariable (single independent variable) | Multivariable† (multiple independent variables) | How to derive coefficients [math]\displaystyle{ b_i }[/math] |
|---|---|---|---|
| Continuous | Simple linear regression | Multiple linear regression | Least squares method |
| Binary | Logistic regression | Multiple logistic regression | Maximum likelihood estimation method |
| Multinomial (≥ 3 categories) | Multinomial logistic regression | Multiple multinomial logistic regression | Maximum likelihood estimation method |
| Ordinal | Ordinal logistic regression | Multiple ordinal logistic regression | Maximum likelihood estimation method |
| Rate ratio | Poisson regression | Multiple Poisson regression | Maximum likelihood estimation method |
| Survival time | Cox proportional hazards regression | Multiple Cox regression | Maximum likelihood estimation method |
† 'Multivariable' can also be phrased as 'multiple'; note that 'multivariable' is not the same as 'multivariate'.
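As a minimal sketch of the least squares method listed for continuous outcomes, the following fits a univariable linear regression [math]\displaystyle{ Y = a + b_1X_1 }[/math] with NumPy; the data values are invented for illustration.

```python
import numpy as np

# Hypothetical data: one continuous exposure X and a continuous outcome Y.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least squares: prepend a column of ones for the intercept a, then solve
# min || [1 X] @ [a, b1] - Y ||^2.
design = np.column_stack([np.ones_like(X), X])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
a, b1 = coef
print(a, b1)  # intercept and slope
```

For the other rows of the table (binary, multinomial, ordinal, rate, survival outcomes) there is no closed-form solution, which is why their coefficients are derived by maximum likelihood estimation instead.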
Conversion of binary logistic regression equation to outcome probability [math]\displaystyle{ p }[/math]
The binary logistic regression equation can be rearranged to give the outcome probability [math]\displaystyle{ p }[/math] as,
- [math]\displaystyle{ \begin{align} \log Y = \log \left ( \frac{p}{1-p} \right ) & = a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots \\ \Leftrightarrow \dfrac{p}{1-p} & = \exp (a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots) = e^{(a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots)} \\ \Leftrightarrow p & = \dfrac { \exp (a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots) }{ 1 + \exp (a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots) } = \dfrac { e^{(a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots)} }{ 1 + e^{(a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots)} } \\ \end{align} }[/math]
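The rearrangement above can be checked numerically: the function below (a hypothetical helper, with coefficient values chosen arbitrarily) computes [math]\displaystyle{ p }[/math] from the linear predictor, and taking [math]\displaystyle{ \log(p/(1-p)) }[/math] recovers that predictor.

```python
import math

def outcome_probability(a, bs, xs):
    """p = exp(a + sum b_i * X_i) / (1 + exp(a + sum b_i * X_i))."""
    linear = a + sum(b * x for b, x in zip(bs, xs))
    return math.exp(linear) / (1.0 + math.exp(linear))

# When the linear predictor is 0, p = 1/2.
print(outcome_probability(0.0, [0.0], [1.0]))  # 0.5

# Round trip: log-odds of p equals the linear predictor a + b1*X1 = 1.1.
p = outcome_probability(0.5, [0.3], [2.0])
print(math.log(p / (1 - p)))  # 1.1
```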
How to convert a coefficient of binary logistic regression to an odds ratio
Consider the outcome probability [math]\displaystyle{ p }[/math] and the changed outcome probability [math]\displaystyle{ p\prime }[/math] obtained by adding [math]\displaystyle{ 1 }[/math] to the explanatory variable [math]\displaystyle{ X_1 }[/math]; this gives the following two equations,
- [math]\displaystyle{ \begin{array}{lcl} \log \left ( \dfrac{p}{1-p} \right ) & = & a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots \\ \log \left ( \dfrac{p\prime}{1-p\prime} \right ) & = & a + b_1( {\color{red}X_1 + 1} ) + b_2X_2 + b_3X_3 + \cdots \end{array} }[/math]
Subtracting the first equation from the second gives,
- [math]\displaystyle{ \begin{align} \log \left ( \frac{p\prime}{1-p\prime} \right ) - \log \left ( \frac{p}{1-p} \right ) & = b_1({\color{red}X_1 + 1}) - b_1X_1 \\ & = b_1 \\ \Leftrightarrow \frac {\left ( \dfrac{p\prime}{1-p\prime} \right )}{ \left ( \dfrac{p}{1-p} \right ) } & = e^{b_1} \end{align} }[/math]
Because [math]\displaystyle{ \left ( \dfrac{p\prime}{1-p\prime} \right ) }[/math] and [math]\displaystyle{ \left ( \dfrac{p}{1-p} \right ) }[/math] are the odds of [math]\displaystyle{ p\prime }[/math] and [math]\displaystyle{ p }[/math], respectively, their ratio [math]\displaystyle{ \frac {\left ( \dfrac{p\prime}{1-p\prime} \right )}{ \left ( \dfrac{p}{1-p} \right ) } }[/math] is the odds ratio comparing the outcome after [math]\displaystyle{ 1 }[/math] is added to [math]\displaystyle{ X_1 }[/math] with the outcome before.
Thus, converting [math]\displaystyle{ b_1 }[/math] to [math]\displaystyle{ \color{red}{e^{b_1}} }[/math] or [math]\displaystyle{ \color{red}{\exp (b_1)} }[/math] gives the odds ratio of the outcome probabilities before and after the variable [math]\displaystyle{ X_1 }[/math] increases by 1.
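The derivation above can be verified numerically. The sketch below uses assumed coefficient values (not from any fitted model): the odds ratio computed directly from the two probabilities agrees with [math]\displaystyle{ e^{b_1} }[/math].

```python
import math

def prob(a, b1, x1):
    """Outcome probability for a single-variable logistic model."""
    z = a + b1 * x1
    return math.exp(z) / (1.0 + math.exp(z))

a, b1 = -1.0, 0.7          # assumed example coefficients
p  = prob(a, b1, x1=2.0)   # probability before
pp = prob(a, b1, x1=3.0)   # probability after X1 increases by 1

odds_ratio = (pp / (1 - pp)) / (p / (1 - p))
print(odds_ratio, math.exp(b1))  # the two values agree: OR = e^{b1}
```

Note that the odds ratio depends only on [math]\displaystyle{ b_1 }[/math], not on the starting value of [math]\displaystyle{ X_1 }[/math] or on the other variables, which is exactly what the subtraction in the derivation shows.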