Regression model
Classification of Regression models
| Dependent variable (outcome) | Independent variable (exposure): Univariable (single variable) | Independent variable (exposure): Multivariable† (multiple variables) | How to derive coefficients [math]\displaystyle{ b_i }[/math] |
|---|---|---|---|
| Continuous | Simple linear regression | Multiple (multivariable) linear regression | Least squares method |
| Binary | Univariable logistic regression | Multivariable logistic regression | Maximum likelihood estimation method |
| Multinomial (≥ 3 categories) | Multinomial logistic regression | Multivariable multinomial logistic regression | Maximum likelihood estimation method |
| Ordinal | Ordinal logistic regression | Multivariable ordinal logistic regression | Maximum likelihood estimation method |
| Rate ratio | Poisson regression | Multivariable Poisson regression | Maximum likelihood estimation method |
| Survival time | Cox proportional hazards regression | Multivariable Cox proportional hazards regression | Maximum likelihood estimation method |
†'Multivariable' can be rephrased as 'Multiple'; Multivariable is NOT equal to 'Multivariate'!!
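The last column of the table contrasts the two ways coefficients are derived. As a minimal sketch — assuming Python with NumPy and statsmodels on synthetic data; every variable name and value below is illustrative, not taken from this article — a continuous outcome is fitted by least squares and a binary outcome by maximum likelihood:

```python
# Minimal sketch: linear regression (least squares) vs. binary logistic
# regression (maximum likelihood estimation) on synthetic, illustrative data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))        # two explanatory variables X1, X2
X_design = sm.add_constant(X)      # adds the intercept term a

# Continuous outcome -> ordinary least squares (linear regression)
y_continuous = 1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(size=n)
ols_fit = sm.OLS(y_continuous, X_design).fit()

# Binary outcome -> logistic regression, fitted by maximum likelihood
log_odds = -0.5 + 0.8 * X[:, 0] + 0.4 * X[:, 1]
y_binary = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))
logit_fit = sm.Logit(y_binary, X_design).fit(disp=False)

print(ols_fit.params)    # a, b1, b2 estimated by least squares
print(logit_fit.params)  # a, b1, b2 estimated by maximum likelihood
```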
Conversion of binary logistic regression equation to outcome probability [math]\displaystyle{ p }[/math]
The equation of a binary logistic regression can be rearranged to give the outcome probability [math]\displaystyle{ p }[/math] as,
- [math]\displaystyle{ \begin{align} \log Y = \log \left ( \frac{p}{1-p} \right ) & = a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots \\ \Leftrightarrow \dfrac{p}{1-p} & = \exp (a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots) = e^{(a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots)} \\ \Leftrightarrow p & = \dfrac { \exp (a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots) }{ 1 + \exp (a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots) } = \dfrac { e^{(a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots)} }{ 1 + e^{(a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots)} } \\ \end{align} }[/math]
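The same conversion can be written directly in code. The sketch below assumes Python with NumPy; the intercept, coefficients, and covariate values are made-up numbers for illustration only:

```python
# Sketch of the log-odds -> probability conversion shown above.
import numpy as np

def outcome_probability(a, b, x):
    """p = exp(a + b1*X1 + b2*X2 + ...) / (1 + exp(a + b1*X1 + b2*X2 + ...))."""
    linear_predictor = a + np.dot(b, x)
    return np.exp(linear_predictor) / (1 + np.exp(linear_predictor))

a = -1.2                         # intercept (illustrative value)
b = np.array([0.8, 0.4, -0.3])   # b1, b2, b3 (illustrative values)
x = np.array([1.0, 2.0, 0.0])    # X1, X2, X3

print(outcome_probability(a, b, x))  # outcome probability p, between 0 and 1
```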
How to convert a coefficient of binary logistic regression to an odds ratio
Consider the outcome probability [math]\displaystyle{ p }[/math] and the changed outcome probability [math]\displaystyle{ p\prime }[/math] obtained by adding [math]\displaystyle{ 1 }[/math] to the explanatory variable [math]\displaystyle{ X_1 }[/math]; the following two equations are obtained,
- [math]\displaystyle{ \begin{array}{lcl} \log \left ( \dfrac{p}{1-p} \right ) & = & a + b_1X_1 + b_2X_2 + b_3X_3 + \cdots \\ \log \left ( \dfrac{p\prime}{1-p\prime} \right ) & = & a + b_1( {\color{red}X_1 + 1} ) + b_2X_2 + b_3X_3 + \cdots \end{array} }[/math]
Subtracting the first equation from the second gives,
- [math]\displaystyle{ \begin{align} \log \left ( \frac{p\prime}{1-p\prime} \right ) - \log \left ( \frac{p}{1-p} \right ) & = b_1({\color{red}X_1 + 1}) - b_1X_1 \\ & = b_1 \\ \Leftrightarrow \frac {\left ( \dfrac{p\prime}{1-p\prime} \right )}{ \left ( \dfrac{p}{1-p} \right ) } & = e^{b_1} \end{align} }[/math]
Because [math]\displaystyle{ \left ( \dfrac{p\prime}{1-p\prime} \right ) }[/math] and [math]\displaystyle{ \left ( \dfrac{p}{1-p} \right ) }[/math] are the odds of [math]\displaystyle{ p\prime }[/math] and [math]\displaystyle{ p }[/math], respectively, [math]\displaystyle{ \frac {\left ( \dfrac{p\prime}{1-p\prime} \right )}{ \left ( \dfrac{p}{1-p} \right ) } }[/math] is the odds ratio comparing the outcome when [math]\displaystyle{ 1 }[/math] is added to [math]\displaystyle{ X_1 }[/math] with the outcome before the addition.
Thus, converting [math]\displaystyle{ b_1 }[/math] to [math]\displaystyle{ \color{red}{e^{b_1}} }[/math] or [math]\displaystyle{ \color{red}{\exp (b_1)} }[/math] gives the odds ratio of the outcome before and after the variable [math]\displaystyle{ X_1 }[/math] increases by 1.
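This identity is easy to check numerically. The sketch below assumes Python with NumPy; the coefficients and covariate values are arbitrary illustrative numbers. It computes the odds before and after adding [math]\displaystyle{ 1 }[/math] to [math]\displaystyle{ X_1 }[/math] and confirms that their ratio equals [math]\displaystyle{ \exp (b_1) }[/math]:

```python
# Numerical check: exp(b1) equals the odds ratio obtained by adding 1 to X1.
# All coefficients and covariate values are arbitrary illustrative numbers.
import numpy as np

a, b1, b2 = -0.7, 0.5, 0.2
x2 = -0.4

def probability(x1):
    log_odds = a + b1 * x1 + b2 * x2
    return np.exp(log_odds) / (1 + np.exp(log_odds))

p = probability(1.3)        # outcome probability before the change
p_prime = probability(2.3)  # outcome probability after X1 gains 1

odds_ratio = (p_prime / (1 - p_prime)) / (p / (1 - p))
print(odds_ratio, np.exp(b1))  # both print the same value (up to rounding): exp(b1)
```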
Generalized linear model
Penalized multivariable logistic regression model
- Penalized Logistic Regression Essentials in R: Ridge, Lasso and Elastic Net
- About penalized/regularized regression models (罰則付き・正則化回帰モデルについて)
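The links above cover penalized logistic regression in R. As a minimal sketch of the same idea — assuming Python with scikit-learn instead of the R workflow the first link describes, and using a bundled example dataset purely for illustration — ridge (L2) and lasso (L1) penalized multivariable logistic regression can be fitted as follows:

```python
# Sketch of penalized (ridge / lasso) multivariable logistic regression.
# Assumes Python with scikit-learn; the linked tutorials use R instead.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)   # binary outcome, many covariates

# Ridge (L2) penalty: shrinks all coefficients toward zero
ridge = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l2", C=1.0, max_iter=5000))
ridge.fit(X, y)

# Lasso (L1) penalty: can shrink some coefficients exactly to zero
lasso = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l1", solver="liblinear", C=1.0))
lasso.fit(X, y)

print(ridge.named_steps["logisticregression"].coef_)
print(lasso.named_steps["logisticregression"].coef_)
```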