
[AI] 3-5. Logistic Regression

Rain Hu

Bayes' Theorem

Gaussian Distribution

$$f_{\mu,\sigma}(x)=\frac{1}{\sqrt{2\pi\sigma^2}}\exp\bigg\lbrace-\frac{(x-\mu)^2}{2\sigma^2}\bigg\rbrace$$

$$f_{\mu,\Sigma}(x)=\frac{1}{(2\pi)^{D/2}}\frac{1}{|\Sigma|^{1/2}}\exp\bigg\lbrace-\frac{1}{2}(x-\mu)^T\Sigma^{-1}(x-\mu)\bigg\rbrace$$

$$x = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix},\quad \mu = \begin{bmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_n \end{bmatrix},\quad \Sigma = \begin{bmatrix} \sigma_{11} & \sigma_{12} & \cdots & \sigma_{1n} \\ \sigma_{21} & \sigma_{22} & \cdots & \sigma_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \sigma_{n1} & \sigma_{n2} & \cdots & \sigma_{nn} \end{bmatrix}$$
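The two densities above can be evaluated directly with numpy; a minimal sketch (function names `gaussian_1d` / `gaussian_nd` are my own, not from the original notes):

```python
import numpy as np

def gaussian_1d(x, mu, sigma):
    """Univariate Gaussian density f_{mu,sigma}(x)."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

def gaussian_nd(x, mu, Sigma):
    """Multivariate Gaussian density f_{mu,Sigma}(x) for a D-dimensional x."""
    D = x.shape[0]
    diff = x - mu
    norm = 1.0 / ((2 * np.pi) ** (D / 2) * np.linalg.det(Sigma) ** 0.5)
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(Sigma) @ diff)
```

With the identity covariance and `x` at the mean, `gaussian_nd` reduces to the normalizing constant $1/(2\pi)^{D/2}$, which is a quick sanity check.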

Binary Classification

Binary Classification vs. Linear Regression

$$
\begin{array}{c|c|c}
 & \text{Logistic Regression} & \text{Linear Regression}\\ \hline
\text{function} & f_{w,b}(x)=\sigma\Big(\sum_i w_ix_i+b\Big) & f_{w,b}(x)=\sum_i w_ix_i+b\\ \hline
\text{loss function} & L(f)=\sum_n C\big(f(x^n),\hat{y}^n\big) & L(f)=\frac{1}{2}\sum_n\big(f(x^n)-\hat{y}^n\big)^2\\ \hline
\text{update} & w_i \leftarrow w_i-\eta\sum_n \big(f(x^n)-\hat{y}^n\big)x_i^n & w_i \leftarrow w_i-\eta\sum_n \big(f(x^n)-\hat{y}^n\big)x_i^n
\end{array}
$$

If square error were used with the logistic function instead of cross entropy, the gradient would be

$$\frac{\partial L(f)}{\partial w_i}=2\big(f_{w,b}(x)-\hat{y}\big)\,f_{w,b}(x)\big(1-f_{w,b}(x)\big)\,x_i$$

which vanishes whenever $f_{w,b}(x)$ saturates near 0 or 1, even if the prediction is far from the target; this is why cross entropy is the better loss here.
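As the table shows, the two update rules have exactly the same form. A minimal numpy sketch of the logistic-regression update (the function name `train_logistic` and the bias update are my own additions, following the same gradient):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(X, y, eta=0.1, epochs=1000):
    """Gradient descent with w_i <- w_i - eta * sum_n (f(x^n) - y^n) x_i^n,
    identical in form to the linear-regression update."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        f = sigmoid(X @ w + b)   # logistic: f = sigma(w.x + b)
        err = f - y              # f(x^n) - y_hat^n
        w -= eta * X.T @ err     # the shared update rule
        b -= eta * err.sum()     # analogous update for the bias
    return w, b
```

On a linearly separable toy set the learned boundary quickly classifies all points correctly.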

Discriminative vs. Generative

With a shared covariance matrix $\Sigma_A=\Sigma_B=\Sigma$, the generative model's posterior reduces to $\sigma(z)$ with

$$z=(\mu_A-\mu_B)^T\Sigma^{-1}\textcolor{red}{x}-\frac{1}{2}(\mu_A)^T\Sigma^{-1}\mu_A+\frac{1}{2}(\mu_B)^T\Sigma^{-1}\mu_B+\ln\frac{N_A}{N_B}$$

This matches the pattern $z=w^Tx+b$. Can we then skip the generative modelling and directly solve for the best $w$ and $b$ in $wx+b$? The answer is yes, and this approach is called the discriminative method.
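Reading $w$ and $b$ off the expression for $z$ above gives the generative solution in closed form. A minimal sketch, assuming a shared covariance $\Sigma$ and class counts $N_A$, $N_B$ (the helper name `generative_to_wb` is my own):

```python
import numpy as np

def generative_to_wb(mu_A, mu_B, Sigma, N_A, N_B):
    """Closed-form w, b such that P(A|x) = sigma(w.x + b),
    assuming both classes share the covariance Sigma."""
    Sigma_inv = np.linalg.inv(Sigma)
    # (mu_A - mu_B)^T Sigma^{-1} x = w^T x, and Sigma^{-1} is symmetric:
    w = Sigma_inv @ (mu_A - mu_B)
    b = (-0.5 * mu_A @ Sigma_inv @ mu_A
         + 0.5 * mu_B @ Sigma_inv @ mu_B
         + np.log(N_A / N_B))
    return w, b
```

For symmetric class means and equal class counts, the bias term cancels and the boundary passes through the origin, as expected.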

Limitations of Logistic Regression
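The classic illustration of this limitation is XOR: logistic regression draws a single linear boundary, so XOR cannot be learned from the raw features, but becomes learnable after a feature transformation. A sketch (the XOR example and the product-feature transform are my additions, standard in this context rather than taken from the notes above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, eta=0.5, epochs=5000):
    """Plain logistic-regression gradient descent (same update as before)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        err = sigmoid(X @ w + b) - y
        w -= eta * X.T @ err
        b -= eta * err.sum()
    return w, b

def accuracy(X, y, w, b):
    return float(((sigmoid(X @ w + b) > 0.5) == (y == 1.0)).mean())

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 0.0])  # XOR labels

# Raw features: XOR is not linearly separable; by symmetry the gradient
# is exactly zero at w = 0, b = 0, so training never beats chance.
acc_raw = accuracy(X, y, *train(X, y))

# Feature transformation (adding the product x1*x2) makes the classes
# linearly separable, and the same model now fits them perfectly.
X2 = np.hstack([X, (X[:, 0] * X[:, 1])[:, None]])
acc_xf = accuracy(X2, y, *train(X2, y))
```

This is the usual motivation for feature transformations, and ultimately for letting a cascade of logistic units learn the transformation itself.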

