[[toc]]

CHEAT SHEET OF VECTORIZED REGRESSION FORMULAS WITH GRADIENT DESCENT

DEFINITIONS

  • $X$: the $m \times n$ matrix of the $m$ training observations (one row per example, first column all ones for the bias)
  • $y$: vector of the $m$ labels of the training data
  • $\theta$: vector of the $n$ parameters
  • $\nabla$: vector of the $n$ parameter gradients
  • $\alpha$: learning rate
  • $\lambda$: regularization term

## LINEAR REGRESSION WITH GRADIENT DESCENT

NOTES

  • You can also perform polynomial regression by adding powers of $x$ as extra features. I'm not sure the cost function stays convex in that case; for some feature sets it still might be.

COST FUNCTION

\begin{align}J(\theta) = \frac{1}{2m} \left( X\theta - y \right)^T \left( X\theta - y \right)\end{align}

GRADIENT

\begin{align}\nabla = \frac{1}{m} X^T \left( X\theta - y \right)\end{align}

GRADIENT DESCENT

Repeat until convergence:

\begin{align}\theta := \theta - \alpha \cdot \nabla\end{align}

REGULARISATION

  • Can be added separately to the cost function and to the gradient; the bias term $\theta_0$ is not regularized:

\begin{align}& J(\theta) = \frac{1}{2m} \left( \left( X\theta - y \right)^T \left( X\theta - y \right) + \lambda \sum_{j=1}^{n} \theta_j^2 \right) \newline& \nabla_j = \frac{1}{m} \left( X^T \left( X\theta - y \right) \right)_j + \frac{\lambda}{m} \theta_j \quad \text{for}\ j \geq 1\end{align}
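The linear-regression formulas above can be sketched in NumPy as a single vectorized loop. This is a minimal illustration, not a library API; the names (`gradient_descent`, `alpha`, `lam`) are my own.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, lam=0.0, iterations=1000):
    """Vectorized batch gradient descent for linear regression.

    X: (m, n) design matrix (first column all ones for the bias),
    y: (m,) labels, alpha: learning rate, lam: regularization term.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iterations):
        grad = X.T @ (X @ theta - y) / m      # (1/m) X^T (X theta - y)
        grad[1:] += (lam / m) * theta[1:]     # do not regularize theta_0
        theta -= alpha * grad                 # theta := theta - alpha * grad
    return theta

# Usage: fit y = 1 + 2x on a tiny hand-made dataset.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta = gradient_descent(X, y, alpha=0.1, iterations=5000)
```

With a small enough learning rate the convex cost guarantees convergence, so `theta` ends up close to `[1, 2]` here.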

LOGISTIC REGRESSION

SIGMOID

\begin{align}g(z) = \frac{1}{1 + e^{-z}}\end{align}

COST FUNCTION

\begin{align}J(\theta) = -\frac{1}{m} \left( y^T \log\left( g(X\theta) \right) + \left( 1 - y \right)^T \log\left( 1 - g(X\theta) \right) \right)\end{align}

GRADIENT

Same as in linear regression, with the hypothesis $g(X\theta)$ in place of $X\theta$:

\begin{align}\nabla = \frac{1}{m} X^T \left( g(X\theta) - y \right)\end{align}

GRADIENT DESCENT

Same as in linear regression

REGULARISATION

Same as in linear regression
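The logistic-regression formulas can be sketched the same way; only the hypothesis changes. Again a minimal illustration with made-up names, not a library API.

```python
import numpy as np

def sigmoid(z):
    """g(z) = 1 / (1 + e^{-z})"""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost(X, y, theta):
    """J(theta) = -(1/m) [ y^T log(h) + (1-y)^T log(1-h) ], h = g(X theta)."""
    m = len(y)
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m

def logistic_gradient_descent(X, y, alpha=0.1, iterations=1000):
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iterations):
        # Same shape as the linear case, with g(X theta) as the hypothesis.
        grad = X.T @ (sigmoid(X @ theta) - y) / m
        theta -= alpha * grad
    return theta

# Usage: separate points below / above x = 1.5.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = logistic_gradient_descent(X, y, alpha=0.5, iterations=5000)
preds = (sigmoid(X @ theta) >= 0.5).astype(float)
```

Starting from $\theta = 0$ the cost is $\log 2$; training on this separable data drives it well below that and classifies all four points correctly.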

CALCULATING THETA ANALYTICALLY - NORMAL EQUATION

NOTES

  • Cannot be used for logistic regression: setting the gradient of its cost function to zero does not yield a linear equation in $\theta$.

METHOD

  1. Set the partial derivatives of the cost function with respect to $\theta$ to 0.
  2. Solve the resulting linear equation for $\theta$:

\begin{align}\theta = \left( X^TX \right)^{-1} X^Ty\end{align}

REGULARIZATION

\begin{align}& \theta = \left( X^TX + \lambda \cdot L \right)^{-1} X^Ty \newline& \text{where}\ \ L = \begin{bmatrix} 0 & & & & \newline & 1 & & & \newline & & 1 & & \newline & & & \ddots & \newline & & & & 1 \newline\end{bmatrix}\end{align}
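The regularized normal equation above translates directly into NumPy; a minimal sketch with illustrative names:

```python
import numpy as np

def normal_equation(X, y, lam=0.0):
    """theta = (X^T X + lambda * L)^{-1} X^T y, where L is the identity
    matrix with its top-left entry zeroed so the bias theta_0 is not
    regularized."""
    n = X.shape[1]
    L = np.eye(n)
    L[0, 0] = 0.0
    # Solve the linear system rather than forming the inverse explicitly;
    # this is cheaper and numerically better behaved.
    return np.linalg.solve(X.T @ X + lam * L, X.T @ y)

# Usage: the same fit as gradient descent, but in closed form.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta = normal_equation(X, y)   # -> close to [1.0, 2.0]
```

Since this data lies exactly on $y = 1 + 2x$, the closed-form solution recovers the parameters exactly (up to floating point).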