# Question #c4443

Jul 22, 2016

$\beta = {\left({X}^{T} X\right)}^{-1} {X}^{T} Y$
Use MLE or the squared-error loss function.

#### Explanation:

We can approach this problem in two ways; I will do so using MLE.
We know that
$E \left(\epsilon\right) = 0$, so we can drop the error term, because we expect it to be 0 regardless of the variance.

If we assume a normal distribution,
$f \left(y | x , \beta\right) = \frac{1}{\sqrt{2 \pi {\sigma}^{2}}} {e}^{- {\left(y - x \beta\right)}^{2} / \left(2 {\sigma}^{2}\right)}$

Now, using maximum likelihood, we find the best estimate of $\beta$ given $y$ and $x$. In most cases we know what these are, so

$l \left(\beta | x , y\right) = \frac{1}{\sqrt{2 \pi {\sigma}^{2}}} {e}^{- {\left(y - x \beta\right)}^{2} / \left(2 {\sigma}^{2}\right)}$

Now we take the log to simplify:
$\log \left(l \left(\beta | x , y\right)\right) = \log \left(\frac{1}{\sqrt{2 \pi {\sigma}^{2}}}\right) - \frac{{\left(y - x \beta\right)}^{2}}{2 {\sigma}^{2}}$

Then we take the derivative with respect to $\beta$, set it to 0, and solve:
$\frac{x \left(y - x \beta\right)}{{\sigma}^{2}} = 0$
$\frac{{x}^{2} \beta}{{\sigma}^{2}} = \frac{x y}{{\sigma}^{2}}$

${x}^{2} \beta = x y$
$\beta = \frac{x y}{{x}^{2}}$
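The MLE result above can be checked numerically. The sketch below simulates data for a no-intercept model (the data and seed are assumptions for illustration, not part of the original problem) and compares $\beta = \frac{x y}{{x}^{2}}$, taken over all observations, against NumPy's least-squares solver:

```python
import numpy as np

# Simulated data for a no-intercept model y = beta * x + eps
# (an illustrative assumption, not from the original question).
rng = np.random.default_rng(0)
x = rng.normal(size=200)
true_beta = 2.5
y = true_beta * x + rng.normal(scale=0.1, size=200)

# Closed-form MLE estimate: sum(x*y) / sum(x**2)
beta_hat = np.sum(x * y) / np.sum(x ** 2)

# NumPy's least-squares solver should give the same answer.
beta_lstsq = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]
print(abs(beta_hat - beta_lstsq) < 1e-10)
```

The two estimates coincide, illustrating that MLE under Gaussian errors and least squares give the same $\beta$.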

This agrees algebraically with what we get by minimizing the residuals directly, usually via the sum-of-squares loss function, e.g.
${f}_{\beta} = {\left(y - x \beta\right)}^{2}$
${f}_{\beta}^{'} = - 2 x \left(y - x \beta\right) = 0$
$2 {x}^{2} \beta = 2 x y$
$\beta = \frac{x y}{{x}^{2}}$
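In matrix form this generalizes to the estimator stated in the answer, $\beta = {\left({X}^{T} X\right)}^{-1} {X}^{T} Y$. A minimal sketch, assuming simulated data with an intercept column:

```python
import numpy as np

# Simulated design matrix with an intercept column
# (data and coefficients are illustrative assumptions).
rng = np.random.default_rng(1)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, -3.0])
y = X @ beta_true + rng.normal(scale=0.2, size=n)

# Closed-form OLS/MLE estimator: (X^T X)^{-1} X^T y
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta_hat)  # close to [1.0, -3.0]
```

In practice `np.linalg.solve(X.T @ X, X.T @ y)` or `np.linalg.lstsq` is preferred over an explicit inverse for numerical stability, but the explicit form mirrors the formula above.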