Assume that we have the following dataset.
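Since the post builds everything from this data, here is a minimal simulated stand-in, assuming a response y and two predictors f and w (the names and values are illustrative assumptions, not the original data):

```r
# Hypothetical stand-in dataset: response y, two predictors f and w.
# Values are simulated for illustration only.
set.seed(42)
n <- 50
f <- rnorm(n, mean = 40, sd = 5)
w <- rnorm(n, mean = 70, sd = 8)
y <- 200 - 0.25 * f - 0.4 * w + rnorm(n, sd = 3)
dat <- data.frame(y, f, w)
head(dat)
```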

Using lm() to calculate the regression coefficients is very easy.
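A quick sketch using the stand-in dat from above:

```r
# Fit the multiple regression and extract the coefficients.
fit <- lm(y ~ f + w, data = dat)
coef(fit)
```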

Then we plug the coefficients in the formula.
$$\hat{y}=\beta_{0}+\beta_{1}x_{1}+\beta_{2}x_{2}+\dots+\beta_{p}x_{p}$$
But how are \(\beta_{0}\), \(\beta_{f}\), and \(\beta_{w}\), the coefficients for our two predictors f and w, calculated? The means \(\bar{x}\), the correlations \(\rho\), and the standard deviations \(\sigma\) are the building blocks.
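All three building blocks can be computed in one pass over the data; a sketch using the stand-in dat:

```r
# Building blocks: means, pairwise correlations, standard deviations.
x_bar <- colMeans(dat)    # means of y, f, and w
rho   <- cor(dat)         # correlation matrix
sigma <- sapply(dat, sd)  # standard deviations
```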

Now we are ready to calculate the slope coefficients.
$$\beta_{f}=\frac{\rho_{y,f}-\rho_{y,w}\cdot\rho_{f,w}}{1-\rho_{f,w}^2}\cdot\frac{\sigma_{y}}{\sigma_{f}}=-0.253$$

$$\beta_{w}=\frac{\rho_{y,w}-\rho_{y,f}\cdot\rho_{f,w}}{1-\rho_{f,w}^2}\cdot\frac{\sigma_{y}}{\sigma_{w}}=-0.389$$
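Translated into R, again with the stand-in data (the −0.253 and −0.389 above are the post's values; the simulated data will give different numbers):

```r
# Slopes from the correlation formulas, rescaled to the raw units
# of each predictor by the ratio of standard deviations.
beta_f <- unname((rho["y", "f"] - rho["y", "w"] * rho["f", "w"]) /
                 (1 - rho["f", "w"]^2) * sigma["y"] / sigma["f"])
beta_w <- unname((rho["y", "w"] - rho["y", "f"] * rho["f", "w"]) /
                 (1 - rho["f", "w"]^2) * sigma["y"] / sigma["w"])
```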
Then we can calculate the intercept \(\beta_{0}\):
$$\beta_{0}=\bar{y}-(\beta_{f}\cdot\bar{x}_{f})-(\beta_{w}\cdot\bar{x}_{w})=202.21$$
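In R:

```r
# Intercept: mean of y minus each slope times the mean of its predictor.
beta_0 <- unname(x_bar["y"] - beta_f * x_bar["f"] - beta_w * x_bar["w"])
```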
Now we are ready to predict.
$$\hat{y}=202.21-0.253x_{f}-0.389x_{w}$$
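As a sanity check, the hand-computed coefficients should reproduce predict() from the lm() fit, up to floating-point error:

```r
# Predict by hand and confirm it matches lm()'s predictions.
y_hat <- beta_0 + beta_f * dat$f + beta_w * dat$w
all.equal(unname(y_hat), unname(predict(fit)))
```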
TL;DR: Manually fitting a multiple regression with two predictors is bearable, but with more than two there is no need to reinvent the wheel; let's just use lm(). 🙂