# Linear regression and matrix division in Julia


The well-known formula for OLS is `(X'X)^(-1)X'y`, where `X` is `n×K` and `y` is `n×1`.

One way to implement this in Julia is `(X'*X)\X'*y`.

But I found that `X\y` gives almost the same output, up to a tiny computational error.

Do they always compute the same thing (as long as `n > K`)? If so, which one should I use?

When `X` is square, there is a unique solution, and LU factorization (with pivoting) is a numerically stable way to calculate it. That is the algorithm that backslash uses in this case.
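This is easy to check directly (a minimal sketch in Julia 1.x syntax; on v0.6 the factorization call is `lufact`):

```julia
using LinearAlgebra

X = [2.0 1.0; 1.0 3.0]   # square, invertible
y = [1.0, 2.0]

# For square X, backslash performs an LU factorization with partial pivoting.
β1 = X \ y
β2 = lu(X) \ y           # the same factorization, done explicitly
β1 ≈ β2                  # true
```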

When `X` is not square, which is the case in most regression problems, there is generally no exact solution, but there is a unique least-squares solution (provided `X` has full column rank). The QR factorization method for solving `Xβ = y` is a numerically stable method for generating the least-squares solution, and in this case `X\y` uses the QR factorization and thus gives the OLS solution.
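A quick way to see the equivalence (a sketch using Julia 1.x names; on v0.6 the call is `qrfact`):

```julia
using LinearAlgebra

X = randn(100, 3)   # tall design matrix: n = 100, K = 3
y = randn(100)

β1 = X \ y          # backslash on a tall matrix: QR-based least squares
β2 = qr(X) \ y      # the QR factorization, done explicitly
β1 ≈ β2             # true
```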

Notice the words *numerically stable*. While `(X'*X)\(X'*y)` will theoretically always give the same result as backslash, in practice backslash (with the correct factorization choice) will be more precise. This is because the factorization algorithms are implemented to be numerically stable. Because of the chance for floating-point errors to accumulate when forming `X'*X` and solving the normal equations, it's not recommended that you use this form for any real numerical work.
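The error accumulation is easy to demonstrate on an ill-conditioned design matrix (a hypothetical Hilbert-style example; forming `X'X` squares the condition number of `X`):

```julia
using LinearAlgebra

# Hilbert-type matrix: notoriously ill-conditioned
n, K = 8, 6
X = [1.0 / (i + j - 1) for i in 1:n, j in 1:K]
β_true = ones(K)
y = X * β_true

β_backslash = X \ y                 # QR-based, numerically stable
β_normal    = (X' * X) \ (X' * y)   # normal equations: cond(X'X) = cond(X)^2

println(norm(β_backslash - β_true)) # small
println(norm(β_normal - β_true))    # typically orders of magnitude larger
```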

Instead, `(X'*X)\(X'*y)` is somewhat equivalent to an SVD factorization, which is the most numerically stable algorithm but also the most expensive (in fact, it's basically writing out the Moore-Penrose pseudoinverse, which is how an SVD factorization is used to solve a linear system). To do the SVD factorization directly, use `svdfact(X) \ y` on v0.6 or `svd(X) \ y` on v0.7. Doing this directly is more stable than `(X'*X)\(X'*y)`. Note that `qrfact(X) \ y` (v0.6) or `qr(X) \ y` (v0.7) is the QR equivalent. See the factorizations portion of the documentation for more details on all of the choices.
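Putting the pieces together (a sketch in Julia 1.x names; `svdfact` is the v0.6 spelling):

```julia
using LinearAlgebra

X = randn(50, 4)
y = randn(50)

F = svd(X)               # `svdfact(X)` on v0.6
β_svd  = F \ y           # least squares via the SVD
β_pinv = pinv(X) * y     # Moore-Penrose pseudoinverse (built from the same SVD)

β_svd ≈ β_pinv           # true
β_svd ≈ X \ y            # true: agrees with the QR-based backslash
```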