I'm running a Monte Carlo study and need to evaluate many linear regressions y = X b + u very efficiently. Specifically, I need to estimate the regression and compute the standard errors of the estimates many times, so speed matters a great deal while pinpoint accuracy does not. MATLAB's built-in functions such as lscov and regress take a fair amount of time to run, so I wrote a little function to do these tasks myself.
However, because I need to calculate the standard errors, the function is still quite slow. I use inv(X' * X) to get the inverse, since it appears in the standard-error calculation (and it is noticeably faster than (X' * X) \ eye(size(X, 2))). Is there a faster way of doing this, e.g. via some smart factorization? I saw a suggestion to compute a QR decomposition and then use inv(R) to obtain inv(X' * X), but that was even slower.
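For reference, each replication boils down to something like this (a minimal sketch; the variable names are just for illustration):

```matlab
% One Monte Carlo replication: OLS fit plus standard errors.
[n, k] = size(X);
b   = (X' * X) \ (X' * y);       % OLS coefficient estimates
u   = y - X * b;                 % residuals
s2  = (u' * u) / (n - k);        % estimated error variance
XXi = inv(X' * X);               % the expensive step in question
se  = sqrt(s2 * diag(XXi));      % standard errors of the estimates
```

Note that only diag(XXi) is needed for the standard errors, not the full inverse.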
Any help is greatly appreciated!
You can use symbolic math to derive a closed-form expression for the diagonal of the inverse of a symmetric 3x3 matrix:
syms a b c d e f real
M = diag(inv([a b c; b d e; c e f]))
And then use that expression directly. I find that this is several times faster than calling INV.
X = rand(100,3);
XX = X'*X;
denom = XX(9)*XX(2)^2 - 2*XX(2)*XX(3)*XX(6) + XX(5)*XX(3)^2 + ...
    XX(1)*XX(6)^2 - XX(1)*XX(5)*XX(9);
invXX = [(XX(6)^2 - XX(5)*XX(9))/denom; ...
    (XX(3)^2 - XX(1)*XX(9))/denom; ...
    (XX(2)^2 - XX(1)*XX(5))/denom]
You can confirm that this is the same as diag(inv(X'*X)).
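For comparison, the factorization route mentioned in the question can also deliver just the diagonal without forming a full inverse. A sketch (my variant, assuming X' * X is well conditioned): with the Cholesky factor R satisfying R'*R = X'*X, we have inv(X'*X) = inv(R)*inv(R)', so its diagonal is the row-wise sum of squares of inv(R):

```matlab
X  = rand(100,3);
R  = chol(X' * X);           % upper triangular, R'*R = X'*X
Ri = R \ eye(size(X, 2));    % inv(R) via a triangular solve
d  = sum(Ri.^2, 2);          % equals diag(inv(X'*X))
```

Unlike the symbolic expression, this is not hard-coded to 3 regressors, so it keeps working if the number of columns of X changes.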