The QR decomposition also works for matrices that are taller than they are wide. You can try an example at live.sympy.org:

    A = Matrix([[1,1,1],[1,2,4],[1,3,9],[1,4,16],[1,5,25]])
    A
    Q, R = A.QRdecomposition()
    Q
    R

This is handy for linear regression: if, as in HW25, part 1, you only care about a low-dimensional subspace of a high-dimensional space, you can apply QR just to the first part of your basis, the part corresponding to your theoretical model. The columns of Q will still be orthonormal and have the correct span. R will still be square and invertible, only much smaller, and R^-1 will still do the coordinate conversion you want.

Below is a QR-implemented linear regression of five (x,y) data points onto a linear model, which you can try in SymPy (live.sympy.org). It finds that the best-fitting line for the data has y-intercept 6/5 and slope 43/50.

    one = [1]*5
    x = [0, 5, 10, 15, 20]
    A = Matrix([one, x]).transpose()   # columns: all ones (intercept) and x (slope)
    A
    Q, R = A.QRdecomposition()
    Q
    R
    y = Matrix([[1, 6, 9, 15, 18]]).transpose()
    y
    c = Q.transpose()*y   # this computes the dot products for the projection
    c
    a = R.inv()*c         # convert back to the original basis: intercept 6/5, slope 43/50
    a
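
As a sanity check on the claim that R^-1 does the coordinate conversion, the QR answer should agree with the classical normal-equations solution a = (A^T A)^(-1) A^T y. Here is a minimal sketch using the same data as above; the variable names a_qr and a_ne are mine, not from the homework, and the import line is only needed if you run this outside live.sympy.org.

    from sympy import Matrix, simplify

    one = [1]*5
    x = [0, 5, 10, 15, 20]
    A = Matrix([one, x]).transpose()
    y = Matrix([[1, 6, 9, 15, 18]]).transpose()

    Q, R = A.QRdecomposition()
    a_qr = R.inv() * Q.transpose() * y                   # least squares via QR
    a_ne = (A.transpose()*A).inv() * A.transpose() * y   # classical normal equations
    a_qr.applyfunc(simplify)   # Matrix([[6/5], [43/50]])
    a_ne                       # Matrix([[6/5], [43/50]])

Both routes give the same intercept and slope; the QR route just avoids forming A^T A and reuses the orthonormal basis Q you already computed.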