I know the computational cost of the closed-form (normal equations) solution for linear regression is $O(n^3)$, but I can't find a similar cost analysis for gradient descent.
There are some similar questions here where people talk about how gradient descent is more efficient, but they present formulas that are not in $O(\cdot)$ form and don't say where they got their information.
So to reiterate: I am looking for the computational complexity of gradient descent in the form $O(\cdot)$, ideally something where $O(\cdot) < O(n^3)$.
It's possible I'm thinking about this wrong and there is no big-$O$ comparison. If so, please let me know. Thank you.
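For concreteness, here is a small NumPy sketch of the two approaches I'm comparing (the variable names and data are mine, just for illustration). As I understand it, with $m$ samples and $d$ features, forming and solving the normal equations costs roughly $O(md^2 + d^3)$, while $k$ steps of batch gradient descent cost $O(kmd)$; please correct me if these per-step counts are wrong:

```python
import numpy as np

# Synthetic data: m samples, d features (illustrative only)
rng = np.random.default_rng(0)
m, d = 200, 5
X = rng.normal(size=(m, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=m)

# Closed form (normal equations): O(m d^2) to form X^T X,
# plus O(d^3) to solve the d x d linear system.
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Batch gradient descent: each iteration costs O(m d) to compute
# the gradient of the mean squared error, so k iterations cost O(k m d).
w_gd = np.zeros(d)
lr = 0.01
for _ in range(5000):
    grad = X.T @ (X @ w_gd - y) / m  # gradient of the MSE loss
    w_gd -= lr * grad

print(np.allclose(w_closed, w_gd, atol=1e-3))
```

Both should converge to (nearly) the same weights on this well-conditioned toy problem; my question is about how the two total costs compare asymptotically.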