Estimation in logistic regression (https://retostauffer.github.io/Rfoehnix/articles/logisticregression.html) is done via Newton's method, where the Hessian is $$ H = -X^T W X, $$ where $X$ is the design (feature) matrix, $W$ is a diagonal matrix, and the sizes of $X$ and $W$ are $n \times p$ and $n \times n$, respectively. In each Newton step we need to invert the Hessian $H \in \mathbb{R}^{p \times p}$, assuming $H$ is invertible.
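For concreteness, here is a minimal NumPy sketch of the Newton/IRLS update I have in mind (not the actual foehnix implementation, just the textbook update based on the Hessian above; `newton_logistic` and its arguments are my own names):

```python
import numpy as np

def newton_logistic(X, y, n_iter=25, tol=1e-8):
    """Newton (IRLS) iterations for logistic regression.

    X : (n, p) design matrix, y : (n,) binary response in {0, 1}.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))      # fitted probabilities
        w = mu * (1.0 - mu)                  # diagonal of W, stored as a vector
        grad = X.T @ (y - mu)                # score: X^T (y - mu)
        H = X.T @ (X * w[:, None])           # X^T W X, a p x p matrix
        # Solve H * step = grad instead of forming H^{-1} explicitly
        step = np.linalg.solve(H, grad)
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta
```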
The question: given a sufficient sample size $n$, how large a $p$ can the logistic regression implementations in software such as Python, R, MATLAB, or others handle? What are the limitations for larger values of $p$?
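As a rough illustration of where I expect the bottleneck to be in the dense case, here is a back-of-envelope estimate of memory and flop counts per Newton iteration (the values of $n$ and $p$ below are arbitrary, chosen only for illustration):

```python
n, p = 100_000, 5_000        # hypothetical sizes

# Dense storage in double precision (8 bytes per entry)
bytes_X = 8 * n * p          # design matrix X: n x p
bytes_H = 8 * p * p          # Hessian X^T W X: p x p

# Rough flop counts per Newton iteration
flops_form_H = n * p**2      # forming X^T W X
flops_solve = p**3 / 3       # Cholesky factorization of H

print(f"X: {bytes_X / 1e9:.1f} GB, H: {bytes_H / 1e9:.2f} GB")
print(f"forming H: {flops_form_H:.2e} flops, solving: {flops_solve:.2e} flops")
```

So, as far as I can tell, forming $X^T W X$ costs roughly $O(np^2)$ and factorizing it roughly $O(p^3)$ per iteration, which is what I suspect limits $p$ in practice.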
I suspect the answer depends on the sparsity of $H$, so there should be two cases: when $H$ is dense and when $H$ has some structured sparsity.
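For the sparse case, I imagine the linear system would be solved iteratively rather than by explicit inversion. A hypothetical SciPy sketch (with a synthetic sparse $X$, and assuming $X^T W X$ actually inherits some sparsity from $X$, which is not guaranteed in general) might look like:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)

# Synthetic sparse design matrix (e.g., indicator / one-hot style features)
n, p = 20_000, 2_000
X = sp.random(n, p, density=1e-3, format="csr", random_state=0)
w = rng.uniform(0.05, 0.25, size=n)      # diagonal of W: mu * (1 - mu)
g = rng.normal(size=p)                   # stand-in for the score X^T (y - mu)

# H = X^T W X; it stays sparse only if X has suitable structure
H = (X.T @ sp.diags(w) @ X).tocsc()

# Conjugate gradients exploits sparsity and never forms H^{-1}
step, info = cg(H, g)
print(info, step[:5])                    # info == 0 means CG converged
```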
Note that regularized logistic regression exists, but regularization techniques are not the focus of this question.