You are asking about the class of algorithms or models known as (recursive|online|incremental|streaming) (estimation|regression|forecasting|learning) algorithms. The terms I see most frequently are "recursive regression", "incremental learning", "streaming (machine) learning", and "online learning". Online learning may not be $O(1)$, as it intersects with a wider set of topics, but the first three should be very relevant and have $O(1)$ model parameter update and prediction computation times. There is a whole literature devoted to these topics; it is mostly one topic, split among engineering, computer science, and statistics researchers (who use different terms for it).
The simplest of these is recursive least squares (RLS), which is ordinary least squares regression computed recursively. It's $O(1)$ for model parameter updates and $O(1)$ for predictions (constant in the number of samples seen so far). Python has a reference implementation in statsmodels, a very good and widely used statistics library.
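To make the update concrete, here's a minimal RLS sketch of my own (an illustration, not the statsmodels implementation); each update costs $O(d^2)$ in the feature dimension $d$, but is constant in the number of samples seen:

```python
import numpy as np

class RecursiveLeastSquares:
    """Minimal recursive least squares: O(d^2) per update in the
    feature dimension d, but O(1) in the number of samples seen."""

    def __init__(self, n_features, forgetting=1.0, delta=1e3):
        self.w = np.zeros(n_features)        # parameter estimate
        self.P = delta * np.eye(n_features)  # inverse-covariance estimate
        self.lam = forgetting                # forgetting factor in (0, 1]

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)         # gain vector
        err = y - self.w @ x                 # a priori prediction error
        self.w = self.w + k * err
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return err

    def predict(self, x):
        return self.w @ np.asarray(x, dtype=float)

# Toy usage: recover intercept 3 and slope 2 from a stream.
rls = RecursiveLeastSquares(n_features=2)
for t in range(1000):
    x = np.array([1.0, t / 1000.0])          # intercept + one feature
    rls.update(x, 3.0 + 2.0 * x[1])
print(rls.w)  # should approach [3.0, 2.0]
```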
After that, stepping up the complexity and power of the model a bit, you have Kalman filters, which are implemented in several Python libraries. The recursive, $O(1)$-per-update implementations might be a little hard to find, though, so here's a good description of the recursive Kalman filter and here's an implementation.
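For reference, a single predict/update cycle of a linear Kalman filter looks roughly like the following sketch (the matrix names $F$, $H$, $Q$, $R$ are generic placeholders of mine, not any particular library's API):

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.
    x: state estimate, P: state covariance, z: new observation,
    F: state transition, H: observation model, Q/R: noise covariances.
    Cost depends only on the state dimension, not on how much data
    has arrived so far."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred) # correct with the residual
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy usage: track a noisy scalar level with a random-walk model.
F = H = Q = np.array([[1.0]])
R = np.array([[0.5]])
x, P = np.zeros(1), np.eye(1)
for z in [1.1, 0.9, 1.2, 1.0]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
print(x)
```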
Stepping up a bit more, you can enter the domain of nonlinear recursive regression algorithms. I won't say too much on this, as the topic gets complicated rather quickly. But if you have the time and the necessary background to delve into it, here's a classic paper that adds the nonlinearity by kernelizing recursive least squares:
Yaakov Engel, Shie Mannor, and Ron Meir. "The Kernel Recursive Least-Squares Algorithm." 2003.
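To give the flavor (this is not Engel et al.'s algorithm, which grows its dictionary online via an approximate-linear-dependence test), here's a heavily simplified sketch that fixes the kernel dictionary up front, at which point kernel RLS reduces to plain linear RLS on the kernel features:

```python
import numpy as np

def rbf_features(x, dictionary, gamma=1.0):
    """Map an input to kernel evaluations against a fixed dictionary.
    (Engel et al. grow the dictionary online; fixing it keeps this short.)"""
    d = dictionary - np.asarray(x, dtype=float)
    return np.exp(-gamma * np.sum(d * d, axis=1))

rng = np.random.default_rng(0)
dictionary = rng.uniform(-3, 3, size=(20, 1))   # 20 fixed centers in 1-D

# With a fixed dictionary of size m, each update is O(m^2),
# independent of the number of samples seen so far.
m = len(dictionary)
w, P = np.zeros(m), 1e3 * np.eye(m)             # RLS state
for t in range(500):
    x = rng.uniform(-3, 3, size=1)
    y = np.sin(2.0 * x[0]) + 0.1 * rng.normal() # toy nonlinear target
    phi = rbf_features(x, dictionary)
    k = P @ phi / (1.0 + phi @ P @ phi)         # RLS gain
    w = w + k * (y - w @ phi)
    P = P - np.outer(k, phi @ P)
print("prediction at x=1.0:", w @ rbf_features([1.0], dictionary))
```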
There are also recursive versions of many of your favorite traditional time series forecasting algorithms. By that, I mean recursive autoregression: recursive ARMA, recursive ARIMA, recursive ARIMAX, etc. Some implementations in Python (e.g. in statsmodels' time series analysis module) may already be written recursively, but I'm not sure. The connection here is that ARIMA models admit a state-space representation, so a Kalman filter run on the (differenced) series updates the state recursively; because of that, I'm not sure you'll gain much more performance by moving to these instead of sticking with Kalman filters (and their more complicated extensions).
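As a sketch of that workflow (assuming a reasonably recent statsmodels, where fitted state-space results expose an `append` method that runs the Kalman filter forward over new observations without re-estimating parameters):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
y = rng.normal(size=200).cumsum()   # toy random-walk series

# Fit once on the history we have...
res = sm.tsa.SARIMAX(y[:150], order=(1, 1, 1)).fit(disp=False)

# ...then fold in new observations without re-estimating the
# parameters: `append` runs the Kalman filter forward over the
# new points, which is exactly the recursive update discussed above.
res_updated = res.append(y[150:], refit=False)
print(res_updated.forecast(steps=5))
```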
Finally, any algorithm has bounded computation time if you terminate the computation of the new model parameters early. If you compute them via some iterative optimization algorithm when batches of new data come in, then you can terminate the optimization early and bound your computation time. This is generally the kind of algorithm you find in the "online learning" field, and indeed, any machine learning algorithm can be trained like this (basically, you retrain every time $k$ new data points have arrived, and terminate after training for $m$ minutes). This is, of course, not exactly constant time per unit of performance, and if you terminate too early, you may get a terrible set of parameters. But the larger model space and greater model complexity that moving to iteratively optimized algorithms buys you can make up for this, sometimes (for example, deep learning sits in this space, though obviously at the far end of it). Moreover, if you can get the implementation to start the optimization from the previous model's already-trained parameters, it will usually converge much, much more quickly (I like to use this "trick", though it's a very simple observation). A library that implements this kind of thing is creme-ml, which seems to be merging with scikit-multiflow at the time of this writing. Both claim to do machine learning on streaming data, which I would assume means you don't have to keep the old data (or the old predictions) around. I've never used either, though, so YMMV.
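As one concrete instance of this batch-update-with-warm-start pattern, scikit-learn's `SGDRegressor` exposes `partial_fit`, which does a single bounded pass over each new batch starting from the previous parameters (a sketch on made-up toy data):

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
true_coef = np.array([1.0, -2.0, 0.5, 0.0, 3.0])

model = SGDRegressor()
# Each partial_fit call does one pass over the new batch, starting
# from the previous parameters, so per-batch cost is bounded by the
# batch size rather than by the total history.
for _ in range(100):
    X = rng.normal(size=(32, 5))                      # batch of 32 new points
    y = X @ true_coef + 0.1 * rng.normal(size=32)     # toy linear target
    model.partial_fit(X, y)

print(model.coef_)  # should approach true_coef
```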