According to Bishop in Pattern Recognition and Machine Learning:
Boosting can give good results even if the base classifiers have a performance that is only slightly better than random, and hence sometimes the base classifiers are known as weak learners.
See also the thread "What is meant by 'weak learner'?".
You can replace “classifier” with “algorithm” or “regressor” in these definitions without changing their meaning. In fact, the machine learning literature often treats classification as the general case and generalizes from it to regression, or loosely uses “classifier” to mean any machine learning algorithm. This is also what Bishop does:
Boosting is a powerful technique for combining multiple ‘base’ classifiers to produce a form of committee whose performance can be significantly better than that of any of the base classifiers. [...] Originally designed for solving classification problems, boosting can also be extended to regression [...]
For regression, the weak learners are typically regression trees, just as classification trees are typically the weak learners for classification.
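To make the regression case concrete, here is a minimal sketch of gradient boosting for squared loss, using depth-1 regression trees (stumps) as the weak learners. This is an illustrative toy implementation, not Bishop's formulation or any library's API; the function names `fit_stump` and `boost` and the parameter choices are my own.

```python
import numpy as np

def fit_stump(x, r):
    # Weak learner: a depth-1 regression tree (stump) fit to residuals r.
    # It picks the threshold on 1-D input x that minimizes squared error,
    # predicting the mean residual on each side of the split.
    best = None
    for t in np.unique(x)[:-1]:
        left, right = r[x <= t], r[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_mean, right_mean = best
    return lambda z: np.where(z <= t, left_mean, right_mean)

def boost(x, y, n_rounds=100, lr=0.1):
    # Gradient boosting for squared loss: each round fits a weak learner
    # to the current residuals and adds a shrunken copy of its predictions.
    base = y.mean()
    pred = np.full_like(y, base, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        h = fit_stump(x, y - pred)   # residuals are the negative gradient
        pred += lr * h(x)
        stumps.append(h)
    return lambda z: base + lr * sum(h(z) for h in stumps)

# Each individual stump is only slightly better than predicting the mean,
# yet the committee fits a smooth curve well.
x = np.linspace(0.0, 1.0, 50)
y = x ** 2
model = boost(x, y, n_rounds=200, lr=0.1)
```

In practice one would use a library implementation (e.g. scikit-learn's `GradientBoostingRegressor`, which likewise defaults to shallow regression trees as base learners) rather than hand-rolling the loop.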