This term comes specifically from empirical Bayes (EB); in fact, the concept it refers to does not exist in true Bayesian inference. The original term was "borrowing strength", coined by John Tukey back in the 1960s and popularized further by Bradley Efron and Carl Morris in a series of statistical articles on Stein's paradox and parametric EB in the 1970s and 1980s. Many people now use "information borrowing" or "information sharing" as synonyms for the same concept. The reason you may hear the term in the context of mixed models is that the most common analyses of mixed models have an EB interpretation.
EB applies to many statistical models, but the context is always that you have a large number of (possibly independent) cases and you are trying to estimate a particular parameter (such as a mean or a variance) in each case. In Bayesian inference, you make posterior inferences about the parameter based on both the observed data for each case and the prior distribution for that parameter. In EB inference, the prior distribution is instead estimated from the whole collection of data cases, after which inference proceeds as for ordinary Bayesian inference. Hence, when you estimate the parameter for a particular case, you use both the data for that case and the estimated prior distribution, and the latter represents the "information" or "strength" that you borrow from the whole ensemble of cases when making inference about one particular case.
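To make this concrete, here is a minimal sketch in Python of the classic normal-normal EB calculation behind Stein's paradox. The simulated numbers, model, and variable names are mine for illustration, not from any particular reference: the prior is estimated from the whole ensemble of cases, and each case's estimate is then shrunk toward the ensemble mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Many "cases": each case i has a true mean theta_i drawn from an
# unknown population, plus one noisy observation x_i of that mean.
n_cases = 1000
sigma = 1.0                               # known within-case noise SD
theta = rng.normal(5.0, 2.0, n_cases)     # true case means (unknown in practice)
x = rng.normal(theta, sigma)              # one observation per case

# EB step: estimate the prior N(mu, tau^2) from ALL the cases at once.
mu_hat = x.mean()
tau2_hat = max(x.var(ddof=1) - sigma**2, 0.0)   # method-of-moments estimate

# Posterior mean for each case shrinks x_i toward mu_hat; the shrinkage
# weight is the "strength" borrowed from the ensemble.
shrink = sigma**2 / (sigma**2 + tau2_hat)
theta_eb = shrink * mu_hat + (1 - shrink) * x

# Across all cases, the EB estimates beat the raw per-case estimates.
print("MSE of raw estimates:", np.mean((x - theta) ** 2))
print("MSE of EB estimates: ", np.mean((theta_eb - theta) ** 2))
```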
Now you can see why EB involves "borrowing" but true Bayes does not. In true Bayes, the prior distribution already exists and so doesn't need to be begged or borrowed. In EB, the prior distribution has to be created from the observed data itself. When we make inference about a particular case, we use all the observed information from that case plus a little bit of information from each of the other cases. We say it is only "borrowed" because the information is given back when we move on to make inference about the next case.
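In the normal-normal sketch above (same assumed setup and notation as the code), the borrowing is visible in the formula for the EB estimate of case $i$: it is a weighted average of the case's own observation $x_i$ and the ensemble mean $\hat\mu$ estimated from all the cases together,

$$\hat\theta_i = \frac{\hat\tau^2}{\hat\tau^2 + \sigma^2}\,x_i + \frac{\sigma^2}{\hat\tau^2 + \sigma^2}\,\hat\mu ,$$

so the larger the within-case noise $\sigma^2$ is relative to the between-case spread $\hat\tau^2$, the more information is borrowed from the ensemble.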
The idea of EB and "information borrowing" is used heavily in statistical genomics, where each "case" is usually a gene or a genomic feature (Smyth, 2004; Phipson et al., 2016).
References
Efron, B, and Morris, C (1977). Stein's paradox in statistics. Scientific American 236(5), 119-127.
http://statweb.stanford.edu/~ckirby/brad/other/Article1977.pdf

Smyth, GK (2004). Linear models and empirical Bayes methods for assessing differential expression in microarray experiments. Statistical Applications in Genetics and Molecular Biology 3(1), Article 3.
http://www.statsci.org/smyth/pubs/ebayes.pdf
Phipson, B, Lee, S, Majewski, IJ, Alexander, WS, and Smyth, GK (2016). Robust hyperparameter estimation protects against hypervariable genes and improves power to detect differential expression. Annals of Applied Statistics 10, 946-963.
http://dx.doi.org/10.1214/16-AOAS920