Kepler said, in effect, that it was unnecessary to postulate angelic intervention to explain the moon's orbit. William of Ockham, an English Franciscan friar, was excommunicated for leaving Avignon without permission. One can suspect that the principle of parsimony that he espoused was not especially popular with establishmentarians.
In statistics, we would generally refer to models with supernumerary postulates as overfit, and to predictions made beyond the range of the data as extrapolative; testing for overfitting or extrapolation is not merely a matter of doing controlled experiments, since a comparison between two models does not, by itself, address the parsimony or predictive accuracy of either one. As Einstein intimated, one should postulate the fewest relevant variables that adequately explain the data, and no fewer. Note the implication of deterministic data interpretation: if a predictive model is based upon an assumption, then whatever prediction is being made must be tied to some hypothesis that is verifiable and deterministic, or we are engaging in what physicists call 'hand-waving'.
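The parsimony point can be made concrete with a small sketch (the data, seed, and polynomial degrees here are all my own illustrative choices, not anything from the original discussion): extra parameters never hurt the in-sample fit, but they can ruin predictions outside the observed range.

```python
# Sketch: supernumerary parameters improve in-sample fit but can wreck
# extrapolation. Data, seed, and degrees are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 12)
y = 2.0 * x + rng.normal(0.0, 0.1, x.size)      # truth: a line plus noise

def fit_and_extrapolate(degree):
    """Least-squares polynomial fit; return (in-sample MSE, extrapolation MSE)."""
    coeffs = np.polyfit(x, y, degree)
    in_mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
    x_new = np.linspace(1.5, 2.0, 5)             # outside the data's range
    out_mse = np.mean((np.polyval(coeffs, x_new) - 2.0 * x_new) ** 2)
    return in_mse, out_mse

for degree in (1, 6):
    in_mse, out_mse = fit_and_extrapolate(degree)
    print(f"degree {degree}: in-sample MSE {in_mse:.5f}, "
          f"extrapolation MSE {out_mse:.5f}")
```

The in-sample error can only improve as parameters are added, which is exactly why comparing two models on the data they were fit to says nothing about parsimony; only the held-out, out-of-range error exposes the overfit model.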
Re: Olle Häggström's question "...illustrating the pitfalls of data mining and related techniques." This can happen when one fails entirely to postulate a pre hoc deterministic context. In that case the numerical analysis admits no deterministic interpretation and invites us to discount hypothesis testing, so the interpretation invokes faith, not science. Such efforts, even if done conscientiously, are likely to be futile, and even the declaration of a miraculous event is often based upon faith-based determinism. As if we needed further evidence of this, Newton spent much of his lifetime's intellectual activity looking for numerical structure within the Bible, and yet was silent on that topic. It is plausible to ascribe that failure, along with Newton's secret failure to deduce chemistry having only alchemy as a starting point, to problems that he could not solve. Thus even physicists, including Newton, who was arguably the first physicist, can fall victim to their own unverifiable imaginings. Nor has this ever ceased to be problematic: string theorists, for example, have faith that something verifiable will eventually result from their current multi-dimensional exercises in pure mathematics, and so far there is nothing of the sort.
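The data-mining pitfall mentioned above has a standard statistical face: multiple comparisons. A minimal sketch (sample sizes, seed, and threshold are my own illustrative assumptions) shows that scanning enough pure noise without a pre-specified hypothesis is guaranteed to turn up "significant" patterns:

```python
# Sketch: multiple comparisons on pure noise. Scan many series with no
# pre hoc hypothesis and some will look "significant" by chance alone.
# Sizes, seed, and threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_tests, n_obs = 1000, 50
data = rng.normal(0.0, 1.0, (n_tests, n_obs))   # noise; every true mean is 0

# Two-sided z-test of "mean = 0" per series; |z| > 1.96 corresponds to p < 0.05
z = data.mean(axis=1) * np.sqrt(n_obs)
false_hits = int(np.sum(np.abs(z) > 1.96))
print(f"{false_hits} of {n_tests} pure-noise series pass a 5% significance test")
```

Roughly five percent of the series "pass" even though every true effect is zero; committing to a single hypothesis before looking at the data is what keeps that five percent honest, and it is precisely what the Bible Code style of search omits.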
Many modern theologians would argue against a literal interpretation of the mere wording of biblical Divine inspiration, for in the contrary case we would pay undue attention to concepts such as what constitutes a firmament (vault) in the heavens that opened to let the waters come forth upon the Earth (Genesis). It is plausible that the modern account, in which the Earth's water was supplied during the late bombardment epoch by icy asteroids formed beyond the solar system's snow line, is a reasonable substitute for the rather more terse concept of a firmament, even though that account contains words that would have been anachronisms in biblical times. What is perhaps astounding is that there are modern equivalents to "Let there be light," though I caution that this could refer to several events in modern cosmology, e.g., the end of the universe's dark ages several hundred million years after the big bang, or the ignition of the sun a mere 4.6 billion years ago. Furthermore, there is no general agreement on these concepts, as some interpretations of the Bible are literal and deny the existence of anything older than 6000 years, again from counting up numbers as cited in the Bible. Keep in mind that scribes hand-copied text 2000 years ago, so that minor changes in the actual text, as exemplified by the Dead Sea Scrolls, would have been inevitable, which raises the question of how one can do precise numerology on paraphrased text. All of this is beside the point, which is that the Bible shows a clear evolution over time in what constituted ethical comportment, and that today there are multiple versions variously accepted by different sects and religions as dogma, not all of which can have identical numerological significance.
There is a lot of wisdom in the Bible, but given the preceding text it is unlikely to be efficiently unpacked using numerology, as there is no unique textual version of the Bible to refer to. Thus my answer to the question "Is there something related to statistics that we have learned (including perhaps some interesting questions to ask) from the Bible Code episode?" is that most of the learning is from the negative side of the argument, i.e., what not to do, and why not. That does not mean that one cannot analyze the Bible numerically, just that doing so is unlikely to yield very much in the way of useful information.
There are two types of information. Rather than define them directly, let us illustrate them in practice. George Box famously said, "Essentially, all models are wrong, but some are useful." What that implies is that when we formulate a hypothetical explanation, we have not illustrated any fundamental truth but rather produced a "working hypothesis," one whose explanatory power is what makes it useful. Such is the way of the investigator, condemned to search perpetually for words to express the truth; but words are, at best, a convention.
What then is truth? Truth is not in words but behind them. Specifically, each of us sees only our own personal world view; for example, we need faith to believe that life is worth living. Thus truth is a first-person phenomenon, and words are not truth: they are shared values that by their nature have no insight of their own, so that no matter how well we choose our words, it takes an individual, each of us, actually to have an insight that carries any feeling of mystery, wonder, or appreciation. That is, we live in our own skins, and our existence is a first-person, subjective phenomenon. Thus we can argue about whether in Exodus God told Moses "I am He who am," or the more literal translation "I am that which am," but the mystery is not in the words but rather in the first-person context that inspired them.