19

Motivated by the discussion on whether chemistry can be reduced to physics, I came across a similar thread on Reddit, where a user commented:

A lot of chemistry has been reduced to physics, in the sense that you can perform long, expensive quantum mechanical simulations to reproduce chemical processes. This is called quantum chemistry or physical chemistry.

But much of chemistry necessarily involves a lot of atoms/molecules, and a lot of time-steps, before the phenomena become statistically significant (e.g. something involving small changes in pH). It's already a lengthy process to simulate just a few picoseconds of a dozen atoms interacting, and the difficulty scales exponentially with the number of particles (a little bit less with good approximations, but the point stands).

Quantum computers may help a little bit in the coming decades.

Chemical experiments and the laws of chemistry are time-tested and practical - we have given physical explanations for some of them, and probably would for the rest as well if it was possible to perform simulations at that level. But for now, for 99% of chemistry, it's not practical or useful to reduce it to physics.
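(To make the commenter's scaling claim concrete, here is the standard back-of-the-envelope figure; the arithmetic is mine, not the Reddit user's. Storing the exact quantum state of $n$ two-level particles takes $2^n$ complex amplitudes:

$$\dim \mathcal{H} = 2^{n}, \qquad 2^{50} \times 16\ \text{bytes} \approx 18\ \text{petabytes},$$

so each additional particle roughly doubles the cost of an exact simulation.)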

From gazillions of atoms to complex molecules and biological systems, star systems, galaxies, and more, the universe essentially runs an advanced "simulation" of quantum mechanics, general relativity, and more—smoothly and seamlessly, without any apparent glitches. This makes me wonder: if simulating even a few atoms using the fundamental laws of quantum mechanics is computationally overwhelming, how does the universe manage to operate on such a vast scale?


EDITORIAL NOTE: originally I used the word "bug", but it led to misunderstandings, so it was finally replaced with the word "glitch." For a discussion on the conceptual difference between "glitch" and "bug" see this thread on Reddit.

15 Answers

49

There is no indication that a simulation would need to happen in real time. One second for us could take 10^32 years to render, and we would never know.
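(To unpack the arithmetic, which is purely illustrative: at a slowdown factor $s$, one simulated second costs $s$ seconds of host time, and

$$s = \frac{10^{32}\ \text{yr} \times 3.15 \times 10^{7}\ \text{s/yr}}{1\ \text{s}} \approx 3 \times 10^{39},$$

yet no observation made from inside the simulation could detect the factor.)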

infinitezero
30

The simulation hypothesis doesn't seem to be justified (it's a claim with no explanatory or predictive power), so that puts the question of the computational complexity of a simulated universe to rest.

As for a non-simulated universe: as one example, to simulate multi-body gravitational dynamics you need to do an ungodly amount of maths to calculate every force precisely. But in the real world, there is no such calculation. Each body exists as a distinct entity that bends spacetime and thereby influences the bodies around it. It's something that's actually happening, not something where every interaction between all the bodies has to be computed.
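To make "ungodly amount of maths" concrete, here is a toy sketch of the naive update a simulator has to perform at every time step; the code and the two-body example are illustrative only, not anything from the answer itself:

```python
import itertools

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity_step(bodies, dt):
    """One naive N-body time step: every pair of bodies is visited,
    so the work per step grows as O(N^2)."""
    acc = [[0.0, 0.0, 0.0] for _ in bodies]
    for i, j in itertools.combinations(range(len(bodies)), 2):
        # Displacement from body i to body j.
        d = [bodies[j]["pos"][k] - bodies[i]["pos"][k] for k in range(3)]
        r2 = sum(x * x for x in d) + 1e-9  # softening term avoids division by zero
        inv_r3 = r2 ** -1.5
        for k in range(3):
            acc[i][k] += G * bodies[j]["m"] * d[k] * inv_r3
            acc[j][k] -= G * bodies[i]["m"] * d[k] * inv_r3
    # Semi-implicit Euler: update velocities first, then positions.
    for body, a in zip(bodies, acc):
        for k in range(3):
            body["vel"][k] += a[k] * dt
            body["pos"][k] += body["vel"][k] * dt

# Two toy bodies with roughly Earth/Moon numbers, purely for illustration.
earth = {"m": 5.97e24, "pos": [0.0, 0.0, 0.0], "vel": [0.0, 0.0, 0.0]}
moon = {"m": 7.35e22, "pos": [3.84e8, 0.0, 0.0], "vel": [0.0, 1022.0, 0.0]}
for _ in range(10):
    gravity_step([earth, moon], dt=60.0)
```

Doubling the number of bodies quadruples the work per step, and this is only Newtonian gravity; the actual universe "gets" the full general-relativistic behaviour without performing any such bookkeeping.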

One might ask what's the ultimate foundational "stuff" that the universe is made of, that makes all of this work... but I don't know if that question even makes sense, never mind anyone actually having the answer. We know about atoms and spacetime and quantum fields and stuff, and that's about the deepest that our knowledge goes.

Those things work the way they do, and that's about all we can say at this point. We'd just be speculating about anything "below" that. Maybe in the future our knowledge will go deeper, but then that might just push the question down to whatever the next layer turns out to be.

NotThatGuy
18

You ask:

how does the universe manage to operate on such a vast scale?

Nomological constraints are not lines of software that execute. Under the current notions of physicalism (SEP), scientific laws (and that's not a term that has much currency these days) are descriptions of observed regularities, not edicts that some super intelligence has programmed into the universe. From the article:

Physicalism is, in slogan form, the thesis that everything is physical. The thesis is usually intended as a metaphysical thesis, parallel to the thesis attributed to the ancient Greek philosopher Thales, that everything is water, or the idealism of the 18th Century philosopher Berkeley, that everything is mental. The general idea is that the nature of the actual world (i.e. the universe and everything in it) conforms to a certain condition, the condition of being physical. Of course, physicalists don’t deny that the world might contain many items that at first glance don’t seem physical — items of a biological, or psychological, or moral, or social, or mathematical nature. But they insist nevertheless that at the end of the day such items are physical, or at least bear an important relation to the physical.

Thus, it is common for those who practice modern science to adopt the presumed metaphysical thesis that the universe is only physical stuff, in the sense of matter and energy or things that reduce to matter and energy, a relation often called supervenience (SEP). Eliminative materialists insist that mental things don't really exist, and property dualists prefer to think of mental things as properties of physical substances. But in all of these cases, the various sciences, which might best be described as a collection of naturalized epistemologies (SEP), largely embody the beliefs of physicalists, such as cosmologists, in describing how the universe is organized. Under these views, there is no physical evidence that the universe is a simulation, though it makes for good science fiction and philosophical discourse.

There are those, like Seth Lloyd, who advocate the view that the universe is computed (Edge.org) and have written books about it, and those, like John Archibald Wheeler with his Participatory Anthropic Principle, who push the idea that information is more fundamental than physical substance. This is often expressed as "it from bit", suggesting that physical substances are fundamentally information in composition. But these views go against modern thinking about the relationship between information, computation, and physical systems (SEP): computers are seen as abstractions of physical media.

Thus, you raise a great challenge to the simulation hypothesis. It seems overwhelming to imagine what sort of computer would be required to simulate even small sections of the physical universe, an objection Daniel Dennett has raised a number of times himself in works like Consciousness Explained. So the modern view is that "laws of science" are just descriptions of how physical substances behave; they do not determine that behavior. Some characterize the latter line of thinking as "mistaking the map for the territory", or call it the reification fallacy. To wit, from WP:

Reification (also known as concretism, hypostatization, or the fallacy of misplaced concreteness) is a fallacy of ambiguity, when an abstraction (abstract belief or hypothetical construct) is treated as if it were a concrete real event or physical entity.

J D
15

I actually agree with Lowri on this one, but I'll play Devil's advocate and argue the 'simulationist' viewpoint for fun.

If you were going to create a simulated universe on a computer, would the 'people' in the simulated environment have enough computational power (within the simulation) to replicate the universe that they exist in? I think it's obvious that they cannot, because that would mean they have as much computational power as it takes to run their own universe. So you say, OK, I'll provide them with that. But now they simulate another universe within theirs, and you need at least three times what you started with. And then that simulation within a simulation creates another universe. You can see where this is going: infinite computing power. So it should not be surprising for the occupants of a simulated universe to wonder at the computing power required to create their own.
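To put the regress in symbols (my notation, just for illustration): if the host machine has capacity $C_0$ and each faithful simulation requires the full capacity of the level above it, then $n$ levels of nesting require

$$C_{\text{total}} \geq (n+1)\,C_0 \to \infty \quad \text{as } n \to \infty,$$

which is the "infinite computing power" the argument lands on.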

Again, I don't really believe any of this, but I think it's fun to discuss.

I do want to recommend a book that I consider highly underrated: The Planiverse by A.K. Dewdney. It made me consider how our universe might appear to a 4-dimensional (spatial) being.

12

The universe is not running an advanced simulation. The events and items in the universe are what it is. There is nothing for the universe to "pull off." It could not do otherwise.

Lowri
12

This makes me wonder: if simulating even a few atoms using the fundamental laws of quantum mechanics is computationally overwhelming, how does the universe manage to operate on such a vast scale?

Simple. It's massively parallel. If we had machines with the same degree of parallelism (plus data capacity), we could create a simulation that would run just as fast (keeping the same level of detail). The only such machine we currently have is the universe itself.

You seem to be forgetting that in any simulation the level of detail matters. There is a trade-off between speed and the amount of data that can be processed (which is equivalent to the level of detail generated).
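As a toy sketch of that parallelism (my example, with an invented smoothing rule): when each site's next state depends only on its immediate neighbourhood, every site can be updated simultaneously, so with one worker per site the wall-clock cost of a time step stops depending on how big the simulated world is.

```python
from multiprocessing import Pool

def local_update(neighbourhood):
    """Next state of one site, computed from its own neighbourhood only."""
    left, here, right = neighbourhood
    return (left + here + right) / 3.0  # toy local smoothing rule

def step(state, pool):
    """One global time step, computed site by site in parallel."""
    padded = [state[0]] + state + [state[-1]]  # clamp the boundaries
    jobs = [tuple(padded[i:i + 3]) for i in range(len(state))]
    return pool.map(local_update, jobs)

if __name__ == "__main__":
    state = [0.0] * 10 + [100.0] + [0.0] * 10  # one hot spot in a cold field
    with Pool() as pool:
        for _ in range(5):
            state = step(state, pool)
    print(state)
```

On this picture the universe is the limiting case: one "worker" per degree of freedom, which is exactly the machine we cannot build.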

(In our universe we are used to seeing a particular level of detail. No measurement has infinite precision. If it suddenly turned out that all physical constants, when measured with much greater precision than now, ended in long strings of 0's, we should get a bit worried... But I'm sure that would then justify funding the SESI project, the Search for Extra-Simulation Intelligence, exploring the outer fringes of the decimal expansion of π and trying to decipher any messages contained there. Of course, philosophers always want to kick up some trouble, so we would get arguments saying that π, despite being transcendental, does not contain messages; we should look in e, which is far more likely to contain a hidden message since it's connected to "life" processes. The physicists would keep saying, however, that we should increase the precision of the measurements again and see what we get. Perhaps the 0's will turn out to be a fluke.)

For an early exploration of how to make universe-creating/simulating machines and how to become gods, see Stanisław Lem's hilarious book Summa Technologiae.

mudskipper
6

This question assumes a privileged view of the time dimension of the universe, i.e. that the universe evolves over time in a causal, step-by-step fashion that mimics our perception of time and causality.

There are ways this premise could be false. Perhaps everything that ever happens, ever has happened, and ever will happen is simply one fixed solution to a system of equations. There is no change that needs simulation. We merely perceive different aspects of this reality at different times and places.
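A minimal illustration of the difference (mine, not part of the answer): a simulator integrates $\dot{x} = v$ forward step by step, but on the "fixed solution" view the whole trajectory simply exists at once,

$$x(t) = x_0 + v\,t \quad \text{for every } t,$$

with nothing left to compute forward in time.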

usul
5

The simulation doesn't need to simulate the entire universe's gazillion particles, quantum mechanics, general relativity, and all the bells and whistles. It only needs to simulate enough to convince the simulant (referring to itself here as user80226) that it's real.

Relatively speaking, it doesn't need to simulate much at all; it need not even simulate your local environment, just your mind and your perceptions.

Lag
4
  1. In former times the favorite answer to your question “how does the universe manage to operate on such a vast scale?” was the theistic answer:

    • A divine intelligence operates the universe
    • or a divine intelligence created a universe with an intrinsic process mechanism.
  2. But this kind of answer does not rest on a sound philosophical argument, because it prompts the subsequent question: is there a divine intelligence and, if yes, how does it operate?

    From the viewpoint of science the original question has no answer. Possibly the question is ill-posed.

Jo Wehler
3

If you wanted to simulate how different objects interact, you'd need to figure out what happens, you'd need to formalize that, and ultimately you'd need to compute how every object interacts with every other object at each time step.

Whereas if you just wanted to know how the real-world objects interact, all you'd have to do is wait.

It's like the difference between describing what you see in its full glory and actually seeing it. The latter takes literally the blink of an eye; the former could take years and fill tomes, depending on the level of detail.

Or suppose you pick up a pen and draw random lines on a piece of paper. That might take you a few minutes, but if you gave the result to experts and tasked them with recreating a perfect copy of it, down to the level of the atoms, that might take forever or not be possible at all.

So perfect replications or simulations are often a lot more difficult than just doing the real thing.

And with regard to the computational complexity of simulation, you might also look at how we deal with that already. To save resources, you can tune down the complexity: rather than tracking interactions between gazillions of particles, you abstract, say, an entire planet to a single (x, y, z, mass) tuple, as sketched below. That is a less fine-grained resolution. Or you could localize the observation and only compute what is directly visible, treating the rest as a probability distribution, if at all. Also, the "hardware" is obviously outside of the universe, so why should it be limited by the restrictions of the universe? Our computers are like Turing machines built inside Minecraft: rudimentary and underpowered compared with the modern gaming PC running them, which is millions of times bigger and faster than anything recreated within the program.
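Here is a minimal sketch of that (x, y, z, mass) abstraction, with made-up numbers: collapse a cloud of particles into one point mass at its centre of mass and simulate the single point instead.

```python
def coarse_grain(particles):
    """Replace many (mass, position) particles by one point mass
    located at their mass-weighted mean position."""
    total_mass = sum(m for m, _ in particles)
    centre = tuple(
        sum(m * pos[k] for m, pos in particles) / total_mass
        for k in range(3)
    )
    return total_mass, centre

# Three rocks stand in for a planet's ~1e50 atoms.
rocks = [(2.0, (0.0, 0.0, 0.0)),
         (1.0, (3.0, 0.0, 0.0)),
         (1.0, (0.0, 4.0, 0.0))]
print(coarse_grain(rocks))  # (4.0, (0.75, 1.0, 0.0))
```

From far enough away the single point is a good gravitational stand-in for the whole cloud, which is the sense in which lowering the resolution buys back computing power.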

So the computer running the universe (if such a thing exists) could have dimensions that vastly exceed the universe that it is running. And it doesn't need to be just one computer; each particle could be its own computer in a network, interacting with the others directly or via a server or whatnot.

Though with regard to simulations and reality, you would get into trouble with continuity. No matter how powerful your simulating machine is, if you zoomed in closer and closer, at some point you'd see pixels, or quanta of time, because the complexity of the universe scales with the number of particles and the resolution of their interactions, and that gets HUGE really quickly. Whereas if things aren't computed but just are, and interact with each other directly rather than through a central processing unit keeping track of everything, then this is far less of a problem, and it requires only the universe itself and no additional hardware.

haxor789
2

If we want to think of the universe as a computer, then we ought to think of it as a massive-scale quantum computer. In other words, if we pour an acid solution into a base, then each atom in the acid, base, and solvent is a separate qubit, all resolving their superpositions in real time. If and when we create a quantum computer with an Avogadro's number of qubits (as opposed to the 1,000 or so qubits our current machines max out at), then we'll start being able to simulate simple chemical interactions.
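For a sense of the gap (my arithmetic, not the answer's): a mole of qubits exceeds today's machines by roughly twenty orders of magnitude, and the classical state space such a machine would explore is vaster still:

$$\frac{N_A}{10^{3}} \approx \frac{6.0 \times 10^{23}}{10^{3}} = 6.0 \times 10^{20}, \qquad \dim \mathcal{H} = 2^{N_A} \approx 10^{1.8 \times 10^{23}}.$$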

Ted Wrigley
2

the universe essentially runs an advanced "simulation" of quantum mechanics, general relativity, and more—smoothly and seamlessly, without any apparent bugs.

The universe isn't hardware with physics as the software it runs. In no sense is the universe a simulation; you've merely asserted the case without any argument for it. Materially speaking, the universe is its physics. There is no meaningful distinction between the two.

That software can simulate a part of the universe does not entail that the universe is also a simulation. Another way of putting it is that a photograph of you is not you.

Mozibur Ullah
2

You have to remember that all computers we have built are a very tiny part of the universe, a very tiny part of the galactic cluster where we are, a very tiny part of the galaxy, a very tiny part of the solar system, a very tiny part of one planet.

Consider this: the mass of our universe is about 10^53 kg. The electrical parts of a single computer (CPU, memory, flash storage chips) weigh a few hundred grams. Sure, there are mechanical parts like cooling fins and the case, but they don't take part in the computation.

So our universe is more than 53 orders of magnitude more massive than a computer.

I don't find it surprising at all that a computer has trouble running physics simulations, since it's so small. You can't expect such a tiny part of our universe to run useful simulations of the whole.

Another problem is that our computers aren't quantum. Semiconductor physics is affected by quantum mechanics, but the quantum effects are not utilized in the computations. If our universe is a simulation, it's most likely running on something quantum.

When we finally have working quantum computers, we may see a quantum computer simulate something small, like predicting how certain medicines work chemically.

But if the mass of the computing parts (not the cooling parts) of a quantum computer is measured in kilograms, it's still true that our universe is 53 orders of magnitude bigger. So while a small quantum computer can simulate small quantum systems, you can't, for example, expect such a tiny part of our universe to simulate the entire universe.

juhist
1

An emulation is a crutch: it is walking with ice skates on concrete; it is making a vegan omelette. The universe, by contrast, is running natively on bare metal. Have you ever emulated an 80-bit floating-point division on integer-only hardware?

That's the difference, just much more fundamental (after all, both ints and floats are still binary numbers, that is, very similar!).
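In case you haven't: here is a drastically simplified sketch of the idea (mine, purely illustrative), dividing two numbers stored as mantissa × 2^exponent using only integer operations. The real 80-bit case also needs signs, rounding modes, and NaN/infinity handling, which is exactly why emulation is slow.

```python
def int_only_divide(m1, e1, m2, e2, bits=64):
    """Divide (m1 * 2**e1) by (m2 * 2**e2) with integer ops only.

    Pre-scaling the numerator by 2**bits keeps `bits` bits of
    precision in the integer quotient; the shift is paid back
    in the result's exponent.
    """
    quotient = (m1 << bits) // m2
    return quotient, e1 - e2 - bits

# 1.5 / 0.5, where 1.5 = 3 * 2**-1 and 0.5 = 1 * 2**-1
m, e = int_only_divide(3, -1, 1, -1)
print(m * 2.0 ** e)  # 3.0
```

One native divide instruction becomes a pile of shifts, integer divides, and exponent bookkeeping; on this answer's metaphor, the universe never pays that kind of overhead.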

1

"...without any apparent bugs."

I think this is a useful statement with which to tease apart what's happening. When a computer program has a "bug", it means it's not working as intended. Some unforeseen circumstance (unpredicted inputs, programmer error, a mechanical fault in the computer's hardware) causes the program to work in a way the user or programmer didn't intend. But given a more thorough understanding of the state of the computer, nothing about the bug will be a mystery.
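A concrete miniature of that point (my example, not the answerer's): behaviour that looks like a glitch dissolves once you inspect the machine's actual state.

```python
# Looks like a glitch, but nothing is mysterious: 0.1 and 0.2 have no
# exact binary representation, so IEEE 754 stores the nearest doubles,
# and their sum differs from 0.3 in the last bit.
print(0.1 + 0.2 == 0.3)     # False
print(f"{0.1 + 0.2:.20f}")  # 0.30000000000000004441
print((0.1 + 0.2).hex())    # 0x1.3333333333334p-2
```

Given the bit-level state, the oddity is ordinary lawful behaviour; "bug" is a claim about intent, not about the machine.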

With this in mind, what would a "bug" in physics even be? You'd need to know what the universe and its physics were intended for before you could say some phenomenon in it is a bug. You might propose instead that a "bug" in physics is when something breaks known physical laws, but this doesn't really work either: it doesn't prove physics broke, it proves our model of physics was incomplete, and a more thorough understanding of reality would explain it fine. There's a kind of "God of the gaps" thing going on here.

Ryan_L