For some physicists, the simulation hypothesis is nothing but pseudoscience


In recent years, the simulation hypothesis has aroused some enthusiasm among the general public. This hypothesis, which proposes that our universe is the product of a computer simulation run by a higher civilization, counts some renowned supporters, such as Elon Musk and the philosopher of science Nick Bostrom. However, not all physicists take it seriously. This is notably the case of the theoretical physicist, author and YouTuber Sabine Hossenfelder, for whom the hypothesis is mostly pseudoscience.

According to the simulation hypothesis, everything we experience was coded by an intelligent being, and we are part of that computer code. That we live in some kind of computation is not, in itself, unscientific. For all we know today, the laws of nature are mathematical, so one could say that the Universe is simply computing those laws. You might find this terminology a bit odd, but it is not controversial.

The controversial part of the simulation hypothesis is that it assumes that there is another level of reality where someone or something controls what we believe to be the laws of nature, or even interferes with those laws. The belief in an omniscient being who can interfere with the laws of nature, but for some reason remains hidden from us, is a common element of monotheistic religions. But those who believe in the simulation hypothesis claim that they arrived at their belief by reason.


Philosopher Nick Bostrom, for example, argues that we likely live in a computer simulation, based on a three-part argument: if a) there are many civilizations, and b) those civilizations build computers that run simulations of conscious beings, then c) there are many more simulated conscious beings than real ones, so we are likely to live in a simulation ourselves. Elon Musk is among those convinced by this reasoning.
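The counting step behind this argument can be sketched numerically. The figures below are purely hypothetical placeholders (the argument does not depend on their exact values): as soon as each civilization runs many populous simulations, simulated beings vastly outnumber real ones.

```python
# Toy illustration of Bostrom's counting argument.
# All numbers are hypothetical; only the ratio matters.

real_civilizations = 1_000          # assumed number of "base-level" civilizations
beings_per_civilization = 10**9     # assumed conscious beings per civilization
sims_per_civilization = 100         # assumed simulations each civilization runs
beings_per_simulation = 10**9       # assumed conscious beings per simulation

real = real_civilizations * beings_per_civilization
simulated = real_civilizations * sims_per_civilization * beings_per_simulation

fraction_simulated = simulated / (simulated + real)
print(f"Fraction of conscious beings that are simulated: {fraction_simulated:.4f}")
# With these numbers, the fraction is 100/101, i.e. about 0.99.
```

Hossenfelder's objection, developed below, is not with this arithmetic but with the hidden premise that such simulations are feasible at all.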

How could physical laws be reproduced on a computer?

The problematic part of Bostrom's argument is that it assumes it is possible to reproduce all of our observations using not the natural laws that physicists have confirmed to extremely high accuracy, but a different underlying algorithm that the programmer runs. He implicitly assumes that reproducing the foundations of physics is easy.

But no one currently knows how to reproduce general relativity and the Standard Model of particle physics with a computer algorithm running on some kind of machine. A computer simulation can approximate the laws we know (physicists do this all the time), but if that were how nature actually worked, we would be able to see the difference.

The simulation hypothesis requires an algorithm capable of reproducing general relativity and the Standard Model of particle physics. However, we know of no process capable of reproducing such laws.

Indeed, physicists have looked for signs that natural laws are executed step by step, like a computer code, but the search has come up empty. Such a difference would be detectable because attempts to reproduce natural laws algorithmically are generally incompatible with the symmetries of Einstein's theories of special and general relativity.
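One standard way to see the tension is through dispersion relations. A wave equation discretized on a lattice with spacing a no longer propagates all wavelengths at the same speed: the lattice dispersion ω(k) = (2c/a)·|sin(ka/2)| deviates from the continuum ω = c·|k| at short wavelengths, whereas special relativity requires light of every wavelength to travel at c. The sketch below is a minimal numerical illustration, assuming a hypothetical 1-D lattice with unit spacing, not a claim about any actual discretization scheme.

```python
import math

c = 1.0  # speed of light in natural units
a = 1.0  # hypothetical lattice spacing

def omega_continuum(k):
    # Continuum (Lorentz-invariant) dispersion: all wavelengths travel at c
    return c * abs(k)

def omega_lattice(k):
    # Dispersion of the wave equation discretized on a 1-D lattice
    return (2 * c / a) * abs(math.sin(k * a / 2))

for k in (0.1, 1.0, 2.0, 3.0):
    cont = omega_continuum(k)
    lat = omega_lattice(k)
    # The ratio is the phase speed relative to c; it drops below 1
    # as the wavelength approaches the lattice spacing.
    print(f"k={k}: continuum ω={cont:.3f}, lattice ω={lat:.3f}, "
          f"speed ratio={lat / cont:.3f}")
```

At long wavelengths (small k) the ratio is close to 1 and the discretization is invisible; near the lattice scale it falls well below 1, a wavelength-dependent speed of light that experiments have looked for and not found.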

The problem of distance scales

A second problem with Bostrom's argument is that, for it to work, a civilization must be able to simulate many conscious beings, and those conscious beings will in turn try to simulate conscious beings, and so on. This means the information we think the Universe contains would have to be compressed. So Bostrom has to assume that it is somehow possible to skimp on the details in the parts of the world that no one is currently watching, and to fill them in whenever someone looks.


Again, this doesn't explain how it is supposed to work. What kind of computer code could actually do this? What algorithm can identify conscious subsystems and their intentions, then quickly fill in the required information without ever producing an observable inconsistency? That is a much harder question than Bostrom seems to suggest.

Climate models are a prime example. We currently do not have the computational capacity to resolve distances below about 10 kilometers, but we cannot simply ignore the physics below that scale. The climate is a non-linear system, so information from short scales propagates to large scales. If we cannot compute the short-scale physics, we have to replace it with something.
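The point that unresolved small scales feed back on large scales can be illustrated with a much simpler non-linear system than a climate model. The sketch below uses the logistic map (a standard toy chaotic system, not anything from climate science): a detail far below any plausible "resolution" quickly grows until it dominates the large-scale state.

```python
def logistic(x, r=4.0):
    # Logistic map at r=4, a standard fully chaotic toy system
    return r * x * (1 - x)

x_true = 0.3             # "true" state of the system
x_coarse = 0.3 + 1e-12   # same state with an unresolved detail dropped

max_diff = 0.0
for step in range(50):
    x_true = logistic(x_true)
    x_coarse = logistic(x_coarse)
    max_diff = max(max_diff, abs(x_true - x_coarse))

# The initial 1e-12 discrepancy grows roughly exponentially,
# so after 50 steps it has long since become macroscopic.
print(f"Largest divergence over 50 steps: {max_diff:.6f}")
```

A simulator who "doesn't care much about the details" is, in effect, perturbing states like this constantly, and in a non-linear system those perturbations do not stay small.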


Getting this replacement right, even roughly, is a big puzzle. And the only reason climatologists get it approximately right is that they have observations against which to check whether their approximations are working. If you have only one simulation, like the programmer of the simulation hypothesis, you cannot do that.

Video by Sabine Hossenfelder on the computer simulation hypothesis:
