The quotes from other researchers in this are all too diplomatic and leave me wanting a post on it from @skdh https://t.co/E1qYwasjY2

— Chad Orzel (@orzelc) January 19, 2017

An article in Physics World promoted an April 2016 paper by Josset, Perez, and Sudarsky recently published in PRL,

Dark energy as the weight of violating energy conservation,

which claims that the apparently observed cosmological constant is just the accumulated amount of energy that was created when Nature violated the energy conservation law – and that's supposed to make things more natural.

The 97% crackpot Lee Smolin praised the idea as a speculative approach in the best possible sense that is revolutionary if true. The 60% crackpot George Ellis said that the proposal was viable and no more fanciful than what's being explored by contemporary theoretical physicists – his English isn't as good as mine so I had to improve this man's prose.

Orzel found these comments too diplomatic and, as a "progressive" (a far left whacko), he decided to look for the best possible debunker with the only politically correct number of penises (zero) who should debunk this stuff: Sabine Hossenfelder.

Unfortunately, the politically correct number of penises often has additional consequences that Orzel must have overlooked. So instead of debunking the stuff, Hossenfelder wrote an essay saying

Yes, a violation of energy conservation can explain the cosmological constant.

Yes, geniuses in the NASA basements could have constructed a self-propelling spaceship, too. Neither of these two closely related claims sounds convincing, however. It's not surprising that Hossenfelder's attitude isn't too far from Smolin's – after all, the two happily collaborated for quite some time.

I think that these cheap ideas show the deterioration of the kind of "theoretical physics surrounding quantum gravity" that is manifested whenever the researchers are allowed not to be experts in string theory. Without string theory, thinking about the physical phenomena going beyond the effective field theory picture almost unavoidably reduces to pure speculation and random sacrifices of cherished principles. These people may be good at throwing important things into the trash can – string theory, the energy conservation law etc. – but they never have anything good to replace them with.

What are Josset et al. doing? First, Einstein's equations of general relativity say\[

R_{\mu \nu} - \tfrac{1}{2}R \, g_{\mu \nu} + \Lambda g_{\mu \nu} = \frac{8 \pi G }{c^4} T_{\mu \nu}.

\] It's an equation for a symmetric tensor. Can't these components be split into several pieces? Yes, there is a natural enough way to split the equations into two pieces: the trace and the rest, i.e. the traceless part.

If you take the trace of Einstein's equations above, i.e. its product with \(g^{\mu\nu}\) summed over the two indices, you will get\[

R(1-D/2) + D\Lambda = \frac{8\pi G}{c^4} T

\] where the spacetime around us has \(D=4\). Here, \(R\) and \(T\) are the traces of \(R_{\mu\nu}\) and \(T_{\mu\nu}\), respectively. You may calculate the traceless part by subtracting \(g_{\mu\nu}/D\) times the trace from the original Einstein's equations. For \(D=4\), we get:\[

R_{\mu \nu} - \tfrac{1}{4}R \, g_{\mu \nu} = \frac{8 \pi G }{c^4} (T_{\mu \nu} - \tfrac{1}{4} Tg_{\mu\nu})

\] You may derive the trace and traceless part of Einstein's equations from the principle of least action. How? You simply consider separate variations of \(g_{\mu\nu}(x^\alpha)\) that preserve the determinant of the metric at each point (you get the traceless equations in this way); and the variation of the scalar factor (that's how you get the equation for the trace).
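This split is easy to sanity-check numerically. Here is a minimal sketch (my own toy check, not anything from the paper): take a generic symmetric tensor in \(D=4\) with the Minkowski metric, subtract \(g_{\mu\nu}/D\) times its trace, and verify that the result is traceless and that the two pieces recombine to the original tensor:

```python
import numpy as np

# Minkowski metric in D = 4, signature (-, +, +, +)
D = 4
g = np.diag([-1.0, 1.0, 1.0, 1.0])
g_inv = np.linalg.inv(g)

# a generic symmetric tensor standing in for either side of Einstein's equations
rng = np.random.default_rng(0)
A = rng.standard_normal((D, D))
E = A + A.T

trace = np.einsum('mn,mn->', g_inv, E)   # the trace  g^{mu nu} E_{mu nu}
E_traceless = E - (trace / D) * g        # subtract g_{mu nu}/D times the trace

# the traceless piece has vanishing trace, and trace + traceless recombine
print(abs(np.einsum('mn,mn->', g_inv, E_traceless)) < 1e-12)  # True
print(np.allclose(E_traceless + (trace / D) * g, E))          # True
```

Nothing deep is happening here – it's just the linear algebra of the decomposition – but it makes the bookkeeping of the trace and traceless parts explicit.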

Fine. The "reduced" Einstein's equations where you only keep the traceless part – obtained from varying the metric while preserving its determinant – define the theory known as unimodular gravity.

Great. So the cosmological constant doesn't appear in the traceless part at all. It only affects the trace part. The covariant divergence \(\nabla^\mu\) of the original Einstein's equations vanishes identically: the covariant derivative of the metric (and therefore of the cosmological constant term) is identically zero, much like the covariant divergence of the Einstein tensor, by the Bianchi identity. The covariant divergence of the stress-energy tensor vanishes if the equations of motion for the matter fields are obeyed.

But when you treat the trace and traceless equations separately, this automatic vanishing of the covariant derivative disappears. So you may decide not to impose the trace part of Einstein's equations (which contains \(\Lambda\)) at all. Instead, you may try to derive this condition from \(\nabla^\mu T_{\mu\nu}=0\). But once you have this continuity equation, you may change it to\[

\nabla^\mu T_{\mu\nu} = J_\nu = \nabla_\nu Q

\] There's a current \(J_\nu\) which measures the violation of the energy conservation law and you may also decide that it should be equal to some gradient of some scalar \(Q\). Great. There is of course no justification for any of these things. You are just randomly abolishing some equations of physics. There is no good conceivable source for any nonzero \(Q\) – Josset et al. therefore enumerate some of the crackpots' most popular ways to bastardize physics, namely modifications of quantum mechanics and spacetime discreteness at the Planck scale.

None of these things is really consistent, let alone well-defined according to some known rules, so these excuses are equivalent to enumerating some tooth fairies and Harry Potters, but they don't care.

You can see that the effect is nothing else than some hypothesized contribution to \(T_{\mu\nu}\) that, after some time, becomes equivalent to some component of some \(Q\).
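As a toy illustration of that point, consider a dust-filled FRW sketch where the continuity equation leaks energy at a constant rate \(q\) and the leak is booked as an effective cosmological constant. The units, the fixed Hubble rate, and the identification \(\Lambda_{\rm eff} = \int q\, dt\) are all illustrative choices of mine, not anything taken from Josset et al.:

```python
# Toy FRW sketch: matter obeys a leaky continuity equation
#   rho' + 3 H rho = -q,
# and the leaked energy is booked as an effective cosmological constant.
# H, q, dt and the normalization of Lambda_eff are illustrative choices.
H, q, dt = 1.0, 0.01, 1e-3
rho, Lam = 1.0, 0.0
for _ in range(1000):                  # forward Euler up to t = 1
    rho += dt * (-3.0 * H * rho - q)   # matter density drains a bit faster
    Lam += dt * q                      # Lambda_eff is just the integrated leak

print(round(Lam, 6))  # 0.01 -- exactly q * t, whatever mechanism drove the leak
```

The point of the toy is that \(\Lambda_{\rm eff}\) knows nothing about the mechanism: any source \(Q\) with the same time integral gives the same "prediction", which is exactly why the construction explains nothing.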

There is a way to improve this whole theory and eliminate all the nonsensical "violations of the physical laws" while keeping the predictions exactly the same. You just postulate that \(\Lambda\), the cosmological constant, is no longer constant. Instead, it is some variable field – quintessence, if you wish – whose value may evolve due to the very same effects that were driving \(Q\) above. Great, so some mysterious effects – tooth fairy, violations of postulates of quantum mechanics, global warming, discreteness of the spacetime, or any pseudoscience that your New Age religious cult considers fashionable right now – just make the current value of the vacuum energy density \(\Lambda\) equal to \(10^{-123}\) in the Planck units.
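The \(10^{-123}\) itself is easy to reproduce as an order-of-magnitude exercise. A quick back-of-the-envelope in Python, with rough SI values that I am plugging in myself:

```python
import math

G    = 6.674e-11           # Newton's constant, m^3 kg^-1 s^-2
c    = 2.998e8             # speed of light, m/s
hbar = 1.055e-34           # reduced Planck constant, J s
H0   = 67.7e3 / 3.086e22   # Hubble constant, ~67.7 km/s/Mpc, in 1/s
Omega_Lambda = 0.69        # dark-energy fraction of the critical density

rho_crit   = 3 * H0**2 / (8 * math.pi * G)   # critical density, kg/m^3
rho_vac    = Omega_Lambda * rho_crit         # observed vacuum energy density
rho_planck = c**5 / (hbar * G**2)            # Planck density, kg/m^3

# the observed vacuum energy density in Planck units, order of magnitude
print(round(math.log10(rho_vac / rho_planck)))  # -123
```

Any proposed mechanism has to hit this number; merely relabeling it as "accumulated violations" doesn't predict it.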

Have you solved anything? I don't think so. You have just *parameterized* the problem in some way and added some implausible supernatural phenomena as "possible" explanations of the problem. But you haven't actually made any steps towards proving that the problems have been solved in any way. And you haven't provided the readers with any evidence that your *additional hypotheses* are right. So you have just confined yourself into a less likely axiomatic system than the one you started with. The problem is worse than it was before you tried to do something.

If I quote Feynman's famous cargo cult commencement speech:

There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.

That's exactly what Josset et al. – and many others – don't achieve. They determine that if a tooth fairy is behind the cosmological constant, she must have \(10^{123}\) times longer teeth than her legs. But it's just a parameterization of the pre-existing problems within a particular framework that makes extra assumptions: nothing new comes out right from the "theory". The theory is scientifically worthless – it's just a way to decorate the numbers and problems with some arbitrary new words and visions. But science must also produce results, i.e. reduce the number of unexplained independent mysteries, parameters, or observations.

At Hossenfelder's blog, Haelfix wrote:

It's not entirely settled whether unimodular gravity differs from GR's prediction at the quantum level. This goes back and forth endlessly in the literature.

At the very least, it's not clear what you gain when trying to solve the cosmological constant problem. There is still a fine-tuning problem; the difference is – they say – that there is only one number to explain, and not an entire renormalization tower of unknown physics which tends to drag you (order by order) towards a Planckian value.

Amen to that.

It's questionable whether you may separately modify the trace and non-trace parts at all, especially in quantum gravity. Needless to say, you need a really consistent theory of quantum gravity to approach any such question really meaningfully. String theory – the only known and probably the only mathematically possible framework that achieves that – doesn't allow you to treat the trace separately from the rest: there is no "unimodular gravity" according to string theory. Outside string theory, there are no well-defined rules, which is one way to understand why the people answering these questions without string theory can't agree about the answer.

(All the "critics" love to say all the nonsense about string theory's making no predictions. Unimodular gravity is just one example among many: Without string theory, you can't say anything about its validity. With string theory, you can make things sharp. If string theory is right, unimodular gravity is not. String theory answers all such qualitative "are you allowed to modify this or that" questions.)

But Haelfix pragmatically measures the "progress" in the second paragraph. The fine-tuning problem for the cosmological constant isn't really solved because you haven't explained why the apparent cosmological constant has converged to the observed tiny value after all these uncontrollable violations of the conservation law. And when the cosmological constant problem is approached consistently, it differs from problems with non-renormalizability of theories because the vacuum energy term is maximally "relevant" (dimension zero operator) so it doesn't produce a tower of high-dimension operators.

In effective field theory, you just need to adjust one term, the bare cosmological constant, and everything is fine. It's still true that you need such an adjustment in the fairy-tale by Josset et al. They haven't improved anything that is linked to the predictions. The ultimate question is why a theory that goes beyond the effective field theory – e.g. string theory – where the cosmological constant isn't adjustable is predicting the value we are observing. The story by Josset et al. isn't helping to solve this actual problem at all. It just parameterizes the problem in terms of some specific hypothesized tooth fairies that don't seem to be helpful in making things better.

By the way, Hossenfelder gave a "cute" response to Haelfix's comment:

And, yes, what Haelfix says above is correct, there is a long back and forth in the literature about whether or not quantizing unimodular gravity helps with the cosmological constant problem by taming vacuum fluctuations, but the calculations in the paper above don't depend on the quantization.

Calculations in a paper may be independent of effects in quantum gravity, but if quantum gravity prohibits assumptions or results of the calculation (such as unimodular gravity as an allowed inequivalent theory), then the calculations in the paper are clearly irrelevant for, or inapplicable to, our Universe, which follows quantum, not classical, laws in the end. In other words, the paper is strictly worthless because the approximation it uses breaks down for the very purpose it's used for.

I feel some *déjà vu*. When the cosmological constant problem was considered the hottest problem in physics around 2000, many solutions were proposed and the "violation of the energy conservation" was probably one of them. I can't remember who proposed it at that time and I don't think it's important or that he or she deserves some credit. But this is just another example showing that if someone tries to do research in physics without looking for any actual "laws", and without taking some "laws" seriously enough, he won't see any progress. He will just randomly oscillate back and forth – without any way to determine the positive and negative directions – in the landscape of speculations. This is not what science should do, which is why all the people should be expected to know the state-of-the-art framework to address all such questions, namely string theory. It could still be possible that someone finds "something else" or a "problem with string theory". But no one should be supported for some Brownian motion in the landscape of speculations.
