
This puzzle is known as the strong CP problem.

In 1977, Roberto Peccei and Helen Quinn showed that the very small value of θ can be explained by the spontaneous breaking of an additional global symmetry they introduced. Around the same time, Steven Weinberg and Frank Wilczek independently showed that the Peccei-Quinn mechanism implies the existence of a very light neutral particle, the axion. It has been actively searched for ever since, so far to no avail.

One of the methods for the experimental detection of axions. Shown in blue is the estimated flux of axions emitted by the Sun; these are converted in the Earth's magnetic field (red) into X-rays (orange), which could be detected by the XMM-Newton space X-ray telescope. It is still unknown where to look for axions: they could be particles of dark matter or manifest themselves in the evolution of stars. Image © University of Leicester from eurekalert.org

– So, the Standard Model has too many free parameters, and their values look unmotivated and widely scattered. But what does naturalness have to do with it?

S.T.: We have just come to it. In elementary particle physics, the principle of naturalness of theoretical models has a very specific meaning. It requires that all dimensionless free parameters either be exactly zero or be of order unity, say, lie in the range from one-thousandth to a thousand. The parameters of the Standard Model clearly do not meet this criterion. But there is an additional condition, formulated in 1980 by the remarkable Dutch theoretical physicist Gerard 't Hooft, one of the founders of the Standard Model. He postulated that a very small value of a free parameter receives a natural explanation only if setting it strictly to zero gives rise to an additional symmetry obeyed by the equations of the theory. According to 't Hooft, the "proximity" of such a symmetry serves as a kind of shield that protects the smallness of this parameter from large corrections caused by quantum processes involving virtual particles. When I was a student and graduate student, our whole field was literally in bloom with this postulate. But it is still a weakening of the naturalness principle we are discussing.
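
To make the criterion concrete, here is a minimal Python sketch of the "absolute" naturalness window just described (the parameter values are standard, commonly quoted approximate figures, not numbers taken from the interview):

```python
# Absolute-naturalness check: is a dimensionless parameter within ~[1e-3, 1e3]?
def is_natural(value, low=1e-3, high=1e3):
    return low <= abs(value) <= high

# Illustrative, commonly quoted approximate values (not from the interview).
params = {
    "top Yukawa coupling": 1.0,          # of order one: natural
    "electron Yukawa coupling": 2.9e-6,  # far below the window
    "theta (strong CP angle)": 1e-10,    # experimental upper bound
}

for name, value in params.items():
    verdict = "natural" if is_natural(value) else "unnatural"
    print(f"{name}: {value:g} -> {verdict}")
```

't Hooft's technical naturalness then asks a further question about each entry that fails the test: does setting it exactly to zero enlarge the symmetry of the theory? For the electron Yukawa coupling it does (a chiral symmetry appears), so its smallness counts as technically natural; for θ no such symmetry is evident within the Standard Model itself, which is the strong CP problem mentioned above.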

Gerard 't Hooft, Dutch theoretical physicist, one of the founders of the Standard Model. Photo from the site sureshemre.wordpress.com

The requirement that all free dimensionless parameters of the Standard Model, as well as the ratios of its dimensional parameters, be of order unity is usually called absolute naturalness; 't Hooft's naturalness, by contrast, is called technical (J. D. Wells, 2013. The utility of Naturalness, and how its application to Quantum Electrodynamics envisages the Standard Model and Higgs boson). The most radical supporters of absolute naturalness believe that the number of free parameters should also not significantly exceed one. According to this point of view, a theory that falls noticeably short of the ideal of absolute naturalness is manifestly incomplete and doomed to be replaced by a truly fundamental model of the physical world.

– What happens if you go beyond the Standard Model?

S.T.: Here, too, a problem of naturalness arises, albeit of a different kind. The most important dimensional parameter of the Standard Model is the vacuum expectation value of the Higgs field. It sets the energy scale of the electroweak interaction, and the particle masses depend on it. Outside the Standard Model there is a single equally fundamental parameter of the same dimension. This is, of course, the Planck mass, which sets the energy scale of the quantum effects associated with gravity. The Higgs field vacuum value is about 250 GeV, roughly twice the mass of the Higgs boson. The Planck mass is approximately 10^19 GeV. So their ratio is either a very small or a gigantic number, depending on what you put in the numerator and what in the denominator. True, other interesting scales beyond the Standard Model are also discussed, but they too are immeasurably larger than the Higgs field value. So here, too, we are dealing with an obvious strangeness, in other words, with a lack of naturalness.
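
A quick back-of-the-envelope calculation makes this strangeness concrete (a minimal sketch using the commonly quoted values of the two scales):

```python
# Ratio of the two fundamental dimensional scales discussed above (in GeV).
higgs_vev_gev = 246.0      # vacuum expectation value of the Higgs field
planck_mass_gev = 1.22e19  # Planck mass

ratio = higgs_vev_gev / planck_mass_gev
print(f"v / M_Pl ~ {ratio:.0e}")        # ~2e-17
print(f"M_Pl / v ~ {1.0 / ratio:.0e}")  # ~5e+16
```

Either way the number is wildly far from unity, which is exactly the lack of naturalness at issue.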

– So maybe it is better to regard the naturalness principle as a relic of twentieth-century science and abandon it altogether? After all, some scientists already speak of the onset of a post-naturalness era.

S.T.: Well, even a complete renunciation will not solve all our problems. As I said, the principle of naturalness is something from the realm of aesthetics. But there are also experimental problems, and they will not go anywhere. For example, it is now known for certain that neutrinos have mass, while the symmetries of the Standard Model require it to be strictly zero. The same goes for dark matter: the Standard Model has no place for it, yet in reality it apparently exists. It is possible that if the experimental difficulties can be reasonably resolved, nothing will have to be abandoned. But, I repeat, this whole complex of problems is quite real and points to the crisis nature of the current situation in fundamental physics. It is possible that the way out of this crisis will be a scientific revolution and a change of the existing paradigm.

– Sergei, what does the principle of naturalness mean for you personally? Perhaps even emotionally?

S.T.: For me it is, in a sense, a principle of computability. Can we not just take all these 19 parameters from experiment, but calculate them? Or at least reduce them to a single truly free parameter? That would suit me fine. But so far no such possibility is in sight. By the way, at one time many hoped that the main difficulties of the Standard Model could be sorted out on the basis of supersymmetry. However, even minimal supersymmetric generalizations of the Standard Model contain as many as 105 free parameters. That is really bad.

– But such a calculation needs something to rest on. As the saying goes, assume nothing and you will get nothing.

S.T.: That is just the point. Ideally, I would like to have a comprehensive unified theory that would, at least in principle, allow all the necessary calculations to be performed. But where is one to get it? For many years string theory has been proposed as a candidate for such a universal foundation. It has been in development for almost 50 years, quite a respectable age. It may be a wonderful theoretical construction, but as a unified theory it has not yet materialized. Of course, no one is forbidden to hope that this will happen. However, it has rarely happened in the history of physics that a theory lived for half a century on promises of future success and then suddenly, in fact, explained everything. In any case, I doubt it.

True, there is a subtlety here coming from string theory, which implies the existence of about 10^500 vacua with different physical laws. Figuratively speaking, each vacuum has its own Standard Model with its own set of free parameters. Numerous supporters of the anthropic principle argue that our own set requires no explanation, since in worlds with different physics there can be no life and, therefore, no science. From the point of view of pure logic such an interpretation is admissible, except that the smallness of the parameter θ cannot be derived from the anthropic principle. This parameter could well have been larger; the chances of intelligent life emerging on our planet would not have diminished. Moreover, the anthropic principle merely announces the possible existence of an almost infinite set of worlds and essentially stops there. It cannot be refuted or, to use Karl Popper's terminology, falsified. That is no longer science, at least in my understanding. To abandon the principle of falsifiability of scientific knowledge for the sake of a theory that in fact cannot explain anything seems wrong to me.

– I can’t disagree. But let’s go further. How can you get out of the crisis – or, if you like, out of the pre-crisis of fundamental physics? Who has the ball now – the theorists or the experimenters?

S.T.: Logically, the ball should be on the theorists' side. There are reliable experimental data on neutrino masses, and there are astronomical observations confirming the existence of dark matter. The task would seem obvious: come up with the foundations of a new theoretical approach and build specific models that allow experimental verification. But so far such attempts have led nowhere.

Again, it is not clear what to expect from the Large Hadron Collider after its planned upgrade. Of course, this machine will still deliver a lot of data, and even now far from all the information collected by its detectors has been processed. For example, there are hints that electrons and muons are not entirely identical in their interactions. That would be a very serious discovery, possibly explaining the difference in their masses. But this evidence is still weak; one may trust it or not. The question will most likely be resolved in subsequent experiments at the LHC. However, it is worth recalling that the teams of experimental physicists working on it have more than once reported hints of major discoveries beyond the Standard Model, and later these claims were not confirmed.

What is left? One can hope for superaccelerators that will be built someday, but everything about them is still unclear, at least on a 10-20 year horizon. So the ball is really on the astrophysicists' side. A truly radical breakthrough can be expected from that science.

– Why?

S.T.: The point is that no new particles participating in the strong interaction have been found. This means we need to look for weakly interacting particles absent from the Standard Model. If they interact weakly, they interact rarely, and one has to wait a long time for such interactions to manifest themselves. In accelerator experiments we cannot wait that long. But the Universe has been waiting for nearly 14 billion years, and the effects of even very rare interactions can accumulate over all this time. It is possible that such effects will be found by astrophysicists. There are already examples: neutrino oscillations, which demonstrate the nonzero mass of this particle, were discovered in studies of solar neutrinos. These hopes are all the more justified because the observational base of astronomy and astrophysics is constantly expanding thanks to new ground-based and space telescopes and other equipment. For instance, a year after the first direct detection of gravitational waves it was shown that they propagate at the same speed as electromagnetic radiation. This is a very important result that tells theorists a great deal.

Lecture by Sergei Troitsky "The Universe as a Laboratory of Particle Physics", delivered on October 8, 2017 at Lomonosov Moscow State University during the Science Festival

– Sergei, since you mentioned space, let us remember Johannes Kepler. In 1596 he noticed that the mean radii of the planetary orbits from Mercury to Saturn, as calculated by Copernicus, stood in the ratios 0.38 : 0.72 : 1.00 : 1.52 : 5.2 : 9.2. The gap between Mars and Jupiter seemed to Kepler too large, and therefore unnatural. He assumed that there was an as yet unknown planet there, and was ultimately proved right. On New Year's night of 1801, Giuseppe Piazzi discovered Ceres in this zone, which is now recognized as a dwarf planet. Of course, we now know that there is not one planet there but a whole belt of asteroids. Kepler had no idea of it, but I think he would hardly have been too surprised. In short, on the basis of the naturalness criterion a very specific prediction was made, which was at first confirmed literally and later, if you like, with interest. Is something like this possible in fundamental physics today?
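
A small illustrative calculation with the radii quoted above shows how sharply the gap Kepler noticed stands out:

```python
# Ratios of successive orbital radii (in astronomical units), as quoted above.
radii = {"Mercury": 0.38, "Venus": 0.72, "Earth": 1.00,
         "Mars": 1.52, "Jupiter": 5.2, "Saturn": 9.2}

names = list(radii)
for inner, outer in zip(names, names[1:]):
    print(f"{outer}/{inner} = {radii[outer] / radii[inner]:.2f}")
# Every ratio is below 2 except Jupiter/Mars ~ 3.4: the 'unnatural' gap
# where Ceres and the asteroid belt were later found.
```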

S.T.: It is not excluded. If we apply the naturalness criterion to explain the hierarchy of fermion masses, some new symmetry will almost certainly appear. Various candidates for this role have been proposed to date, but none of them is quite satisfactory. If such a symmetry can be found, it could lead us to as yet unknown particles. True, we will not be able to predict them as directly as Kepler did, but we will learn something useful. It is also possible that here, too, the useful hints will be rather vague, with a colossal range of options. For example, the axion is predicted precisely on the basis of the new symmetry proposed by Peccei and Quinn. However, this mechanism allows very great freedom in the choice of parameters, and therefore we have no indication of where to look for the axion. It may be a particle of dark matter, or it may manifest itself in the evolution of stars or elsewhere; we simply do not know.

– Well, time will tell. And thank you very much for the conversation.

Time can indeed show a lot. Chinese physicists have designed a circular electron-positron supercollider with a collision energy of 240 GeV. According to the published plans, it will cost 5 billion dollars and could be commissioned in 2030, provided, of course, that the PRC authorities finance the project promptly. Such a machine would provide unprecedented opportunities for producing Higgs bosons, which in turn would increase the chances of discovering physics beyond the Standard Model.

I also spoke with Gia Dvali, professor of physics at New York and Munich Universities and co-director of the Max Planck Institute for Physics (by the way, this renowned scientific center was created in 1914 as the Kaiser Wilhelm Institute of Physics, and its first director was Albert Einstein). Naturally, we talked about the same topic.

Georgi Dvali, professor of physics at the Center for Cosmology and Particle Physics at New York University and at Ludwig Maximilian University of Munich, director of the Max Planck Institute for Physics in Munich. Photo from the site astronet.ge

– Gia, how do you interpret the problem of the naturalness of the Standard Model?

G.D.: In general, I can repeat what Sergei said. The equations of the Standard Model include a set of free parameters that the theory itself cannot predict. The numerical values of these parameters differ greatly from one another, even when we are talking about seemingly similar objects. Take, say, a neutrino, an electron, and a t-quark. All of them are fermions, yet the mass of the neutrino is most likely no more than a fraction of an electron-volt, the mass of the electron is approximately five hundred thousand electron-volts, and the mass of the t-quark is 175 GeV, that is, 175 billion electron-volts. Such differences may indeed seem somehow unnatural.
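
A minimal sketch putting the three masses just mentioned on a common footing (the neutrino mass is only bounded from above, so the 0.1 eV figure here is an assumption for illustration):

```python
import math

# Fermion masses quoted above, converted to electron-volts.
masses_ev = {
    "neutrino": 0.1,       # illustrative sub-eV value; only an upper bound is known
    "electron": 5.11e5,    # ~511 keV
    "top quark": 1.75e11,  # 175 GeV
}

for name, mass in masses_ev.items():
    print(f"{name:>9}: {mass:.2e} eV")

span = math.log10(masses_ev["top quark"] / masses_ev["neutrino"])
print(f"hierarchy spans ~{span:.0f} orders of magnitude")  # ~12
```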

But this is only the outer side. To understand everything better, one must take into account the ultraviolet sensitivity of these parameters. We are talking about how they depend on an increase in the energy scale or, equivalently, on a decrease in the spatial scale. Say, first we measure the mass of the electron in a laboratory, and then we look at what happens to it at Planck distances. With this approach the parameters divide into several groups. The maximum ultraviolet sensitivity is exhibited by the energy density of the physical vacuum: it is proportional to the fourth power of the energy scale.
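
To see what such quartic sensitivity implies, here is a rough numerical sketch (standard order-of-magnitude figures, not numbers from the conversation): a naive estimate of the vacuum energy density with the cutoff at the Planck scale overshoots the observed value by roughly 120 orders of magnitude.

```python
import math

# Naive quartic estimate of the vacuum energy density with a Planck-scale
# cutoff, compared with the observed dark-energy density (both in GeV^4).
planck_mass_gev = 1.22e19
rho_naive = planck_mass_gev ** 4   # ~2e76 GeV^4
rho_observed = 1e-47               # commonly quoted observed value

orders = math.log10(rho_naive / rho_observed)
print(f"mismatch: ~10^{orders:.0f}")  # roughly 10^123
```

This mismatch is the famous cosmological constant problem, the most extreme example of the ultraviolet sensitivity being described here.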