Monday, October 20, 2025

Superstring theory rests on a tremendous "leap of faith" - why it is dangerous and why other approximations may work better.

When superstring theory was founded, its creators made not one but two important assumptions. The second is that the primal, most fundamental entity is one-dimensional (a string). The first, and by far the more general one, is that such a fundamental, smallest, further-indivisible entity exists at all. The first assumption is far more important, because many physicists (me included) are convinced that no fundamental entity exists at all: all phenomena are emergent (made from something smaller), the quark, for example, will be split one day, and any present-day idea about the smallest entity is nothing more than an approximation, a fit of the laws of nature, sooner or later to be replaced by something more fundamental, then by something even more fundamental, and so on.

In this paradigm a theory of everything is impossible; scientists are discovering not the ultimate laws but merely approximations, fits of natural phenomena doomed to become obsolete one day. An example is the approximate liquid-drop theory of the nucleus, which has now been replaced by the shell theory and will one day be replaced by something stemming directly from quarks, and so on. In this view the superstring idea is nothing more than a fit of a smaller, but not the smallest, entity. And it would be a bad fit for that situation: once an entity even smaller than the superstring is proposed, the interactions of those entities would tend to minimize the energy, shrink the surface, and finally break the string. Zero-dimensional primal entities (a fit good enough for right now) look like a much better approach simply because their surface energy is already minimized (the sphere is the shape with the smallest surface for a given volume). If superballs were considered instead of superstrings, they might one day generate at least some useful insights into the organization of nature at the smaller level, just as the liquid-drop model of the nucleus was able to predict useful phenomena such as fission or oscillations of the nucleus [1-4].
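A minimal worked comparison (my own illustration of the surface argument, not a result from string theory): at a fixed volume V, the sphere realizes the smallest possible surface, while a thin cylinder's surface grows without bound as it thins:

\[
S_{\mathrm{sphere}} = (36\pi)^{1/3} V^{2/3} \approx 4.84\, V^{2/3},
\qquad
S_{\mathrm{cylinder}} \approx 2\pi a L = \frac{2V}{a} \xrightarrow{\;a \to 0\;} \infty
\quad (V = \pi a^2 L\ \text{fixed}).
\]

So if a string were actually a tube of smaller constituents with an effective surface tension, its surface energy would always sit above the spherical minimum; like a liquid jet (the Plateau-Rayleigh instability), it would tend to pinch off into droplets.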

This is because an analog of surface tension exists at every level of the organization of matter - from the round shape of planets, to a droplet of water, to the shape of the nucleus. Whenever even smaller entities are present, the inevitable interactions between them demand a decrease of total energy through a round shape. If the interactions between the entities are anisotropic, the shape may be different (a crystal), but in that case the very existence of such anisotropy hints at a lower level of matter organization.

Only by making a tremendous "leap of faith" and assuming that the entity is the smallest possible one may a researcher consider it in the shape of a string or a brane (one-dimensional or two-dimensional). In that case the shape may indeed be anything - by definition nothing smaller exists to impose a surface energy - but if the assumption is wrong, nothing useful will ever come of it: the underlying entities will assemble into something like balls (recall the nucleus), not strings, because a string (a cylinder) would have a surface energy well above the minimum, and the whole body of work will be complete bullshit. That is why it is much safer to "play" with smallest entities treated as zero-dimensional - they will eventually be regarded as built from something smaller, but meanwhile they work as a useful approximation [1-4].


References.

1. Semi-empirical mass formula - Wikipedia

2. Shape of the atomic nucleus - Wikipedia

3. The Discovery of Fission - Moments of Discovery

4. A comprehensive view of nuclear shapes, rotations and vibrations from fully quantum mechanical perspectives


Wednesday, October 8, 2025

The relation between theoreticians and experimentalists from a historical perspective. Why the "small" experiment so frequently precedes the theory.

One of the reasons for the deep crisis in fundamental physics is that theoreticians strongly overestimate their ability to foresee the organization of natural phenomena and strongly underestimate the "inventiveness" of Nature. They are still sure that the approach that worked for quantum mechanics and special/general relativity will keep working forever. Historical examples show that in many cases a small but crucial experiment was necessary to steer the attention of theoreticians in the right direction.

1. Superconductivity (BCS theory) [1]. While it was clear that some kind of quantum effect was responsible for superconductivity, the majority of theoreticians were sure that the key lay in the electronic properties of metals (band structure, interactions between atoms in the lattice, or similar). Only the seemingly small discovery of a tiny isotope effect (a shift of the transition temperature between isotopes of the same element) created the understanding that phonons are somehow responsible: the electronic properties of two isotopes are precisely the same, but the different nuclear mass changes how lattice vibrations propagate, and that is the only way the critical temperature could change with the isotope.
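To make that logic concrete, here is the standard scaling (a textbook BCS result, quoted for clarity rather than taken from the post's reference): the ions vibrate like masses on springs of fixed stiffness k, so the characteristic phonon (Debye) frequency, and with it the critical temperature, scales with the ionic mass M as

\[
\omega_D \propto \sqrt{k/M} \quad\Rightarrow\quad T_c \propto M^{-\alpha}, \qquad \alpha \approx \tfrac{1}{2},
\]

which is exactly the dependence the isotope experiments found: a heavier isotope, with identical electronic structure, has a slightly lower critical temperature.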

2. Rejection of the J.J. Thomson model of the atom [2]. After the discovery of the electron it was clear that inside the atom there are positively and negatively charged entities. While the electron was at the time correctly represented as a small charged ball (fine for the beginning of the 20th century), the positive charge was represented as a big cloud (quite a reasonable assumption, since nobody had ever dealt with it before). So theoreticians were building theories with those assumptions: the electron is a small ball, but the positively charged cloud is big. Only the clever experiment of Rutherford (the rarely observed deflection of alpha particles through very large angles) forced theoreticians to think of the positive charge as something very small inside the atom; no other evidence for such a picture existed in previous knowledge.
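For reference, the standard Rutherford cross section (quoted here for clarity, not part of the original post) shows why the rare large-angle events were decisive:

\[
\frac{d\sigma}{d\Omega} = \left(\frac{Z_1 Z_2 e^2}{4E}\right)^{\!2} \frac{1}{\sin^4(\theta/2)},
\]

small but nonzero even near 180 degrees for a point-like charge, whereas a diffuse positive cloud of atomic size could never turn an alpha particle around.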

3. The quantization principle. There were many brilliant mathematicians before the 20th century. They reformulated Newton's equations and derived them from various underlying principles - Hamilton, Lagrange, Jacobi (the Hamilton-Jacobi equation) - but always kept them compatible with classical mechanics. No one ever guessed at the possibility of quantization: Planck essentially invented quantization to fit the measured black-body radiation spectrum (the problem originally posed by Kirchhoff). Again, a seemingly small and unimportant result preceded a major theoretical modification.
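For concreteness (the standard formula, quoted rather than derived here): Planck's fit to the black-body spectrum,

\[
B_\nu(T) = \frac{2h\nu^3}{c^2}\, \frac{1}{e^{h\nu/k_B T} - 1},
\]

reduces to the classical Rayleigh-Jeans form \(2\nu^2 k_B T / c^2\) when \(h\nu \ll k_B T\), but cuts off the ultraviolet catastrophe at high frequencies; the price was the then-radical assumption that energy is exchanged only in quanta \(E = h\nu\).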

4. Replacement of the liquid-drop model of the nucleus by the nuclear shell model [3]. After the discovery of the proton and neutron, the most obvious model, widely accepted by theoreticians, was the liquid-drop model of the nucleus. Indeed, the empirical introduction of parameters such as surface tension and strong-force terms allowed a nice fit to numerous observations, such as the elongated shape of the nucleus due to the presence of "surface tension" [4], and predicted/explained fission. However, seemingly less important properties, such as hyperfine splitting in electron paramagnetic resonance (and the existence of a magnetic moment in some but not all nuclei, as confirmed by nuclear magnetic resonance), helped confirm the presence of "magic numbers" in nuclei and thus led to the shift to the shell model.
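The liquid-drop fit mentioned above is the semi-empirical (Bethe-Weizsäcker) mass formula, quoted here for clarity:

\[
B(A,Z) = a_V A - a_S A^{2/3} - a_C \frac{Z(Z-1)}{A^{1/3}} - a_A \frac{(A-2Z)^2}{A} + \delta(A,Z),
\]

where the \(a_S A^{2/3}\) term is precisely the "surface tension" of the drop. What no choice of these parameters can reproduce is the extra stability at the magic numbers of protons or neutrons (2, 8, 20, 28, 50, 82, 126) - exactly the gap the shell model was built to close.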

5. The discovery of fission itself (leading to the invention of the atomic and hydrogen bombs) started from a seemingly small experiment by Fermi (nuclear transmutation, the apparent creation of elements 93 and 94) [5], which was initially interpreted as simple neutron capture by the existing nucleus (quite an obvious explanation, reasonable given what was known at the time). It turned out that the original explanation was incorrect (Ida Noddack proposed the correct one), and Otto Hahn, following the correct explanation, managed to discover fission. This time it was a little more complicated than a simple experiment shifting theoreticians' attention, but in the end the experiment preceded the idea of the bomb; no theoretician before Hahn would even have thought of such a possibility.

Again and again, the powerful push forward in fundamental physics was initiated by a seemingly small and unimportant experimental discovery, not by the application of a more developed theory built on first principles already known. This pattern has an unpleasant explanation: Nature is more inventive than theoreticians can perceive. At the present moment many theoreticians are developing explanations for the James Webb Space Telescope data (such as the discovery of the "little red dots"), but they are all still pushing in the same general direction - it is all about space-time (inflation of space at the very beginning, primordial black holes, super-Eddington accretion, fluctuations during the Big Bang creating the temperature pattern of the microwave background, and many other ideas - all about space-time, its distortion, history, fluctuations, and other properties). History hints that they are all wrong. It seems physics is waiting for some small, crucial experimental result that would be "orthogonal" to everything theory knows and can imagine.



References.

1. BCS theory - Wikipedia

2. Plum pudding model - Wikipedia

3. Nuclear shell model - Wikipedia

4. The Discovery of Nuclear Fission | Jeremy Bernstein | Inference

5. Discovery of nuclear fission - Wikipedia