A Simple Theory of LENR (Axil Axil)

The following post has been submitted by Axil Axil

A Grand Unified Theory (GUT) is a model in particle physics in which, at high energy, the forces of the Standard Model that define the electromagnetic, weak, and strong interactions are merged into a single force. This unified interaction is characterized by one larger gauge symmetry and thus several force carriers, but one unified coupling constant. A common coupling constant means that each of these forces can affect the others. If Grand Unification is realized in nature, there is the possibility of a grand unification epoch in the early universe in which the fundamental forces were not yet distinct.

There might well exist processes in condensed matter that can amplify and concentrate EMF to a high enough level to achieve a unified force coupling constant. In such a high energy state, the electromagnetic force would affect both the weak and the strong force.

One of the predictions of Grand Unified Theories is the decay of the proton and the neutron. Most Grand Unified Theories predict that free protons will decay. They also predict that neutrons will decay by essentially the same process.

To prove that a Grand Unified Theory is valid, a hunt for proton decay began in the 1980s. Much rests on the existence of proton decay for a complete and verified theory beyond the Standard Model of physics, and yet to this very day we’ve never seen a proton die. The reason may simply be that protons rarely decay, a hypothesis borne out by both experiment and theory. Experiments say the proton lifetime has to be greater than about 10^34 years on average: that’s a 1 followed by 34 zeroes.

The key phrase in that last sentence is “on average.” Because of quantum physics, the time at which any given proton decays is random, so a tiny fraction will decay long before that 10^34-year lifetime. So what you need to do is get a whole bunch of protons together: increasing the number of protons increases the chance that one of them will decay while you’re watching.
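The arithmetic behind this strategy is easy to sketch. Using illustrative round numbers for a 50,000-ton water tank (not the experiment’s official figures, which depend on which decay channels are counted):

```python
# Expected proton decays per year in a large water tank, assuming an
# illustrative mean proton lifetime of 1e34 years.
AVOGADRO = 6.022e23          # molecules per mole
WATER_MOLAR_MASS_G = 18.0    # grams per mole of H2O
PROTONS_PER_MOLECULE = 10    # 2 (hydrogen) + 8 (oxygen) protons per H2O

tank_mass_kg = 5.0e7         # ~50,000 tons of water
lifetime_years = 1.0e34      # assumed mean proton lifetime

molecules = tank_mass_kg * 1000.0 / WATER_MOLAR_MASS_G * AVOGADRO
protons = molecules * PROTONS_PER_MOLECULE
decays_per_year = protons / lifetime_years   # decay rate = N / tau

print(f"protons in tank: {protons:.2e}")
print(f"expected decays per year: {decays_per_year:.2f}")
```

On these assumed numbers, ~1.7 × 10^34 protons are being watched, so even a 10^34-year mean lifetime yields order-one candidate events per year. Tank size, not patience, is the lever.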

The second essential step is to isolate the experiment from particles that could mimic proton decay, so any realistic proton decay experiment must be located deep underground to isolate it from random particle passers-by. That’s the strategy pursued by the currently operating Super-Kamiokande experiment in Japan, which consists of a huge tank with 50,000 tons of water in a mine. The upcoming Deep Underground Neutrino Experiment, to be located in a former gold mine in South Dakota, will consist of 40,000 tons of liquid argon.

Because the two experiments are based on different types of atoms, they are sensitive to different ways protons might decay, which will reveal which GUT is correct … if any of the current models is right. Both Super-Kamiokande and DUNE are neutrino experiments first, but we’re just as interested in the proton decay possibilities of these experiments as in the neutrino aspects.

Proton Longevity Pushes New Bounds

One interesting paragraph in this article is revealing. “Other GUTs that incorporate supersymmetry (SUSY), a hypothetical model that assumes all particles have a partner with different spin, predict that the proton decays into a K meson and a neutrino with a lifetime of less than a few times 10^34 years. The Super-Kamiokande collaboration has looked for signs of this decay in a 50,000-ton tank of water surrounded by detectors. If one of the many protons in the tank were to decay, the K meson’s decay products (muons, π mesons) would be detectable.”

These particles are seen in Leif Holmlid’s experiments. Are proton and neutron decay occurring in vast numbers in Holmlid’s experiments? What could cause the decay rate of protons to hugely increase there?

It might be the unified coupling constant. As the power and focus of the EMF increase, the various individual force coupling constants converge toward the unified value, and the probability of proton decay goes up in proportion. One of the perplexing characteristics of the LENR reaction is its wide range of apparent power, from extremely weak to very strong. A varying strength of the EMF field would supply that character to the LENR reaction.

Another amplification seen in LENR is the speed at which nuclear decay happens. In a LENR reaction the decay rate can be so rapid that a radioactive isotope reaches stability almost instantaneously. In a weak LENR reaction, the isotope’s production of radiation is only slightly affected. This may be the result of an increase in the weak force coupling constant through EMF stimulation as it is amplified in varying amounts toward the weak force unification value.

The next step in our explanation of LENR is to understand what processes in condensed matter produce EMF powerful and focused enough to unify the force coupling constants.

After all, proton decay is a mainstay of theories of how the universe works and follows from profound, universally accepted concepts of how the cosmos fundamentally operates. If protons do decay, it’s so rare that human bodies would be unaffected, but not our understanding. The impact of that knowledge would be immense, and worth a tiny bit of instability. But that instability opens up access to the limitless power stored inside the atom through the recovery of the LENR reaction.

Axil Axil

  • Axil Axil

The decay of the proton or the neutron can produce a huge amount of energy, far more than fusion. The mass of the nucleon (proton or neutron) is 939 MeV. After decay, this energy is reformulated into subatomic particles called mesons. These mesons decay in a chain until they get down to a stable particle called the electron. The electron is only 0.511 MeV, so almost all of the nucleon’s mass is converted to energy through particle decay.
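The fraction of rest mass liberated is easy to check. A sketch using textbook particle masses, comparing nucleon decay (as described above) against D-T fusion for scale:

```python
# Fraction of rest-mass energy released if a 939 MeV nucleon decays down
# to a 0.511 MeV electron, versus the fraction released by D-T fusion.
NUCLEON_MEV = 939.0
ELECTRON_MEV = 0.511
AMU_MEV = 931.494            # energy equivalent of one atomic mass unit

decay_fraction = (NUCLEON_MEV - ELECTRON_MEV) / NUCLEON_MEV

# D-T fusion: D (2.014 u) + T (3.016 u) -> He-4 + n, releasing ~17.6 MeV
dt_fuel_mev = (2.014 + 3.016) * AMU_MEV
fusion_fraction = 17.6 / dt_fuel_mev

print(f"nucleon decay releases {decay_fraction:.1%} of rest mass")
print(f"D-T fusion releases {fusion_fraction:.2%} of rest mass")
```

Nucleon decay would release about 99.9% of the rest mass as energy, versus roughly 0.4% for D-T fusion, which is the comparison the comment is making.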

Meson decay is where all those electrons that the QuarkX is producing come from. The last meson in the meson decay chain is the muon, and the decay of the muon is where the electrons come from.


  • Fedir Mykhaylov

Please, Axil: more information about proton decay in Holmlid’s experiments.

    • Axil Axil

Yes, Holmlid sees billions of negative muons in his experiments, and anybody else who has an active LENR reaction going will see the same production of muons, as Holmlid himself predicts.

      • Zephir

This is nice and all, but the formation of muons and decay of hadrons isn’t a promoter of cold fusion but actually a parasitic process of it, which runs at higher energy densities and decreases its total energy yield.

Correlation doesn’t imply causation, and in this case you even have it backwards, i.e. you’re promoting retrocausality. Confused mainstream physics is full of such retrocausalities, for example with respect to the explanation of the pseudogap phase in superconductivity. Physicists already know that the pseudogap phase often occurs together with superconductivity, so they deduce that the pseudogap phase serves as a promoter of it. The situation is exactly the opposite, though: the pseudogap is a parasitic, i.e. competing, effect for superconductivity. The fact that parasites often occur around rich people doesn’t imply that these parasites are important for their richness – on the contrary. Got it?

        • Axil Axil

          Word salad

          • Zephir

Saying that LENR can produce energy by splitting a stable particle like the proton into muons is like claiming that a campfire gets its heat from burning carbon dioxide. Many people have observed the presence of carbon dioxide above campfires – so where is the problem with this straightforward experimental evidence?

  • Axil Axil

    This decay of the proton is not a new idea.

    Alan Smith wrote:
    This comment on proton decay -or rather transformation – comes from a previously unpublished note from Professor Sergio Focardi. Translated from Italian of course -fairly badly by me, but the meaning is -I hope- clear.

…..N = P + e⁻ + ν̄, where e⁻ is a negative electron and ν̄ is an antineutrino; radiation persists for a few minutes in the system, then they annihilate into thermal energy, with no radioactivity. What determines the transformation of (elements), transforming the proton with emission of energy corresponding to the loss of matter, is the particle that Enrico Fermi called W, which was later identified by Carlo Rubbia, work that earned him the Nobel Prize. In this system the W particle is triggered by the excitation of hydrogen atoms….. The energy provided is sufficient, taking into account the interaction of the W particle, an intermediate boson which respects Bose-Einstein statistics, for weak nuclear interactions….

It has never been seen online before; it is from private correspondence. Focardi was certainly a great original thinker, and one of the real pioneers of Ni/H LENR, along of course with Piantelli. But since Piantelli still outlives him, Focardi is somewhat overlooked.

    • Fedir Mykhaylov

You wrote out the formula for neutron decay. If possible, more on the subsequent transformations.

      • Axil Axil

Beta decay and inverse beta decay, caused by the weak force, are nothing new. This change of protons into neutrons and neutrons into protons happens when there is a very large mismatch between the number of protons and neutrons in the nucleus.

The Widom-Larsen theory of LENR is based on this electron-based conversion of the proton to the neutron.

The meson-driven theory is new; it comes from my prediction and is verified by Holmlid’s experiments.

        • Zephir

          /* The Widom-Larsen theory of LENR is based on this electron based proton conversion to the neutron */

The capture of an electron by a proton is the opposite process to proton splitting: the proton gains mass during it and grows. And the neutron formed subsequently decays back to a proton, so the energy yield of this process is disputable.

  • gdaigle

    Have you considered asking Rossi to include detectors in his upcoming QuarkX experiments that would detect if muons are being produced? Rossi says they are not being produced, but that comment may be based solely upon his proposed theoretical mechanism. I would like to see measurable data to back that up.

    • Axil Axil

I have been begging MFMP to set up a cloud chamber to check for muons in their experiments. The H⁻ theory of Piantelli seems to have the favor of MFMP, so a cloud chamber has little experimental value for them. The same request went out to me356.

Rossi has to explain where all those electrons he is seeing as electrostatic accumulations come from; they are a product of muon decay. Rossi extracts those electrons, which comprise a large proportion of his total COP.

Rossi is also able to maximize electrical production at the expense of light and heat production. Now how can that happen?

      • Fedir Mykhaylov

The electrons can be explained by beta decay of short-lived isotopes, such as those of nickel. A cloud chamber is absolutely necessary for registering such decays from low-energy neutrons.

        • Axil Axil

Electrons would be seen if neutrons were converting into protons. But in the Lugano test, just the opposite was happening: Ni58, Ni60, and Ni61 were converted to Ni62. More neutrons were being produced, and no protons. That conversion cannot free up electrons.

          • Fedir Mykhaylov

Maybe positron annihilation creates further electrons near the nucleus rather than visible gamma.

  • Axil Axil

Transmutation of elements has been seen in raw electrical discharge experiments such as those from “Proton 21” and also those conducted by Ken Shoulders. Shoulders has also patented a process using electric discharge to stabilize atomic energy waste products.

    • Fedir Mykhaylov

The experiments of “Proton 21” are explained by collapse in a joint article published by Adamenko and Vysotsky.

      • Axil Axil

Do you have a link? I would like to see it.

        • Fedir Mykhaylov

Vysotskii V.I., Adamenko S.V. et al. Creating and using of superdense micro-beams of relativistic electrons. Nuclear Instruments and Methods in Physics Research A455 (2000), pp. 123-127.
Adamenko S.V. et al. Effect of auto-focusing of the electron beam in the relativistic vacuum diode. Proceedings of the 1999 Particle Accelerator Conference, New York, 1999.

        • Fedir Mykhaylov

          V. I. Vysotskii, S. V. Adamenko, M. V. Vysotskyy. The formation of correlated states and the increase in barrier transparency at a low particle energy in nonstationary systems with damping and fluctuations. Journal of Experimental and Theoretical Physics, Volume 115, Issue 4, pp 551-566 (2012)
          V. I. Vysotskii, M. V. Vysotskyy, S. V. Adamenko. Formation and application of correlated states in nonstationary systems at low energies of interacting particles. Journal of Experimental and Theoretical Physics, Volume 114, Issue 2, pp 243-252 (2012)

Controlled Nucleosynthesis: Breakthroughs in Experiment and Theory. Series: Fundamental Theories of Physics, Vol. 156, Adamenko, Stanislav; Selleri, Franco; Merwe, Alwyn van der (Eds.), 780 p. (Springer, 2007).
S. V. Adamenko, V. I. Vysotskii. The conditions and realization of self-similar Coulomb collapse of condensed target and low-energy laboratory nucleosynthesis. 11th Int. Conference on Condensed Matter Nuclear Science, Marseilles, France (2004), Proceedings, pp. 505-520 (2006).

  • Axil Axil

    There are all sorts of nuclear processes going on in LENR making for a very chaotic situation. One of them may well be muon catalyzed fusion where muons interact with hydrogen.

Another is the disintegration of the nucleus due to positive meson confinement in the nucleus by the Coulomb barrier. These meson-infected nuclei break down and expel one or more alpha particles that carry away the decay energy of the decaying mesons. CR-39 detection has shown that large numbers of alpha particles are produced by the LENR reaction. These alpha particles later become helium when they combine with the electrons produced by negative muon decay.

    • Fedir Mykhaylov

Muon catalysis is very good. You only need to show the mechanism for obtaining the muons.

      • Axil Axil

        The mechanism is proton decay.

        • Zephir

Complete BS. You’re just raising this concept because you read about it in the context of some new experiments a few hours ago, which is your usual stimulus for enthusiastically pushing whatever idea is thinkable.

Now you’re full of it – but you probably missed the detail that this decay was just ruled out by those very experiments.

  • Private Citizen

I mean this as a compliment, Axil: you really should write science fiction. You have a knack for quasi-plausible fringe-science jargon.

What, no tachyons or phlogiston this time? What about a supersymmetric anti-Higgs boson condensate?

    • Axil Axil

My problem with explaining LENR is keeping it simple while getting to, and explaining, the key concepts. LENR is very complicated. There are a large number of enabling concepts that serve to amplify the EMF power density, but these issues only confuse things, and most people get lost in the weeds. Sorry to have confused you with trivia.

A criticism of this concept is that it would take a huge concentration of EMF power to affect the coupling constants of the fundamental forces. How this amplification takes place is very involved, covering many fields of science.

  • Axil Axil

MFMP may be seeing an active LENR reaction, but at a very low level. They have seen the production of x-rays in a smooth power distribution curve up to 1.4 MeV in a burst. The production of excess heat is too insensitive a probe to detect low-level LENR activity. A better probe is the production of subatomic particles. If MFMP would look for muons, they would most probably see the beginning of muon emission as the LENR reaction begins, marked by the production of x-rays.

  • Axil Axil

    There are two kinds of LENR, one that produces radiation and tritium and one that does not. What is the difference?

There is a mismatch between the activity level driven by the coupling constant of the strong force and that of the weak force. As the EMF level increases, the strong force kicks in before the weak force does. The strong-force-based LENR reaction begins and produces gamma radiation, neutrons, and tritium, but the weak-force LENR reaction is still too enfeebled to thermalize the tritium and suppress the gamma and neutron radiation. As the EMF power density increases through a more vigorous dipole driving mechanism, a sufficiently high activity level is reached at which the weak force begins its radiation normalization function. When this critical transition level is reached, gamma is suppressed, neutrons go away, and tritium is instantaneously thermalized to He3. The x-ray spike that MFMP is seeing is the point where the strong force becomes strong enough to begin nuclear activity, but the weak force is not yet strong enough to mitigate the nuclear radiation being produced by the strong force.

This mismatch between the strong and weak force coupling constant activity levels is evident in cold LENR systems – those systems that are not driven hard enough to produce sufficient EMF density to activate the full LENR activity of both the strong and the weak force.

    • Mats002

If this turns out to be true, you will be at the Nobel party, Axil.

  • Axil Axil

This theory makes predictions that are confirmed by many LENR experiments. It is mainstream Standard Model physics which is currently being checked out by a bevy of expensive big-physics experiments. This mechanism is required to complete the Standard Model. Why do you think it is untoward? What is the basis of your scepticism? I will respond.

    • cashmemorz

The point of the criticism, as I understand it, is not that the theory has much wrong with it as you put it forward, but how it looks to imply “simplicity”. It may be true that the approach to LENR via GUT, and the details as you lay them out, are simple in terms of how the theory holds together. But it is not the kind of simple that the average reader, not sufficiently cognizant of how GUT works, sees. Without a deep background in physics, the whole thing looks anything but simple.

      • Axil Axil

True; when you boil something down to the very heart of the cause – a cause that explains many mysterious and seemingly magical things – it might be considered simple and straightforward.

One problem with the fusion meme is that it is impossible for mainstream science to accept. A GUT-based theory has a chance to be accepted, or at least will slow down the “it’s impossible” response. Many nuclear physicists won’t understand GUT, so they might need to get educated before they can rightfully reject a GUT-based LENR theory.

  • Axil Axil

For the record, not that anybody is going to get anything out of it, but here are the types of subatomic particle reactions that are expected to occur when a proton or a neutron decays.


The people trying to detect this type of decay are assuming certain subatomic particles will be produced.

One caught my eye: two protons (a diproton) decay to produce a tau particle – now that is weird.

  • Axil Axil

    You might ask: “I don’t see how introduction of proton decay with GeV-scale photons could help with explaining lack of MeV-scale photons?”

    Word Salad warning, you might not understand this.

One of the formative mechanisms that drives LENR is Bose condensation. The energy produced by nucleon decay is broken up by super-absorption. The gamma photon generated by the LENR reaction is partitioned into sqrt(N-1) parts, where N is the number of Bose condensate members. Bose condensation is an inherent component of the Surface Plasmon Polariton mechanism. In plain talk, every SPP forms a Bose condensate.
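To put a number on the claimed partition: a sketch assuming, hypothetically, a 1 MeV gamma shared among sqrt(N-1) parts for N = 10^9 condensate members, per the formula stated above (both values are illustrative, not measured):

```python
import math

# Hypothetical partition of a gamma photon's energy among sqrt(N-1)
# shares, as the comment proposes for an SPP Bose condensate.
gamma_ev = 1.0e6         # a 1 MeV gamma, in eV (illustrative)
members = 1_000_000_000  # N = one billion condensate members (illustrative)

shares = math.sqrt(members - 1)
energy_per_share_ev = gamma_ev / shares

print(f"shares: {shares:.0f}")
print(f"energy per share: {energy_per_share_ev:.1f} eV")
```

On these assumed numbers a 1 MeV gamma is downshifted to roughly 32 eV per share, i.e. from gamma-ray energies down to the extreme-ultraviolet regime – which is the qualitative point the partition argument is making.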


The Bose condensate is the mechanism that supports super-radiance. This is a major EMF amplification mechanism.


In quantum optics, when a magnetic beam interacts with a nucleus, all of the energy stored in the SPP Bose condensate is directed into an impossibly thin beam. If there are a billion SPPs, then the magnetic power of the square root of a billion SPPs is directed at the nucleus. All energy transfer is done through entanglement. Even when an alpha particle produced by the LENR reaction travels meters away before it hits something, the energy produced by the impact is redirected back to the Bose condensate via entanglement.

In the Holmlid experiment, the energy of all the millions of high-energy alpha particles impacting the reactor structure or walls is sent back to the SPP Bose condensate. After all, Holmlid is still alive and going strong.

    • Rene

      reminds me of a solid state laser.

      • Axil Axil

It might turn out to be a magnetic laser that concentrates the spins of photons and projects all those spins as a thin, laser-like magnetic beam.

This behavior is seen in the collapse of a cavitation bubble, which can drill a hole in a diamond. Yes, the polariton is produced by cavitation bubbles too.

    • Eyedoc

So do you have any advice for experimentalists as to how to most easily initiate this nucleon decay process? What are most experiments missing to really get the reaction going strong? Does the Ni, Li, H method add up in this theory?

      • Axil Axil

        What is missing is the EMF stimulation, and there is the Rydberg matter part to consider. LENR is a very complicated thing with many parts.

        • Eyedoc

          Is there a particular type/method/intensity of EMF stimulation that would fit best with this GUT ?

          • Axil Axil
          • Eyedoc

Thanks for the link… but I don’t see any EM reference to nucleon decay or muons at that time… that’s why I ask if there are more specifics from this GUT idea.

          • Axil Axil

We know that Rossi uses National Instruments controllers from the IH licence agreement. I would advise the LENR replicator to get an RF signal generator


            And systematically feed RF signals into the reactor until good things happen.

          • Mats002

            It’s not as simple as feeding a frequency into a wire.

            RF signals can be guided or transmitted, and transmission is made via antenna and they can be focused or not.

            Guides and antennas must be compatible with the RF frequency.

            I see the possibility to control the RF signal to feed over a wide range of frequencies, but how to guide or transmit it into the LENR device? Computer controlled material morphism?

          • Axil Axil

            All that is the job of the experimentalists.

      • Axil Axil

Letts and Cravens found a number of LENR-active resonances by using two lasers that produced beats through interference. They found there were a number of LENR stimulation frequencies in the terahertz region. By using a square wave, those high frequencies can be produced by wave interference. That is why many successful replications are produced by triacs and pulsed-power stimulators like those of Brillouin Energy Corp.
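The beat-frequency arithmetic is simple to sketch. Assuming two illustrative laser wavelengths a few nanometers apart (not necessarily Letts and Cravens’ actual values), the difference frequency lands in the terahertz region:

```python
# Beat (difference) frequency of two lasers with nearby wavelengths.
# The wavelengths are illustrative, not the published experimental values.
C = 2.998e8              # speed of light, m/s

lambda1 = 658e-9         # first laser wavelength, m (assumed)
lambda2 = 662e-9         # second laser wavelength, m (assumed)

f1 = C / lambda1
f2 = C / lambda2
beat_hz = abs(f1 - f2)

print(f"beat frequency: {beat_hz / 1e12:.2f} THz")
```

Two red lasers only 4 nm apart already beat at a few THz. A square wave at frequency f likewise carries odd harmonics (f, 3f, 5f, …), which is one way pulsed drivers could reach such high stimulation frequencies.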

      • Axil Axil

        See new post above.

    • Roland

The non-theoretical Bose-Einstein Condensates (BECs) – i.e. those created repeatedly in laboratories and probed by various means – share a necessary threshold condition: they form at around 0.1 µK down to 1 nK, i.e. down to a billionth of a degree Kelvin above the absolute zero where motion, theoretically, utterly ceases.

In BECs the aggregated atoms respond as a unitary whole to stimuli, as one ‘thing’, no matter where in the physical volume the stimulus is applied. In BECs the atoms are motionless relative to each other, and light (photons) crawls across them as slowly as 7 meters/second.

The collective BEC properties observed to date are ‘emergent’ from a phase of matter first created in 1995; general weirdness, like global entanglement, prevails, and there’s lots of head-scratching and experimentation ahead to reach a comprehensive understanding of the phenomena (it took 70 years just to get from the proposed theory to the first verified lab result).

One thing does seem fairly certain: BECs are a 5th state of matter that phase-changes back into an ordinary bunch of unruly atoms with truly minuscule energy inputs.

BECs run at under a billionth of a kelvin.

The QuarkX runs at 1843 K.

This does not defeat the proposition that emergent properties particular to a specific phase of matter are intrinsic to LENR; it does, however, strongly suggest that that phase of matter is not a BEC.

      • Axil Axil

An inherent characteristic of the Surface Plasmon Polariton (SPP) is that it is a BEC.


        If a SPP can form, the SPP will form a BEC no matter how high the temperature is. A dusty plasma will produce SPPs among the dust particles.

        Referenced below, G. Egely explains how SPPs have produced LENR systems for the last 100 years.


SPPs are the door to creating BECs at up to 50,000 K and maybe far higher.

        The blue light that Rossi produces is the same light as sonoluminescence demonstrates and they both come out of the high-energy side-peak emission described in the first referenced phys.org article.

        If you get interested in SPPs we will discuss that subject some more…

        • Roland

          Hi Axil,

I actually did some reading on polaritons and the contemporary state of the research when you first posted this thread. Yes, they’re interesting; no, they don’t appear to be BECs, and the serious experimentalists I’ve found make no such claims. They do note interesting emergent properties.

          I also went to the Egely link you provided above.

Egely makes huge assumptive leaps with, in my view, little regard for validating the rationale for having done so, other than that he finds it appealing in the context of his thesis. The article is also chock full of qualifier words like may, might and could where the argument he’s making requires substance to sustain it, such as data, hard fact, and proven theory validated by rigorous experimentation.

          To my eye the quality of his thinking is reflected in hazy language and a wishful connection of essentially disparate phenomena.

Nonetheless, the emergent properties associated with the known phases of matter continue to suggest that the emergent properties associated with LENR in general, and E-Cats in particular, such as isotopic transmutation at radically lower threshold energies, are indicative of yet another phase of matter with emergent properties that allow this.

The guy with the preponderance of evidence supportive of a particular interpretation of the data is, fortunately, quite inventive, and the truth will emerge over time.

  • Bruce Williams

I find your post rather unhelpful; Axil has been one of the most interesting and helpful contributors to this site by a long way (I confess I have great difficulty in understanding what he says)… If you can contribute at the same level, you will be a hero too!

  • clovis ray

Fishing trip – can I go? Lol. It seems this whole experience has been a fishing trip, lol, no offence here either.

  • Axil Axil

Yes, I am talking about what causes LENR, how it works, and why the experiments behave the way they do.

    • Fedir Mykhaylov

Axil, combining Bose condensates, Rydberg matter, muon catalysis, plasmons and more all together starts to become nauseating.

  • Mats002

Ok, this is plausible in the context of the observed phenomena of PdD, NiH, Mills, Holmlid, and the latest QM and materials science findings.

Having a phenomenon of radiation amplification for a few milliseconds is one thing, but having it burst and rebuild again and again without consuming or degrading the NAE (the environment that makes it tick) is another.

How does it do that?

    • Mats002

Edit: for a few picoseconds or even femtoseconds. This phenomenon must be very short-lived at GeV/MeV energies. The higher the energy, the shorter the timeframe, which equals a higher frequency.

      The iterative-without-choking energy producing mechanism must be explained for the whole scope of objects in your study.

      • Mats002

Edit again: the decay schema might be a prolonged phenomenon lasting milliseconds, but not the initial focused beam that shatters the nuclei.

        What are the timeframes? Sources?

    • Axil Axil

The SPP is a BEC, and the BEC remains in effect even when individual SPP BEC members die off. The energy is retained in the BEC and gradually released as Hawking radiation and light (blue light, as per Rossi), while most energy is stored long-term in the BEC.

      Think of the BEC as your body. You live for years but many of your individual cells die and are replaced by other cells. That is how the LENR reaction can stay active indefinitely.

  • Zephir

Axil is twaddling a lot again about things which he doesn’t understand, even at the shallow conceptual level. Why are the people who have nothing to say always the more verbose ones? The absence of proton decay confirms the Standard Model, which doesn’t allow LENR, and it therefore strengthens the impossibility of LENR. In addition, a fast-running process like LENR cannot depend on extremely slow processes like the 10^34-year lifetime of the proton.

    • Chapman

      Hello ZEPHIR!!!

I was waiting/hoping for you to join in on this one. There is no shortage of experts on mainstream science and the Standard Model itself, but when stepping off the golden path of academic conformity one needs a guide who is well versed in all the twisting paths leading through the maze of alternative theories.

      Do you find ANY of this plausible? Is there a Golden Nugget here, or just a bunch of gravel?

      (P.S. Beware – These guys have no sense of humor…)

      • Zephir

Cold fusion is a consequence of multiple contributing factors, but the dominant one is driven by quite classical physics and utilizes the low-dimensional character of lattice collisions. Cold fusion arising from free collisions in a plasma cloud is extremely improbable from a thermodynamic perspective, because the simultaneous collision of many atoms (along a single line) is extremely improbable inside a plasma gas. Not so inside a well-arranged crystal lattice…


There is no need to raise exotic extensions of GUTs or the Standard Model – especially not at the moment when they were (again) falsified by the absence of proton decay.

        • Mats002

Hi Zephir, I read your linked article and see similarities with me356’s recent experiment, where he reported that Li visible to the NiH reaction gave nuclear effects.

You say Li must be molten, with only a few degrees of temperature window to get the reaction. May I ask what your sources are?

Also: if the NiH effect is phonon-driven, what frequency(ies) would you suggest make the most bang?

          • Zephir

https://www.facebook.com/chemonuclearfusionproject
http://newenergytimes.com/v2/library/2006/2006Ikegami-Ultradense-Nuclear-Fusion-ER2006-42W.pdf
http://www.roxit.ax/CN.pdf
http://www.lenr-canr.org/acrobat/MinariTexperiment.pdf
http://www.diva-portal.org/smash/get/diva2:52651/FULLTEXT01.pdf

            /* If NiH effect is phonon driven – what frequency(ies) would you suggest make most bang */

Did I say it’s phonon-driven? I don’t remember it… I did talk about low-dimensional collisions. There is the Znidarsic theory, according to which the surface and bulk atom orbitals must get into a mutual resonance, and the resulting resonant frequency predicted is in the range of a few THz. IMO this frequency is too high, because what resonates is not the surface of a single atom orbital but multiple entangled ones, so the resonant frequency gets lower.

          • Eyedoc

            OT… Frank, any chance you and Zephir can put together a separate page on his understanding of LENR? He seems to have as complete a picture as I've seen yet. Thanks

          • Zephir

            If Mr. Acland will help me translate my Czenglish into a more widely recognized language, I could write up a broader picture of cold fusion as I understand it by now.

            My basic picture is that LENR is not a single well-defined nuclear process, but a mechanism that accelerates a whole spectrum of nuclear reactions already known from colliders and nuclear reactors. LENR can accelerate true fusion (like 3Li(7) + p → 4Be(8)* → 2 He(4)) in the same way as decays like 4Be(7) + e → 3Li(7) + ν or electron capture in potassium. Once it gets involved, it favors nuclear transmutations at the expense of free-particle formation, because the energy gets released in the form of heat instead of fast-moving fragments. In this way it's analogous to a catalyst, which also promotes the formation of a less equilibrium, i.e. negentropic, mixture of products.

            For example, if we heat water, it will decompose into hydrogen and oxygen. But if we use a catalyst, then the oxygen gets released in the form of hydrogen peroxide, i.e. a more complex product than the water itself. How can we achieve it? Well, we must compress the water with radio waves into water clusters, which are subsequently released fast, so that a thermodynamically metastable mixture of hydrogen and hydrogen peroxide remains preserved. And a similar process applies during LENR: multiple atomic nuclei compressed into a single one balance their energy content smoothly, and when this metastable system gets released again, the resulting energy is distributed across many nuclei, so no tiny fragments are released to the outside.

            My point is that such a temporary condensation of atomic nuclei occurs during low-dimensional collisions, when multiple nuclei collide with each other along long stacks, which is enabled precisely by the perfect alignment of atoms within metal lattices. Thermodynamics gets broken the more, the further the system is from a random arrangement. It is just the factor of geometric regularity that violates the thermodynamic probability of Coulomb barrier breaking, which is otherwise based on a purely random arrangement of particles.

          • Chapman

            A question for Zephir,

            By any chance, is Electronics one of your fields? I have a thought regarding Rossi’s electricity harvesting, and I am looking for someone with electronics experience to bounce it off of – kinda like an “ECW Peer-Review” thing.

  • Roland

    It is possible that Frank doesn’t editorialize the provided titles and, knowing Axil, there may be some irony, or simply hopefulness, involved.

    The simple part is that neutrons are being added to lithium and nickel atoms as evidenced by the isotopic transmutations seen in the analysis of the post-run Lugano ‘fuel’.

    The complicated part is arriving at the correct explanation of how this otherwise inexplicable event has occurred.

    Axil has offered a succession of hypotheses for consideration; we think about them, if for no other reason than that the passion and sincere desire to understand are self-evident, and because they're evocative in some subtle sense, as though they might be meaningful and ultimately helpful in some way.

    Never a dull moment…

  • Roland

    The biggest problem with a proton decay model is the lack of a great smoking hole in the ground given the energies involved; kind of reminiscent of the minority position in the betting pool at Trinity…

    • Axil Axil

      There is a range of power production here that varies gradually from slight to heavy based on quantum mechanical probability. And don't forget those reactor meltdowns where stainless steel vaporizes and sapphire forms from alumina.

  • Axil Axil

    From an engineering standpoint, the QuarkX is a radical but highly plausible advancement: the very purpose of the miniaturization was to decrease the thermal inertia of the reactor as much as possible, thus enabling much more precise control with greatly reduced response time, which allowed Rossi to raise the COP by two orders of magnitude.

    I believe that we might have hit on a gem of truth here. Heat is both a reaction stimulator and an energy output of the reaction. If Rossi minimized the stimulation by heat in his design, then the EMF stimulation could provide positive control without interference from heat as an output product. A reactor runaway occurs when the output of heat also becomes a stimulator of the reaction.

    Something that Rossi did in the design of the Hot Cat made that type of reactor less reactive to the heat it produces and more dependent on EMF stimulation. It might be the very high temperatures at which the Hot Cat and Quark operate. That high temperature may be outside the reactor's heat-stimulation zone.

    Someone should ask Rossi whether, if he turns off the EMF stimulation, the Quark would turn off with no heat after death. He has said that the Quark is always in SSM mode, and that is already telling.

    • Rene

      A correction: Rossi said the Quark is never in SSM, and that makes sense given past problems. SSM appears to be a complex dance to keep a would-be exponential reaction linear.
      My guess is that the design of the Quark, with its very small mass, lets the reaction go exponential, and that some intrinsic property (maybe a strong terawave reflection backfeed) quenches the reaction. I don't believe the Quark is running continuously; instead it is a series of very short duration reactions happening 50/60 times a second.

  • Zephir

    /* Basically, the lattice physically shepherds the constituent particles to movement along a constricted plane? */

    Not just a plane but a single 1D line – which is something that cannot really happen inside chaotic plasma, no matter how dense and hot it is: due to the random distribution of particle locations and momenta, a coincident collision of multiple atoms at the same moment is extremely rare. Which is why the tokamak is just a mindless waste of energy.

    Inside random plasma we have a few options for increasing the probability of collinear particle collisions: for example, the fast expansion of gas through a narrow nozzle into a vacuum, which eliminates turbulence (some XUV lasers are already using this option). Or acceleration with a linear voltage gradient. Or – as Holmlid is already doing – excitation with a laser beam.

    The laser beam is perfectly collinear by itself, so the particles get their momenta oriented even though their locations still remain random. For this reason the laser activation of fusion requires considerably higher energy densities, and it also leaves a higher amount of ionizing particles, which is also a waste of energy.

    For this reason Holmlid's brand of fusion has very little in common with either cold fusion within a metal lattice or laser fusion as practiced in the NIF or Nova experiments, because those arrangements utilize inertial confinement of plasma, i.e. its fast 3D compression inside a gold target called a hohlraum, which leads to chaotic 3D particle collisions again. And the NIF does its very best to make the collisions 3D by arranging the beams from all sides at the same moment – a reliable way to kill laser-initiated fusion.

    For cold fusion, the low dimensionality of collisions is simply the key to success.

  • Zephir

    /* This is close to Widom-Larson hypostulatates, if I am correct? */

    Could you be more specific in this case, instead of me? I'm indeed interested in all the connection points among existing theories of cold fusion.

    There are two ways of achieving high energy density: we can collide atoms randomly and increase their speed in the process, which is the wasteful approach.

    Or we can collide long chains of atoms in such a way that they get stuck/jammed along a single line at the moment of collision. At that moment the long stack of atoms serves as a piston or needle, and the whole combined momentum of many atoms gets concentrated into a very small volume, so that the Coulomb barrier is literally pierced there.

    I.e. we aren’t increasing the energy of atoms, but their count and combined inertia at the moment of collision.

  • sam
  • Axil Axil

    New post on EMF stimulation.


    As follows:

    One of the mechanisms that seems to work in stimulating the LENR reaction is EMF stimulation. There are a number of threads of research that might be tied together to get a handle on what the character and structure of this stimulation could be.

    One of these threads is the Superwave, patented in 2003 by Energetics Technologies, L.L.C.


    The patent depicts the Superwave waveform as follows:


    This waveform seems to help in starting the LENR reaction in electrolytic based systems.

    It looks like a fractal-based waveform, with the primary wave repeated in superposition as a half or quarter wave of the primary wave, in iterative fashion.

    But what is the goal of the Superwave? I believe that the goal is to reach a very high frequency in the terahertz range using simple electronics, in order to drive, through resonance, the dipole motion of the plasmons that are the fundamental power source of the LENR reaction.

    Letts and Cravens found a number of LENR-active resonances by using two lasers that produced beats through interference. They found a number of LENR stimulation frequencies in the terahertz region: 8.4, 14.5, 14.75 and 15.3 THz, and a broad resonance peak at 20 and 21.4 THz.

    By using the Superwave format, the goal is to get to one of these LENR active resonances at the smallest fractal resolution.

    To build our Superwave, let's choose the 20 THz frequency as the smallest waveform in our compound Superwave format.

    We use the half-wave fractal pattern; that means the sine waves that form the Superwave go like this:

    20 THz, 10, 5, 2.5, 1.25, 0.625, 0.3125, 0.15625 THz, 78.125 GHz, 39.0625 GHz, and so on until we get down to a low frequency, or 20 harmonics. This Superwave form will resolve to a square wave or a sawtooth wave.
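    The halving ladder above is simple to generate programmatically. A minimal sketch in Python (the function name and the 20-step depth are just illustrative choices following the "20 harmonics" figure mentioned here):

```python
# Generate the hypothetical Superwave frequency ladder: start at 20 THz
# and halve repeatedly, printing each component in THz or GHz.

def superwave_frequencies(start_thz=20.0, steps=20):
    """Return the list of component frequencies in THz, each half the last."""
    freqs = []
    f = start_thz
    for _ in range(steps):
        freqs.append(f)
        f /= 2.0
    return freqs

freqs = superwave_frequencies()
for f in freqs:
    if f >= 1.0:
        print(f"{f:g} THz")
    else:
        print(f"{f * 1000:g} GHz")
```

    The ninth step lands at 78.125 GHz and the tenth at 39.0625 GHz, matching the sequence quoted above.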

    The harmonics that Rossi used in the Lugano test looked like a combined half-wave and quarter-wave mix.

    The Lugano report said that “The figure reveals that all the most important harmonics are contained within the 20th harmonic, and, therefore, that all the wave shape harmonics input to the system lie within the PCE’s measuring range”

    • Zephir

      The scalar-wave mechanism of cold fusion utilizes frequencies in the radio-wave (molecular), megahertz (orbital) and terahertz (nuclear transition) spectra. During cold fusion we need them all: the radio frequencies for phonon-wave-induced collisions of atoms with one another within the atom lattice, the TV frequencies for resonance of transverse and longitudinal Rydberg waves spreading along individual orbitals, and the terahertz frequencies for inducing resonance of orbitals within atom nuclei. All these fractal pieces of matter must vibrate wildly in unison in order to raise the probability of their mutual collisions, followed by multiplication of momentum (the Astroblaster ball effect).
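      The "Astroblaster ball effect" refers to the stacked-ball toy in which sequential elastic collisions multiply the speed of the lightest ball. A rough sketch of that toy physics (the four-ball stack and the 100:1 mass ratio between layers are illustrative assumptions; the analogy to nuclear collisions is Zephir's conjecture, not established physics):

```python
# 1D elastic collisions in a dropped ball stack ("Astroblaster" toy).
# All balls fall at speed v; the bottom ball rebounds off the ground at +v,
# then each collision is treated pairwise from the bottom up.

def elastic_collision(m1, v1, m2, v2):
    """Return post-collision velocities (v1', v2') for a 1D elastic hit."""
    v1p = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    v2p = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return v1p, v2p

def astroblaster_top_speed(n_balls=4, mass_ratio=100.0, v=1.0):
    """Speed of the top (lightest) ball after the collision cascade."""
    masses = [mass_ratio ** (n_balls - 1 - i) for i in range(n_balls)]
    below_v = v  # bottom ball moving up after its bounce
    for i in range(1, n_balls):
        # ball i is still falling at -v when the ball below reaches it
        _, below_v = elastic_collision(masses[i - 1], below_v, masses[i], -v)
    return below_v

print(astroblaster_top_speed())  # approaches the ideal 2**4 - 1 = 15x limit
```

      With a 100:1 mass ratio the top ball leaves at roughly 14.5 times the drop speed, close to the ideal 15x limit for four balls.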


      Practically, we just need a high-frequency discharge with a high portion of harmonic frequencies (which occur in rectangular pulses). Such high harmonics naturally emerge during fast interruption of current, for example during plasma electrolysis, where the boiling vapor interrupts and chops the current in a very irregular way. Scalar waves emerge when the current gets interrupted fast, not just alternated fast.

      IMO it's worth recalling in this connection that Nikola Tesla achieved very high frequency pulses using a magnetic interrupter: the magnetic field makes the discharge unstable, so it decays faster. Most of the scalar-wave phenomena he observed were with this arrangement. http://beforeitsnews.com/free-energy/2011/04/tesla-coils-unleash-the-aether-576485.html


  • Axil Axil

    The BEC mechanism in the polariton is the dipole motion that is ENTANGLED with the photons in the photon-based soliton. Shared EMF synchronization between dipoles is the mechanism that enforces high-temperature BEC in polaritons.



    for an example of high temperature BEC in polaritons.

  • Axil Axil

    Regarding: Hydronion, or Hyd (it is actually a nucleus that hides …) and any other hydrogen particle based theory.

    The issue with new hydrogen-based composite particles is that there are instances of LENR and transmutation that occur without hydrogen being present. Pure electrical discharges, as per Proton 21 and Ken Shoulders, are examples.

    These spark based LENR reactions happen at many thousands of degrees centigrade in a dusty plasma.

  • Axil Axil

    For a while now, I have thought that R. Mills has not seen the whole picture. This concentration on “effect” has stopped Mills from producing useful results for the last 20 years. Viewing effect as cause is pervasive in LENR. Mills is a good experimentalist; looking at his data and ideas just requires selectivity.

    What is confused in LENR is cause and effect. Oftentimes, effects are mistaken for the cause of LENR, but in fact the effect is not the cause. For example, this might be going on with the hydrino. In more detail, what produces the hydrino?

    In the Fractional Quantum Hall Effect, the Hall conductance shows a fractional change as a result of a change in a strong magnetic field. The reason for this is the creation of the composite fermion, an electron/magnetic-flux quasiparticle. The magnetic field produces quasiparticles that change the nature of the electron.

    A strong EMF field could be changing the nature of the electrons in the presence of a catalyst that produces a magnetic effect, forming hydrinos, which might be composite fermions.


    The magnetic flux quanta could result in a modification of electron orbitals, as Mills observes. But the cause is an applied magnetic field; the hydrino is the effect of that magnetic field.

    If the magnetic field is the true cause of gainful energy production and the hydrino also appears as a result of the magnetic field, it is possible that the hydrino is mistakenly assumed to be the cause of gainful energy production. But the real cause, which works at a deeper and irreducible fundamental level, is the applied magnetic field.

  • Roland

    How about if we start with a fresh metaphor.

    The current metaphor parallels human experience with uranium and plutonium fission.

    Humans make a truly massive effort (WWII) to find and define the threshold conditions that will allow the fission reaction to begin; and when it finally does start, it wants to keep right on going until it melts down the apparatus; then we figure out how to make it stop.

    When a fission reactor threatens to go exponential ya gotta soak up neutrons in a hurry, or run.

    The crew that watched a 300 lb. stainless steel Dewar flask turn into a puddle on the floor all by itself (loaded with LENR fuel but with no stimulus applied yet) hastily exited the lab. In a decision that still confounds me, given the sacrifices others have been willing to make in pursuit of knowledge, they didn't go ‘that was incredible, let's figure out what just happened and make it happen again’; fear prevailed and they stopped in their tracks.

    The majority of successful replicators since P & F have experienced a melt down or two, some of which have been completely inexplicable.

    So, on the surface of things there appear to be excellent reasons to apply the fission metaphor to hydrogen in metal lattice LENR reactions.

    There is, however, another fission reactor design that may make for a more useful metaphor when considering the peculiarities of the Quark.

    A thorium reactor remains relatively inert no matter how much thorium you physically assemble; there is no mass threshold, as with plutonium, where simply crossing it will result in a nuclear explosion. A thorium reactor is driven by an external source of neutrons: neutrons ‘on’, the reactor starts; neutrons ‘off’, the reactor stops; feed it more neutrons, it goes ‘faster’.

    Failsafe; good engineering…

    The driver, Rossi’s term, is external to the Quark; turn on the driver the Quark starts in seconds, turn off the driver the Quark stops in seconds, turn ‘up’ the driver and the Quark runs ‘faster’. Failsafe.

    This being LENR, things then immediately become more complex; the driver consumes 0.5 W as long as the Quark is on.

    No matter whether the output is 100 W or 10 W, the average input power remains the same, while it's the COP that goes up and down.

    The Quark's output can be tuned to produce two distinct modalities: photons (infrared, visible, and possibly more energetic) and electrons. The driver can modulate the Quark's output between various percentages of energetic photons and electrons without altering the average input power of the driver.

    Thorium reactors respond linearly to input power, Quarks respond non-linearly to input power.

    Thorium reactors are driven by something dead simple.

    Quarks are driven by ‘something’ subtle and complex.

  • Zephir

    I don't understand how the hydrino could generate energy and yet remain unnoticed in nature, being the most stable form of hydrogen. The most stable form of matter is simply the ash of every reaction, and it should occur everywhere…

    IMO the fundamental quantum state is the most stable form of matter instead, due to omnipresent vacuum fluctuations, which keep everything in motion. If some sub-quantum states could exist by some miracle (Mills says that spherical orbitals radiate energy poorly, so they're especially prone to forbidden electron transitions), then they must be very unstable and as such endothermic: their formation would consume energy instead of generating it. They would be interesting from a theoretical perspective – but practical?

    If I take Mills' experiments seriously, I'm forced to consider some fusion mechanism behind them instead of the hydrino.

  • Roland

    Mills predicted the existence of dark matter based on his GUT-CP over a decade before the data forced consideration of this otherwise unforeseen but now widely accepted cosmological description.

    In Mills’ view entropy rendered the vast majority of hydrogen atoms into hydrinos aeons ago and they are that dark matter.

    Hydrinos revert to hydrogen when external energies cause that transformation; i.e. a photon of sufficient energy impacts the geometrical electron and ‘lifts’ it back up to the lowest energy state described for it in quantum theory. Outside the occurrence of such an event hydrinos are largely inert, they don’t form molecules or release any detectable energies.

    Dark matter was detected by its gravitational effects.

    In the low energy environment of interstellar space hydrino to hydrogen transformations appear to be a relatively rare event, most of the mass of the universe is ‘dark’ and, according to Mills, is likely to remain that way.

    I suspect Mills would have a much wider following if he was a gifted lecturer, unfortunately this is not the case as he presents as simultaneously wooden and arrogant on film.

  • Axil Axil

    By Ethan Siegel

    There’s a beautiful, elegant idea that’s out there in physics: that everything we see, perceive, and interact with in this Universe is just a different manifestation of the same fundamental force in some way. Advances towards this end have appeared before: the discovery that the scores of different atoms were all made of protons, neutrons and electrons; the discovery that just four fundamental forces (the gravitational, electromagnetic and strong and weak nuclear forces) were behind every single phenomenon in the Universe; the further discovery that a single equation (the Standard Model Lagrangian) perfectly described three of them, and even unified two of them — the electromagnetic and the weak force — into a single force: the electroweak force. Could there be a single, unified force that all the different forces are just different manifestations of?

    Unification was originally a dream of Einstein’s, among others. Maxwell had unified the phenomena of electricity and magnetism into a single one (electromagnetism), and there was hope that there may be an even more fundamental conception than that. Back when there were only two known forces, General Relativity (gravity) and Maxwell’s equations (electromagnetism), unifying them into a single, classical framework was the goal of a great many top theorists of the day. For a time, it seemed that nature was getting simpler and headed towards fewer — not more — fundamental components to the Universe. Yet in quick succession in the 1920s, 30s, 40s and 50s that began to fall apart:

    • New subatomic particles – the muon, the neutrino and a whole slew of mesons – began to be discovered.

    • Quantum mechanics, radioactivity and nuclear fusion and fission brought not one but two new fundamental forces: the weak and strong nuclear forces.

    • And deep inelastic scattering experiments began to reveal that even protons and neutrons had component structure to them: the quarks and gluons.

    By the end of the 1960s, it had become clear that there were dozens of fundamental particles, governed by four independent forces that were quite distinct from one another.

    At very high energies, however, around ~100 GeV (or approximately 10^13 times the ambient energy at room temperature), the weak nuclear force and the electromagnetic force quite clearly become two different manifestations of the same fundamental force. You might ask, then, if it's possible that at even higher energies, the other forces unify? The first one to consider is the strong nuclear force, since it's also a part of the Standard Model like the electromagnetic and the weak force. There are a few facts that seem to support this idea:

    • The charges of the proton (governed by the strong force) and the electron (governed by the electromagnetic) cancel exactly, hinting that there might be some symmetry there.

    • The coupling constants for the strong, weak and electromagnetic forces, which change as a function of energy, almost meet at one single, high-energy point if you extrapolate to higher energies.

    • And the additional physics that this unification brings along with it allows potential solutions to problems like why neutrinos have small-but-nonzero masses and why the Universe has a matter-antimatter asymmetry.

    It's an incredible, tantalizing idea. In fact, before string theory was the major theoretical game in town, grand unification and grand unified theories (GUTs) were all the rage. But there are some big problems with these ideas, too. For one, the new particles that were predicted were of hopelessly high energies: around 10^15 to 10^16 GeV, or trillions of times the energies the LHC produces. For another, almost all of the GUTs you can design lead to particles undergoing flavor-changing neutral currents, which are certain types of decays forbidden in the Standard Model and never observed in nature. Another prediction of almost all GUTs is the existence of proton decay, on timescales of around ~10^30 years. You might think, since our Universe is only around 14 billion years old, this isn't a concern. But if you can get ~10^30 protons together and wait one year, you should see a decay, because decays work probabilistically.

    Detectors like Kamiokande and its successors are sensitive to this exact type of decay, and we fill them with water (containing two protons in the form of hydrogen atoms for every molecule) and wait. We've determined, experimentally, that if the proton does decay, it has a lifetime of at least ~10^35 years, meaning that most GUTs — including the simplest one — are ruled out. And the story gets worse from there, if you take a skeptical look at the facts. The single “point” that the three forces almost meet at only looks like a point on a logarithmic scale, when you zoom out. But so do any three mutually non-parallel lines; you can try it for yourself by drawing three line segments, extending them in both directions until they all intersect and then zooming out. The small-but-nonzero masses for neutrinos can be explained by any see-saw mechanism and/or by the MNS matrix; there's nothing special about the one arising from GUTs. And the explanation for the matter-antimatter asymmetry would result in an overproduction of magnetic monopoles as well, which are not observed to exist in our Universe.
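    The probabilistic argument is easy to check with a back-of-the-envelope calculation. A sketch in Python, counting only the two hydrogen protons per water molecule as described above (the 50 kton detector mass is an illustrative assumption, roughly Super-Kamiokande scale):

```python
# Expected proton decays per year in a water detector, assuming exponential
# decay with mean lifetime tau; for tau >> 1 year the rate is ~ N / tau.

AVOGADRO = 6.022e23
WATER_MOLAR_MASS_G = 18.0

def expected_decays_per_year(water_mass_kton, tau_years, protons_per_molecule=2):
    """Expected decays/year, counting the two hydrogen protons per H2O."""
    grams = water_mass_kton * 1e9          # 1 kton = 1e9 g
    molecules = grams / WATER_MOLAR_MASS_G * AVOGADRO
    protons = molecules * protons_per_molecule
    return protons / tau_years

# 50 kton of water at the 1e35-year lifetime bound quoted above:
print(expected_decays_per_year(50, 1e35))
```

    At the current lifetime bound, even a detector of this size expects only a few decays per century, which is why these searches take decades.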

    It may still turn out that grand unification is correct, and that it's an important step on the road to a Theory Of Everything: the ultimate holy grail of many theoretical physicists. But it may also turn out that nature doesn't unify at high energies, and that our bias towards simplicity, elegance and more unification is completely wrongheaded and has nothing to do with our physical Universe. In science, as in all things, we cannot afford to be driven by our own preconceptions of how things ought to be. Rather, we owe it to ourselves to view the Universe exactly as it is, and to listen to the story it tells us about itself. It might not be comforting, especially initially, but other than the motivation brought on by electric charges being the same across quarks, leptons and bosons, there's no compelling reason to think grand unification is anything other than a theoretical curiosity and a physical dead-end.

    • Fedir Mykhaylov

      Axil. Are you familiar with the hypothesis of Mr. Ratis about neutrons and the dineutron? A deuteron-like bound state of two neutrons and neutrinos.

  • Roland

    To the degree that predictions based on his theory have subsequently been experimentally or observationally validated, Mills must be taken seriously, whatever eventual shortcomings of his GUT-CP time might reveal; any review of his predictions that have been testable to date has shown the theory to be correct.

    In many ways the most meaningful criticism of his GUT-CP has been the lack of a viable explanation for the multiple laboratory validations of LENR.

    Irony abounds.

  • Roland

    There was a central philosophical debate in the early 20th century about the impact of a process they named reification on our perceptions and thought processes.

    The answer was summarized thusly.

    Existence precedes essence.

    Reification describes the subjective process of ‘thingifying’.

    A map is not the landscape.

    Our culture makes symbol using creatures of us; we imbibe languages, belief structures, reality descriptors, rituals of belonging and the unconscious memes of theology, myth and family with every breath.

    In Newton's day the universe was assumed to follow Aristotle's dictate that the world is rational and therefore amenable to dissection with the pure intellectual tools of symbology alone.

    To the best of my knowledge there is no sphere of human endeavour less suited to the strictures of reification than sub-atomic ‘particles’; for example, the term Higgs boson reifies a field effect that probably pervades the entire cosmos.

    At a deeper level high energy physics has successfully reified the Higgs field by inducing it to express as a boson; yet to mistake this boson for the field is to fundamentally misapprehend the enormous implications of the existence of the field in the temporary clamour of producing, apparently, a few thingy bosons.

    The Higgs field, specifically, imparts the emergent property of mass to all the ‘massy particles’ in the ‘known’ universe.

    Run the math on the implied energies of a universal Higgs field.

    None of this, of course, specifically addresses the ‘mechanism’ of neutron decay that you've raised; rather, I hope it serves as a reminder that on the back of a decidedly rational attack on problems the answer often arrives as insight.

    Thereby, existence precedes essence.

    I would posit that the only reason this works at all is because we are already that.

    Good luck.

  • Axil Axil

    Maybe Mills will bring this latest work to reality if he stays true to the truth.

    When Papp contested in court with Feynman over the killing of people at Papp's demo, dear Richard lost real bad through ignorance and arrogance… figure that, and draw some lessons.

  • Roland

    No and no…

    The muses are rallying to a goodly cause and the pen has its, unexpected, uses.

    The pleasure is mutual; which is why I keep engaging your active intellect.

    I am also, of course, engaging you to future, still unknowable, purpose.

  • Zephir

    Thank you for your heartwarming feedback indeed.

  • Zephir

    Yes – we have a lot of iron and nickel in the universe, because these elements are products of many exothermic nuclear reactions – but still no hydrino meteorites are raining on our heads. Which is strange.

  • Axil Axil

    A Bose condensate at any temperature might be formed through synchronization of the dipole motion of particles. Low temperatures are not required to get all the bosons vibrating in sync – like a laser.

  • Axil Axil
  • sam

    Found this on Physics World blog.

    Jelle Boersma
    Nov 9, 2008 at 4:23 am
    Perhaps the following theoretical idea related to how cold fusion may occur is highly naive, but since I could not find any documentation discussing it, the only way to find out is to bring it up. Thanks for any feedback.

    Could it be that cold fusion is facilitated by quantum entanglement of two deuterium nuclei with external degrees of freedom (e.g. the phonons in the metal lattice in which the D-nuclei are dissolved)? The idea is that if two D-nuclei are entangled with independent exterior degrees of freedom, then the density matrix describing the subsystem of the two D-nuclei will be diagonal, and the interaction Hamiltonian vanishes even when the two nuclei have overlapping wave functions in space and time. In other words, the electrostatic repulsion between two D-nuclei could be temporarily neutralized, along with the weak and the strong interactions.

    For two decohered nuclei at the same location, it would take recoherence (through alignment of the exterior degrees of freedom with which the two nuclei are entangled) to restore the interactions. If the two nuclei recohere with sufficient overlap, so that the strong attraction exceeds the electrostatic repulsion, then fusion may occur.

  • sam

    From Ego Out blog

    Axil – April 1, 2017 at 12:53 PM
    Rossi et al. are confusing cause and effect. The strong and the weak force produce nuclear change, and the subatomic particles are the effects of how those forces function. The strong and the weak force produce the pions, muons, and mesons that Rossi is now factoring into his theory. But these particles are just the effects of what the strong force is doing in LENR. LENR is a condition in which the strong force changes the way it behaves. The particles are the results of this change in behavior.

    Professional science states that the fundamental forces of nature cannot change unless they are affected by the application of extreme energies. If enough energy is present, then the fundamental forces will gradually become unified. This is the main tenet of supersymmetry. This misconception is where science is going wrong in its understanding of reality. The action of the fundamental forces can be changed by special, very low energy electromagnetic formatting.

    As witnessed by LENR, the fundamental forces do not behave in this high energy driven way. As Rossi states, these forces change when a special type of magnetism is applied to the fundamental forces of nature. Rossi has picked the quadrupole magnetic force as the factor that changes the action of the fundamental forces. This pick is wrong. Informed by other LENR experimentation, we know that the proper LENR active magnetic force format is the monopole magnetic force.

    But even with this small bit of theoretical misdirection, we must give him his due. Rossi is very close to having LENR theory correct in its most basic aspects.