Charting the Nickel-Lithium-Hydrogen Workspace: 3,600 Variables Involved

Alan Smith has posted a short slideshow on the LFH website which shows the many variables involved in any possible nickel-lithium-hydrogen LENR reaction. The full document can be read here:

Alan provides a list of some of the important variables in the categories of Reactor Parameters, Fuel and Pre-treatment, and Heat and Power, and calculates that there are around 3,600 variables involved in building and experimenting with a Rossi-type reactor, making it a hard task to hit the sweet spot. He writes:

Thus it becomes obvious that without the luxury of a big team, a dedicated workspace and a large
budget – the nuclear physics equivalent of an infinite number of monkeys with typewriters – the
successful creation of a working LENR system involves attempting to minimise the unknowns
by careful study of the available patents and papers, to create a practical and logical series of
experiments where only one parameter is varied each time. Careful record keeping and data
logging thus becomes a ‘must do’.

We have now seen many E-Cat replication attempts turn up with either marginal or no excess heat at all — illustrating the point that Alan is making. On the other hand, there have been some experimenters who have reported clear success in obtaining excess heat in some experiments (e.g. Parkhomov, Songsheng Jiang, Stepanov), giving an indication that it is not an impossible task. But Alan’s point regarding consistent, systematic experimentation with careful record keeping is critical if we are to narrow down the parameters for success. It’s a big task and not easy to do with a dispersed community of replicators speaking different languages and with different motivations. Is it possible to have the coordination and cooperation needed for open-source researchers to crack the code behind the Rossi effect?

–THE-EXPERIMENTAL-SPACE.pdf
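To get a feel for why a number like 3,600 arises, and why varying one parameter at a time is the only practical way through it, here is a small sketch. The parameter categories and option counts below are invented for illustration (they are not Alan Smith's actual list); the point is the arithmetic of a full grid versus a one-factor-at-a-time sweep.

```python
from math import prod

# Hypothetical parameters and the number of discrete settings for each.
# These names and counts are assumptions chosen for illustration only.
parameters = {
    "reactor tube material": 3,
    "nickel powder type": 4,
    "lithium compound": 3,
    "fuel pre-treatment": 5,
    "hydrogen pressure": 5,
    "temperature profile": 4,
}

# Exhaustive grid: every combination of every setting.
full_grid = prod(parameters.values())

# One-factor-at-a-time (OFAT): run one baseline, then vary each
# parameter through its remaining settings while holding the rest fixed.
ofat = 1 + sum(n - 1 for n in parameters.values())

print(f"full grid:  {full_grid} runs")  # 3*4*3*5*5*4 = 3600
print(f"OFAT sweep: {ofat} runs")       # 1 + (2+3+2+4+4+3) = 19
```

The catch, of course, is that an OFAT sweep only finds a "sweet spot" if the parameters act independently; if success requires several parameters to be right simultaneously, the experimenter is pushed back toward the full grid, which is where careful record keeping and coordination across replicators pay off.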

  • Zephir

    Work smarter, not harder – and listen to those who understand the subject.

  • Ophelia Rump

    That seems irrefutable the way you put it, until you consider that the field would not exist today were it not for the good old-fashioned way. Using predictive methods to find something which your methods say does not exist is quite possibly an insurmountable variable in itself.

  • AdrianAshfield

    What you suggest only works if you have a reasonable understanding of the process. While there are dozens of theories none has yet been accepted. The process may be based on a new phenomenon.

    The danger of believing in models when you don’t have the facts is well illustrated by the IPCC computer models of the climate. Many are adamant the “scientific” models MUST be right even when they are not supported by observations.

  • SG

    You make some good points. Bear in mind that many if not most LENR supporters and experimentalists hail from the computer and electrical engineering fields, much more so than the field of physics. The physicists as a whole have largely turned their backs on this phenomenon. I think initial Edisonian-type experiments are necessary until at least a rudimentary theory can be developed. Then, I believe you are right, computer modeling will be the way forward to optimize the effect.

    • Alan DeAngelis

      Some of these optimization techniques might be useful.
      But the standard operating procedure is for the “brilliant” to “borrow” the optimum conditions that the “idiot” came up with and create an ipso facto supercomputer model of how they “predicted” this optimum.

    • Roland

      It would be of tremendous service, granting that your self-description is reasonably accurate, to us E-catworlders if you would take the time to examine a particular software suite for modelling molecules; it’s available, in late beta, as a free download, in the hope of engaging relevantly skilled individuals in extending the boundaries of understanding.

      What makes this particular software of interest to those gathered here is the wider implications of the underlying GUT-CP. This underlying theory has been substantiated in the following regard: it led to two cosmological predictions that, at the time, were considered absurd but were subsequently validated by astronomical observations.

      To wit: that most of the mass in the universe would turn out to give no signature of its existence other than its gravitational effects on the visible universe (dark matter, as included in a complete explanation of what and why), and that the matter in the universe would be found to still be under acceleration away from the putative ‘centre’ (the argument had previously circled around whether the inertia imparted by the big bang would lead to continuous expansion into infinity or eventual collapse back to some order of singularity).

      On the surface of things there appear to be a number of interesting validations performed in support of the broader GUT-CP. A confirmation of the efficacy of the aforementioned software, by one or more persons with the background to properly evaluate the claims for it (granting an open mind, as a few oxen may be gored along the way), would be of significance to many of us.

      My, and hopefully our, thanks to those who take up this burden, as there may just be a very, very large dog attached to the tip of this particular tail.

  • Ryan

    Maybe they could leverage some of the newer technology being used to figure out whether a specific combination of materials will work better than others for desired specifics. It looks like Los Alamos has come up with a system like that to help filter through possible variables and find likely good candidates for specific goals.

    With something like that they could even broaden the scope and see if other material compositions might garner better results. It might also be a good way to narrow down some of the ideas on LENR. If they assume that materials need properties x and y to produce the desired effects, and the system pushes out a set of materials that should be optimal for that yet none of them works, then they can conclude that those factors are perhaps not important and move on to other ideas more quickly.

  • Alan Smith

    Bless you. You did notice I said ‘at least 10’ meaning 10X the whole group. If I had mentioned a need for 360k experiments I suspect everyone would pack up and go home. 🙂

    • Omega Z

      Yes, this is not for the faint of heart. It could easily be a lifetime commitment.

      • Alan Smith

        Well, if we knew what we were doing, it wouldn’t be research.

  • Curbina

    I am aware of the importance of models and simulation, but I often see that models become a surrogate for reality, and researchers lose perspective to the point that when observations don’t fit the model, it is the observation, and not the model, that is put in doubt. I have witnessed people believe their models to that degree while remaining oblivious to the models’ weaknesses. LENR is a phenomenon that has been observed, and is still observed, yet is not predicted by any model; thus it is a case of confusing the model with reality. In this case we need to be doubtful of the model, not of the experimental observation.

    • Alan DeAngelis

      Yeah, like the simple minded approach of the deuterium-palladium system theorists who assume that d-d fusion is taking place because deuterium is being consumed and helium is being created (even though no gamma rays are seen), completely ignoring the fact that deuterium can transmute heavier elements (the Mitsubishi experiments). Therefore, it may be going through an intermediate.

  • Roland

    This is the tip of the iceberg; the potential variables in the single category of EM stimulation of the reaction, in the absence of some overarching theoretical guidance to narrow the search, dwarf the possible permutations of the physical parameters.

    I would go one step further and posit that the correct EM stimulation will produce a result across a broad range of the physical parameters, and that a fairly complete understanding of the mechanisms of EM stimulation is foundational to the advanced E-cat designs, especially as this relates to running in controlled self-sustain mode for lengthy periods.

  • Warthog

    Sorry, but BS. Science is still an experimental endeavor. NO computer model is worth the electrons to run the computers UNLESS verified by experiment. Garbage in = garbage out is still very true.

  • Jimmy, the lack of a theory AND the general lack of acceptance of the field are exactly why it probably had to be a person like Rossi, doing it the old-style way with systematic and tiresome step-by-step experimental work, changing one parameter at a time, who might have succeeded. Funny.

    What could be done with computers, though (as has been suggested), would be to use AI to gather all the information and data in published LENR papers and work it through to come up with the essentials, and even suggest new potentially viable approaches.