This was triggered by Jed Rothwell posting a link to http://amasci.com/weird/skepquot.html , which is Bill Beaty’s collection of sceptical quotes. Here are a few of them, and you might find it interesting to consider the truths contained in the full collection:
“The most erroneous stories are those we think we know best — and therefore never scrutinize or question.” -Stephen Jay Gould
“Science today is locked into paradigms. Every avenue is blocked by beliefs that are wrong, and if you try to get anything published by a journal today, you will run against a paradigm and the editors will turn it down.” – Sir Fred Hoyle
“Theories have four stages of acceptance: i) this is worthless nonsense; ii) this is an interesting, but perverse, point of view; iii) this is true, but quite unimportant; iv) I always said so.” – J.B.S. Haldane, 1963
“Everything we know is only some kind of approximation, because we know that we do not know all the laws yet. Therefore, things must be learned only to be unlearned again or, more likely, to be corrected.” – Richard Feynman
The central point is that there’s a lot we don’t know, and though we do know a lot there are probably things we think we know but don’t, and there’s a lot more we don’t know that we don’t know.
Revolution-Green was started as a place for people to put sceptical comments on the various Free Energy ideas promoted by PESN and others, since such comments were being censored on PESN. Since I was asked to be the moderator, I needed to read everything that was put onto the site, and since I was also commenting I had to explain why the various proposed methods wouldn’t work. Some are easy, where the inventors have mixed up power and energy or have got some calculation wrong. Some are measurement errors: a staple of Free Energy ideas is the pulse motor, using spikes of energy one way or another, and measuring these exactly is very difficult. You can even run computer simulations of pulse motors where the simulation shows a gain in energy from “nowhere”, since fast changes of current or voltage need very small timesteps to get a close-enough approximation to the real waveform. Still, there’s always the chance that our theories are wrong and that a new idea might actually work despite our scepticism, so I needed a faster way of analysing the inventions than simply waiting until a lot of people had invested (and lost) money in them.
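That timestep artefact is easy to demonstrate. Below is a minimal sketch (illustrative circuit values of my own, not from any particular claim) of an ideal lossless LC oscillator stepped with explicit Euler integration: with a coarse timestep the simulation “creates” energy on every step, and refining the timestep makes the spurious gain shrink towards zero.

```python
# Sketch: an ideal (lossless) LC oscillator simulated with explicit Euler.
# Explicit Euler systematically adds a little energy on each step, so a
# coarse timestep shows a dramatic "gain" from nowhere, while a fine one
# stays close to the true (constant) energy. Values are illustrative.

def lc_energy_after(dt, steps, L=1.0, C=1.0, i0=1.0, v0=0.0):
    i, v = i0, v0
    for _ in range(steps):
        di = (v / L) * dt              # dI/dt = V/L
        dv = (-i / C) * dt             # dV/dt = -I/C
        i, v = i + di, v + dv          # explicit Euler update
    return 0.5 * L * i * i + 0.5 * C * v * v   # total stored energy (J)

e_start = 0.5                                   # 0.5 * L * i0^2 joules
e_coarse = lc_energy_after(dt=0.1, steps=100)       # coarse timestep
e_fine = lc_energy_after(dt=0.001, steps=10000)     # same simulated time
print(e_coarse, e_fine)   # coarse run shows a large spurious energy gain
```

The physical circuit conserves energy exactly; only the integration scheme manufactures the excess, which is exactly the trap a naive pulse-motor simulation can fall into.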
Although I’ve taken it as an axiom that mass/energy is conserved, with E=mc² giving the conversion factor between mass and energy, it is of course possible that this is not true. So if a cyclic process is said to break CoE and produce energy from “nothing”, I needed an analysis of the process that pinpoints where that energy enters the system and, maybe more importantly, why it should do so. At the start, I also took it as an axiom that Perpetual Motion was impossible, given that I was taught that a long time ago and have since seen nothing that will continuously produce energy from nothing. There would, however, be no objection if the source of that energy was identifiable and logical.
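For a sense of the scale of that conversion factor, here is a one-line illustrative calculation (a one-gram test mass of my own choosing, nothing more):

```python
# Sketch: the scale of E = m*c^2, for a one-gram test mass.
c = 2.998e8        # speed of light, m/s
m = 1e-3           # one gram, in kg
E = m * c * c      # rest energy in joules
print(E)           # roughly 9e13 J, about 25 GWh
```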
The process I used was thus to quantify the energy and work entering and leaving every stage of the process, and to reduce all the figures to joules. Keeping everything in the same units is tidy and saves errors. If the whole system is supposed to produce energy, then at least one of the sub-processes must produce more joules out than went into that process, and there must be a source for those joules. If you can’t identify that source, show that it has sufficient capacity, and give a reason why the energy will move, then no energy will enter the system. Pretty simple logic, really.
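As a sketch of that bookkeeping (with wholly invented stage figures, purely to show the shape of the audit):

```python
# Sketch: reduce every claimed stage of an energy-producing cycle to
# joules and check the balance. All the stage figures below are invented
# for illustration only.

# (energy in, energy out) per cycle, both in joules
stages = {
    "battery -> motor":  (1200.0, 1100.0),
    "motor -> flywheel": (1100.0, 1050.0),
    "flywheel -> load":  (1050.0, 1000.0),
}

for name, (e_in, e_out) in stages.items():
    gain = e_out - e_in
    if gain > 0:
        print(f"{name}: claims {gain:+.0f} J/cycle - what is the source?")
    else:
        print(f"{name}: {gain:+.0f} J/cycle (losses, as expected)")

net = sum(out - inn for inn, out in stages.values())
print(f"whole cycle: {net:+.0f} J")  # negative: no stage creates energy
```

Any stage whose “out” exceeds its “in” is the one that has to name its source; if none does, the machine cannot gain.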
One of the things we notice when looking at systems of random-direction energy is that the energy disperses: the system tends towards an end-point where the concentration of energy is the same in all the locations the energy can physically reach. Put heat into a system at one point and it will spread out by random walks until the kinetic energy is even everywhere within the system and it is all at the same temperature. On the macroscopic scale we see that heat moves from hotter to colder, and we never see heat energy transferred from a colder place to a hotter one. This is easily confirmed, is obvious, and is enshrined in the 2nd Law of Thermodynamics (2LoT).

Still, look at that first quote from Stephen Jay Gould: this is one of those things that are so obviously true that we don’t question them. If you’ve been following here for a while, though, you’ll have seen that I have questioned this obvious truth at http://revolution-green.com/heat-move-hotter-colder/ since, as I pointed out, this is a random process and so shouldn’t have a bias. At the microscopic level, the temperatures of the source and destination have no effect on the collisions of molecules or on the direction of the energy emerging from those collisions. What is actually happening is that energy is spread out by those random processes so that the average energy concentration tends to equalise; this makes the hotter volumes cooler and the cooler volumes hotter (so yes, overall heat does move from hotter to colder). Our temperature measurement is itself an average over time, and averaging always loses detail. In thermodynamics, though, that detail turns out to be pretty important: what seems impossible when looking at the average state becomes simple to do when looking at the details.
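That equalisation needs no hot-to-cold rule built in; an entirely unbiased toy model shows it. The sketch below (my own toy: ten cells on a ring, each move passing 10% of a randomly chosen cell’s energy to a random neighbour) conserves the total while the hot spot disperses:

```python
import random

# Sketch: unbiased random exchange of energy between neighbouring cells.
# No rule says "heat moves from hot to cold", yet the averages equalise.
random.seed(1)

cells = [100.0] + [0.0] * 9           # all the energy starts in cell 0
for _ in range(20000):
    a = random.randrange(len(cells))               # pick a cell at random
    b = (a + random.choice((-1, 1))) % len(cells)  # pick a random neighbour
    move = cells[a] * 0.1                          # pass on 10% of its energy
    cells[a] -= move
    cells[b] += move

total = sum(cells)
spread = max(cells) - min(cells)
print(total, spread)   # total conserved; spread collapses from the initial 100
```

Each individual move is directionless; the macroscopic hot-to-cold flow is purely a statement about the averages.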
When we talk about temperature, we are always talking about an average, and there is an underlying assumption (not often exposed) that the range of kinetic energies will follow the Maxwell-Boltzmann curve and that the statistics will thus be valid. That is not necessarily true.
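A quick illustration of that point: the two synthetic populations below (both invented for this sketch) give the same “thermometer reading”, the same average energy, while the detail underneath is quite different.

```python
import random
import statistics

# Sketch: two invented populations of molecular kinetic energies with the
# same mean (the same "temperature" reading) but different shapes, so the
# average alone hides the structure underneath.
random.seed(2)
N = 20000

# Energies of a thermal-like population (square of a Gaussian velocity):
thermal = [random.gauss(0.0, 1.0) ** 2 for _ in range(N)]   # mean ~1.0
# A two-level population engineered to have the same mean of 1.0:
two_level = [0.0 if random.random() < 0.5 else 2.0 for _ in range(N)]

m1, m2 = statistics.fmean(thermal), statistics.fmean(two_level)
v1, v2 = statistics.pvariance(thermal), statistics.pvariance(two_level)
print(round(m1, 2), round(m2, 2))  # both "thermometers" read about 1.0
print(round(v1, 2), round(v2, 2))  # but the spreads differ markedly
```

A thermometer reporting only the mean cannot tell these two gases apart, which is exactly the detail the averaging throws away.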
It’s always bothered me that, though atomic and molecular processes seem to be lossless, we seem to have a one-way trip from higher energy concentrations to lower ones, and that the universe is supposed to become gradually more and more random until there is no order left. As things cool down, they condense into liquids and solids (which seem to me to have more order, not less), and of course when I look at the night sky I don’t see matter spread out evenly but instead stars where matter has collected. Gravity and electrostatic forces have imposed order on the universe. A galaxy is more ordered than a random scattering of stars. Here on Earth we don’t have everything mixed up: there are strata of rocks, and specific places we go in order to mine specific minerals. Does entropy always increase, the way the theory insists? Of course, I can show that mathematically, but does the maths reflect what’s actually happening? The derivation uses the idea that we are dealing with random processes (again), but have we taken into account the effect of the known force-fields in that derivation? It seems to me that we haven’t, and that the derivation is thus wrong, or at least insufficient to describe the real world.
Going back to the random collisions of a gas at some temperature: if we apply a sufficiently strong force-field, the situation will be modified from the purely random one we base our thermodynamic theories on. Let’s say the gas is a plasma consisting of positive and negative charged particles. If we apply a magnetic field, those particles will gyrate in different directions and can thus be separated. If we apply a strong enough electric field, then no matter what the initial direction of a particle after a collision, it will end up going one direction. We use these principles pretty often, in fact. A mass-spectrometer will sort ionised molecules by mass and energy (get the energy level precise and it will sort by mass), and the high internal electric field of a PN junction in a solar cell will redirect electrons emitted by the photoelectric effect in the depletion zone, ensuring that any emitted electrons end up on the correct electrode and give rise to an external current. It is difficult to argue that these devices don’t work; solar cells are an important source of renewable energy.
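The charge-dependent gyration is straightforward to see by stepping the Lorentz force F = qv × B directly (illustrative unit values for mass, charge and field, chosen for this sketch):

```python
# Sketch: step the Lorentz force F = q v x B for a positive and a negative
# particle in a uniform magnetic field along z. They gyrate in opposite
# senses, which is what lets a field separate the two species.

def gyrate(q, steps=1000, dt=0.001, m=1.0, Bz=1.0):
    vx, vy = 1.0, 0.0          # initial velocity along +x
    x, y = 0.0, 0.0
    for _ in range(steps):
        ax = q * vy * Bz / m   # x component of (q/m) v x B, B = (0, 0, Bz)
        ay = -q * vx * Bz / m  # y component of (q/m) v x B
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y

xp, yp = gyrate(q=+1.0)        # positive charge
xn, yn = gyrate(q=-1.0)        # negative charge
print(yp, yn)   # the two charges curve to opposite sides of the x axis
```

The field does no work on either particle (the force is always perpendicular to the velocity); it only redirects momentum, which is the key property used in the argument above.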
Overall, then, we can see that force-fields reduce the randomness of a process, and that if they are strong enough they can make such processes non-random. The force-fields are conservative; that is to say, they cannot give you back any more energy than you’ve put into them. They can and do, however, change the direction of the momentum carried by the energy. If the processes are no longer random, then the mathematics based on random collisions is no longer valid. It’s quite reasonable, therefore, to speculate that in this non-random situation entropy may in fact decrease, and that a strong enough force-field may produce order from disorder.
This train of thought thus leads to ways of making a physical device that will re-order the random directions of heat into an overall single direction of energy. I’ve covered some of these in http://revolution-green.com/work-and-play/ but of course I haven’t covered all the possible ways. I’m lazy, and have only detailed ones that are relatively easy to actually make. Only relatively, since these sorts of devices need some pretty small working dimensions to do the job, which requires techniques from chip-fabrication. I’m still working on getting real devices, but should have some results soon. Until we have the data, this line of thought is speculation based on logic, and isn’t yet confirmed. If the logic is correct (and no-one has shown that any of the logical steps are wrong) then Perpetual Motion is just as possible at a human scale as it is at the atomic scale, and instead of needing to burn stuff to get a source of energy we can instead simply recycle the energy we already have. For things that store energy, such as lifting bricks to make a house, we’ll need to withdraw energy from the environmental bank for a relatively long time, but for the majority of the work we do (which doesn’t store energy) the energy will return quickly to the environment as waste heat that can again be recycled. In any case, there’s a lot more energy coming from the Sun each day, which will replace the (unusable for now) stored energy. This version of Perpetual Motion has an obvious and easily-identified source for its energy, and of course the environment will cool when you take it.
One question is why this hasn’t been found before, when so many people have tried to achieve Perpetual Motion over the centuries. A big part of the answer is the experience that heat always moves from hotter to colder, which is why the question of why that happens is so important. The daemon is in the detail, and since thermodynamic terms deal in averages, that detail is normally lost. The random walk of energy means that, overall, heat will still move from hotter to colder, but the devices that rectify the direction of that heat energy (and output it as electricity on wires to a load) will appear to be colder than their surroundings, so there is an obvious reason for the heat energy to go into the device. Still, maybe the biggest reason for this not being seen is that people believe it is impossible, and thus the only people who try are crackpots. Generally, though, crackpots don’t have a solid grounding in science and so try magnet-motors, gravity-wheels and other devices that attempt to break CoE and create energy from nothing, since history abounds with claims of success even though no such machine has been shown to work in a proper scientific test.
Given that this idea goes against the consensus, and requires people to look carefully at the details rather than accept the averaged numbers, I accept that it’s only by producing a real device that I can convince people it’s true. Still, once that device is produced and confirmed, I’d expect people to work out better (and cheaper) ways of doing it than I have so far proposed, and over time we’ll be able to mostly recycle energy rather than needing to burn something or other to produce heat that is then converted to electricity in a Carnot-limited process. This is free energy in the same way that solar panels give you free energy: you need to buy (or make) the panels to start with, but after that you pay no more until they need replacing. I can’t see a failure mechanism in the designs I’ve produced, except for atomic diffusion caused by drawing too high a current, so the devices should last a lifetime or more, and since they can be buried out of the sunlight and weather, the risk of failure is low.
Every so often, I have discussions with people who believe that their idea can break CoE, and that all they need is loadsamoney in order to make this amazing device so that everyone has Free Energy ever after (and of course the inventor gets somewhat rich, too). They may find someone to finance them (there are a lot of historical cases where such claims have been backed), but as I see it you don’t get energy out unless it also goes in. There has to be a source of that energy, just as certainly as there needs to be a sink. We use the movement of energy to do work, but the energy itself can’t be created or destroyed. I’ve annoyed several people by showing that their device simply won’t work. They often don’t believe me, either, but keep on trying to eff the ineffable. Maybe one of these days someone will prove me wrong on this, but I doubt that anything easy to make and to understand will show such an anomaly, given the effort that has already gone into trying out such ideas. There may be some anomalies showing up as we become able to measure things to more decimal places, but I don’t really expect CoE to be invalidated, though maybe we’ll find ways in which energy can be transported from (there) to (here) by some new process. It is after all just a matter of tweaking the probabilities. There’s still a lot more unknown than is known.
There’s a bit of a knee-jerk sceptical reaction when some idea goes against what you know is true. Sometimes, though, you need to look closely at what you know to be true, in case you haven’t thought it through enough.