Can the heat death theory be disproved by physical experiment or theory?

In the early stages of the development of thermodynamics, R. J. E. Clausius and W. Thomson (Lord Kelvin) extrapolated the second law of thermodynamics to the universe as a whole and arrived at the absurd "heat death of the universe" theory: the entire universe is undergoing an increase of entropy, and eventually the whole universe will reach thermal equilibrium, entropy will reach its maximum, temperature differences will disappear, and pressure will become uniform.

In the 19th century, few scientists recognized the fallacy of the heat death theory; only L. E. Boltzmann and J. C. Maxwell left records of it in the literature. As early as 1866, only one year after Clausius put forward the proposition that "the entropy of the universe tends toward a maximum," and even before Clausius had developed it into the heat death theory (Clausius announced the heat death of the universe in 1867), Boltzmann noticed that the growth of organisms runs counter to the increase of entropy. He said: "The general struggle for existence of living beings is not a struggle for matter, nor for energy, but for entropy (in this context Boltzmann means that biological processes are a struggle against the increase of entropy; author's note). This struggle becomes available in the transfer of energy from the hot sun to the cold earth. To exploit this transfer as fully as possible, plants spread out the immeasurable areas of their leaves and force the sun's energy to perform chemical syntheses in ways we have no idea how to carry out in the laboratory." In 1895 Boltzmann further put forward the theory of "microscopic fluctuations" to refute the heat death theory.

Maxwell also vaguely realized that there is an energy-controlling mechanism in nature that works against the increase of entropy, but he could not explain this mechanism at the time. He could only suppose that there exists a kind of demon-like being capable of sorting particles in random thermal motion into certain compartments according to a definite order and rule. This is the famous concept of "Maxwell's demon," proposed in 1871.

Because Maxwell's demon was only a conjecture, it certainly could not solve the problem of the heat death of the universe. As Boltzmann noted, the photosynthesis of green plants runs counter to entropy increase only by drawing negative entropy from sunlight, that is, at the expense of a far greater entropy increase in the sun. As for microscopic fluctuations, they are far from sufficient to counter the enormous entropy-increasing processes of the universe (such as the aging and death of stars and the expansion of the universe itself). Thus the heat death theory became a major problem that natural science carried from the 19th century into the 20th.

In 1914, M. Smoluchowski first revealed the flaw in Maxwell's demon. He raised the issue of the demon's metabolism, pointing out that a "demon" intervening in the system must be regarded as part of the system; otherwise the system is not isolated. At the time, however, Smoluchowski's ideas were too crude to convince physicists.

Influenced by Smoluchowski's work, Leo Szilard analyzed the principle of Maxwell's demon in depth. In 1929 Szilard published a paper in the German journal Zeitschrift für Physik, "On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings." Szilard first proposed that any reduction of entropy must be compensated by some physical quantity of the system, and that this compensation is in fact an increase of information. Szilard's work was a forerunner of modern information theory, and he also put forward a formula for calculating the amount of information:

I = -k(W1 ln W1 + W2 ln W2)

where W1 and W2 are thermodynamic probabilities. Szilard also introduced, for the first time, the concept and term "negative entropy," which had never appeared in classical thermodynamics. Szilard's groundbreaking paper was not fully understood at the time, and, even more regrettably, he himself did not continue to explore along this road.
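Szilard's two-outcome formula can be sketched numerically. In this minimal example (the function name and the choice of SI units are illustrative, not from the original), a single binary measurement with equal probabilities, such as the demon learning which half of the vessel a molecule occupies, yields exactly k ln 2:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def szilard_information(w1, w2, k=k_B):
    """Information, expressed as negative entropy, for a two-state
    system with probabilities w1 and w2:
    I = -k (w1 ln w1 + w2 ln w2)."""
    assert abs(w1 + w2 - 1.0) < 1e-12, "probabilities must sum to 1"
    return -k * (w1 * math.log(w1) + w2 * math.log(w2))

# One equiprobable binary measurement is worth k ln 2 of entropy:
I = szilard_information(0.5, 0.5)
print(I)  # about 9.57e-24 J/K, i.e. k_B * ln 2
```

The maximum of the expression at w1 = w2 = 1/2 is why k ln 2 per bit later became the standard conversion factor between information and thermodynamic entropy.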

In 1944 E. Schrödinger, the famous physicist, one of the founders of quantum mechanics, and a Nobel laureate, published What is Life?, in which he clearly discussed the concept of negative entropy, applied it to biological problems, and put forward the famous saying that "organisms live on negative entropy" (or "organisms feed on negative entropy"). Schrödinger wrote: "The only way for an organism to escape death, that is, to stay alive, is to continually draw negative entropy from its environment. We shall soon see that negative entropy is something very positive. What an organism feeds on is negative entropy. Or, to put it more precisely, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive."

The concept of negative entropy was not easily accepted at first. Schrödinger himself wrote plainly: "My remarks on negative entropy have met with doubt and opposition from physicist colleagues. Let me say first that if I had wanted to cater to them, I should have used the term free energy instead in this discussion." That Schrödinger saw the connection between negative entropy and free energy from the very beginning shows his keen eye and depth of thought. If a mechanism is an open system that can continually obtain and accumulate free energy from outside, it will produce negative entropy. Organisms are just such mechanisms: animals obtain free energy (or negative entropy) from food, while green plants obtain it from sunlight. This is truly "living on negative entropy"! Later the famous Russian-American theoretical physicist and popular science writer G. Gamow also discussed this question in a popular book.

II. Entropy and information

The concept of entropy in classical thermodynamics was first put forward by Clausius. It is defined as

dS = dQ/T (1)

that is, as the "heat-temperature quotient," a measure of the irreversibility of a thermodynamic process. Statistical mechanics gives us a deeper understanding of the nature of the concept of entropy. In statistical mechanics, entropy is defined by the Boltzmann relation:

S = k ln W (2)

where W is the probability (thermodynamic probability) of the state of molecular thermal motion. In this way, entropy is a measure of the probability of the random thermal motion of molecules, that is, of the degree of chaos or disorder of molecular thermal motion.

If the object of discussion is not limited to molecular thermal motion, we can also use the concept of entropy to describe the degree of chaos or disorder of any other thing, or of any system not involving molecular thermal motion. This gives another concept of entropy, a generalization of the entropy of thermodynamics and statistical mechanics, which may be called generalized entropy. Generalized entropy can also be defined by the Boltzmann relation, but the W in the formula may now be the number of possible states of any mode of material motion.

Generalized entropy can also be described as the degree of uncertainty about the state of motion of things, which is in fact the concept of entropy in information theory and cybernetics. R. A. Fisher, N. Wiener, and C. E. Shannon expressed this concept mathematically at almost the same time. It, too, is defined in terms of probability:

H = -k Σ Wi ln Wi (3)

When we obtain enough information, the uncertainty about the state of motion of things is eliminated (or reduced); the entropy thus eliminated can be called negative entropy, that is, the amount of information:

I = H1 - H2 (4)

where H1 and H2 are the uncertainties before and after the information is received.
The amount of information represents the degree of order, organizational structure, complexity, specificity or evolution of the system. This is the contradictory opposite of entropy (disorder, uncertainty and chaos), that is, negative entropy.
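The relation between uncertainty and information described above can be sketched in a few lines of Python; the function name and the choice of natural-log units (nats, with k = 1) are illustrative assumptions, not from the original:

```python
import math

def shannon_entropy(probs, k=1.0):
    """Generalized entropy H = -k * sum(p ln p) over a probability
    distribution; with k = 1 the result is in nats."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Uncertainty before a measurement: four equally likely states.
H1 = shannon_entropy([0.25] * 4)   # ln 4
# After partial information: only two states remain equally likely.
H2 = shannon_entropy([0.5, 0.5])   # ln 2
# Information gained = entropy eliminated (negative entropy):
I = H1 - H2
print(I)  # ln 2, i.e. exactly one bit of information
```

Halving the number of equally likely states always yields ln 2 nats, which is the origin of the bit as the natural unit of information.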

L. Brillouin, H. Linschitz, and L. Augenstine made preliminary studies of the relationship between the entropy of information theory and the entropy of thermodynamics. Comparing the mathematical forms of (2) and (4), we have:

I = -ΔS (5)

According to formula (5), as long as the units are converted, a value of negative entropy can be expressed as an amount of information, and an amount of information can likewise be expressed as negative entropy.
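The unit conversion behind formula (5) is concrete: one bit of information corresponds to k ln 2 of thermodynamic entropy, Brillouin's conversion factor. A sketch (the function names are illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def bits_to_entropy(bits):
    """Thermodynamic entropy equivalent of an amount of information:
    each bit corresponds to k_B ln 2 (J/K)."""
    return bits * k_B * math.log(2)

def entropy_to_bits(delta_s):
    """Amount of information equivalent to an entropy decrease delta_s (J/K)."""
    return delta_s / (k_B * math.log(2))

# One bit is a minuscule amount of thermodynamic entropy:
print(bits_to_entropy(1))    # about 9.57e-24 J/K
# Conversely, 1 J/K of entropy corresponds to an enormous number of bits:
print(entropy_to_bits(1.0))  # about 1.04e23 bits
```

The huge disparity between the two scales is why information-theoretic negative entropy and thermodynamic entropy can be numerically equated yet behave so differently in practice, a point the later section on information and energy returns to.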

In the literature, entropy and information have been expressed in many different units and symbols, but there are really only two concepts. One is thermodynamic entropy, which applies only to the thermal motion of molecules or other particles, a specific mode of material motion; it can be obtained from experimental data (empirical physical entropy) or from the statistical theory of molecular motion (theoretical physical entropy). The other is generalized entropy, which comes from information theory and cybernetics and can describe the degree of chaos or disorder of any mode of material motion (including the phenomena of life); its contradictory opposite is called negative entropy or the amount of information, an expression of the complexity or order of organizational structure. The concept of generalized entropy is broader than that of thermodynamic entropy: for thermodynamic processes it reduces to thermodynamic entropy (through unit conversion), whereas thermodynamic entropy cannot be applied to non-thermodynamic processes, because it is tied to the specific mode of motion of particle thermal motion and bears a definite proportional relation to the distribution of heat energy. It is not suitable for processes that involve no conversion of thermal energy. One may say that the concept of thermodynamic entropy is contained within that of generalized entropy.

III. From irreversible-process thermodynamics to dissipative structure theory

In the 1940s a series of new concepts appeared in science that challenged classical thermodynamics. Besides Schrödinger's concept of negative entropy mentioned above and the generalization of the entropy concept in cybernetics and information theory, there was the "non-equilibrium steady state" thermodynamic theory put forward by the Brussels school headed by Ilya Prigogine. In the 1950s this developed further into "irreversible-process thermodynamics," and in the 1970s, finally, into the theory of dissipative structures. A dissipative structure is an ordered structure in space or time maintained, under conditions far from equilibrium, by flows of energy, matter, and information from outside; it changes constantly with the external input and can organize itself, leading to a decrease of the system's own entropy. Prigogine demonstrated the existence of dissipative structures mathematically and studied them in depth with his non-equilibrium, nonlinear thermodynamics. Dissipative structures are of great significance in certain physical and chemical processes, automatic control systems, and biological processes, and they help clarify the organizational structure and ordered growth seen in the phenomena of life. For his outstanding contributions in this field, Prigogine won the Nobel Prize in Chemistry in 1977.

In the 1950s, Prigogine pointed out in his book Introduction to Thermodynamics of Irreversible Processes that the description of non-equilibrium states in irreversible-process thermodynamics "is surprisingly consistent with the remarkable characteristics of biological organisms." "When an organism grows, it actually exhibits the fact that entropy decreases as it develops toward a steady state." "The fact that the organizational structure of organisms generally increases corresponds to a decrease of entropy." Therefore, Prigogine said: "From the point of view of classical thermodynamics the behavior of biological organisms always seemed so strange that the applicability of thermodynamics to such systems was often questioned. We may say that from the thermodynamic viewpoint of open systems and steady states we have gained a better understanding of their principal behaviors." De Groot likewise pointed out that "the (biological) system reaches a minimum rate of entropy production per unit mass in the final stage of growth. In this process entropy itself is decreasing, and it is then that the growth of organizational structure occurs in the organism." "The inherent trend toward complexity that evolutionary theory speaks of in this process is consistent with the entropy decrease mentioned above."

Prigogine and de Groot said that the growth of the organizational structure of organisms corresponds to a decrease of entropy. The entropy referred to here is actually the entropy of information theory (generalized entropy) rather than thermodynamic entropy. It seems Prigogine himself noticed this later; hence, in dissipative structure theory, he carefully avoided using "entropy decrease" or "negative entropy" to refer to ordering. He said only that a dissipative structure depends on the negative entropy flow supplied by its environment to produce order; he never said that ordering amounts to a decrease of the system's entropy. This is Prigogine's rigor: he confined the whole theory of dissipative structures within thermodynamics. Even "non-equilibrium, nonlinear" thermodynamics is still thermodynamics!

However, Prigogine does not pine for the bygone era of classical thermodynamics; he declares that "the main energy of life lies in the future" and is an optimist about what is to come. From a series of his works and lectures we can see that Prigogine is planning a more ambitious goal: to unify the laws of development of the natural sciences, the life sciences, and the social sciences, that is, to march toward a grand unification.

To achieve this unification, it will hardly be possible without thoroughly breaking through the framework of thermodynamics. In fact, Prigogine has already attacked classical thermodynamics from both the non-equilibrium and the nonlinear directions. Although the breakthrough is not yet complete, he has begun to realize the significance of information-theoretic concepts for the development of dissipative structure theory. He himself said that in dissipative structure theory he "used physico-chemical language. Others may prefer to speak of negative feedback, or of automatic regulation. It is therefore feasible to link our discussion closely with information theory." H. J. Bremermann put it more thoroughly: "We cannot infer the structure of living things from energy dissipation alone; what matters more is information." Biological systems and social systems are not thermodynamic dissipative structures but information systems, and only the information-theoretic concept of generalized negative entropy can unify them. If the study of dissipative structures and negative entropy can be combined with the study of information theory and cybernetics, a new breakthrough may follow.

IV. Information thermodynamics

Since the concept of entropy in information theory (generalized entropy) includes the concept of thermodynamic entropy, can we generalize the whole of thermodynamics in information-theoretic terms, or establish a more general theoretical system for the study of information systems, with the thermodynamic system as a special case? The following is a preliminary exploration in this direction.

Thermal systems and communication systems. For a thermal process, if there is no difference or contrast between hot and cold, heat can be neither transferred nor transformed; a single heat source can neither transfer heat nor do work. To make molecules in thermal motion move in a definite direction so as to transfer heat and do work, a cold sink must be used to control the direction of molecular motion and make heat flow from high temperature to low temperature. The function of the cold sink here is to provide information that controls the direction of heat transfer.

From the viewpoint of information theory, the cold sink is a kind of information source. In the process of heat transfer, the cold sink receives part of the heat from the heat source and the disorder of its molecular motion increases; in information-theoretic terms, the heat source is a noise source interfering with the cold sink. In this way, using the concepts and terminology of information theory, we can regard the thermodynamic system "heat source - heat engine - cold sink" as a communication system, and the heat transfer process as a communication process.

The information-theoretic expression of the second law of thermodynamics. In the terms of information theory, the second law states: without obtaining new information from outside, it is impossible, by operating on or transforming information, to increase the amount of information or reduce uncertainty. This information-theoretic expression of the second law has a wider meaning, for it applies to information transmission or conversion in any non-thermodynamic process; it may therefore be called the second law of generalized thermodynamics.

Heat and work. Heat is the irregular random motion of particles, an uncontrolled form of energy; work is a regular form, and the transfer of energy in the form of work can be controlled and directed. One may say that heat is a form of energy without information, while work is a form of energy transfer carrying information. Thus, when the information provided by the cold sink is used, through the heat engine, to control the direction of energy transfer from the heat source, work can be obtained. Work, as an information-carrying form of energy transfer, loses its information when disturbed by noise and is converted into heat. For example, friction, an irregular form of mechanical motion, produces "noise" that destroys information and turns work into heat.

The information-theoretic expression of the second law tells us that any spontaneous thermodynamic process always loses information. Hence work can lose all the information it carries and be completely converted into heat; but, without causing other changes in the surroundings, heat cannot be completely converted into work, because the lost information cannot be replenished unless additional information is supplied from outside. Similarly, electrical energy, light energy, chemical energy, and so on are all information-carrying forms of energy and can be wholly converted into heat; but without additional information from outside, heat cannot be wholly converted into any other information-carrying form of energy.

Bound energy and free energy. The transfer and transformation of energy must be controlled by information. For example, when two objects at the same temperature interact thermodynamically, neither work nor heat transfer is possible, for lack of information. Yet both objects contain thermal energy; energy that cannot be transferred or transformed for lack of information is called bound energy. Waste heat is a kind of bound energy and cannot be used unless additional information is supplied.

When there is a temperature difference between two objects, their thermodynamic interaction produces one-way heat transfer. This is because the cold object provides information to the hot object, controlling the transfer of heat from hot to cold; the portion of energy that can thus be transferred is the exergy. On the other hand, the cold object itself has a certain temperature, with random molecular thermal motion inside, and in interacting with the hot object it is continually disturbed by it, so it cannot provide complete information. When the two objects reach thermal equilibrium at equal temperature, no information is available and energy transfer can no longer take place; the exergy is then zero, leaving only the bound energy, or anergy.

In thermodynamics, the free energy is F = U - TS, where U is the total internal energy. Because a thermodynamic process is disturbed by molecular thermal motion itself, information is lost; that is, because of the existence of the entropy S, the portion TS cannot be transferred or transformed. The term TS is the bound energy.
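The split of internal energy into a free and a bound part is a one-line computation. The sketch below uses purely illustrative numbers (not from the text) and an assumed function name:

```python
def helmholtz_free_energy(u, t, s):
    """Helmholtz free energy F = U - T*S: the portion of the internal
    energy U (J) available for transfer or work at temperature T (K),
    given entropy S (J/K). The remainder T*S is the bound energy."""
    return u - t * s

# Illustrative values:
U = 1000.0   # J, total internal energy
T = 300.0    # K
S = 2.0      # J/K
F = helmholtz_free_energy(U, T, S)
print(F)       # 400.0 J of free energy
print(U - F)   # 600.0 J of bound energy (= T*S)
```

Note how, at fixed U, a larger entropy S enlarges the bound term TS and shrinks the free energy, which is the quantitative content of "information loss reduces the energy available for transfer."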

Reversible and irreversible processes. A reversible process, after going forward and then returning to its initial state along the reverse path, causes no change in the surroundings, and no capacity for energy transfer or transformation is lost. A reversible process is thus, in essence, a process that loses no information.

The ideal Carnot engine can run reversibly because it has no leakage, no friction, and no other losses, hence no loss of information. In the so-called quasi-static process envisioned by thermodynamics, every step is in continuous equilibrium, each change is infinitesimal, and the process is infinitely slow, so that almost no information is lost at any step; hence it is reversible. This corresponds to a "regular" or "nonsingular" transformation in information theory.

Irreversible heat engine has friction that converts work into heat, and the random thermal motion of molecules caused by friction disturbs the information transmission and loses it. Therefore, the efficiency of irreversible heat engine is less than that of reversible heat engine. Irreversible heat engine has information loss, which is equivalent to "irregular transformer" or "singular transformer" in information theory.
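The efficiency gap between the reversible and the irreversible engine can be shown numerically. In this sketch the reservoir temperatures and the work/heat figures are assumed for illustration only:

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum (reversible) efficiency of a heat engine operating
    between reservoirs at t_hot and t_cold (kelvin): 1 - Tc/Th."""
    return 1.0 - t_cold / t_hot

def measured_efficiency(work_out, heat_in):
    """Efficiency of an actual (irreversible) engine from measured
    work output and heat input (joules)."""
    return work_out / heat_in

T_hot, T_cold = 600.0, 300.0                 # illustrative reservoirs
eta_rev = carnot_efficiency(T_hot, T_cold)
eta_irr = measured_efficiency(work_out=150.0, heat_in=500.0)
print(eta_rev)   # 0.5: the reversible limit, no information loss
print(eta_irr)   # 0.3: friction loses information, and with it efficiency
assert eta_irr < eta_rev
```

In the article's terms, the difference eta_rev - eta_irr measures the information destroyed by friction's "noise."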

V. Information and energy

For a dissipative structure, the input negative entropy is proportional to the input energy. But for an information system there is no such proportionality between input information and input energy. Take a radio or a television set: the information it receives is the carrier signal picked up by the antenna, and the strength of the signal is not proportional to the information the signal carries; nor is the information content of the input signal proportional to the power supplied by the power source. Here thermodynamic negative entropy enters from the power supply, while information-theoretic negative entropy enters from the antenna. The internal ordering of the system, such as the ordering of the picture on the screen or of the audio vibrations of the loudspeaker, can likewise be described by information-theoretic negative entropy. Although this ordering rests on the thermodynamic negative entropy input from the power supply, there is no causal relation between the two; it is the information-theoretic negative entropy input from the antenna that causes the internal ordering of such an information system, just as there is no causal relation between the ordering of human brain activity and eating (the supply of energy, or thermodynamic negative entropy).

The same amount of information can be transmitted with different amounts of energy, and different amounts of information can be transmitted with the same amount of energy. For example, the same telegram sent at different transmitter powers conveys the same information while consuming different energy, and two telegrams carrying different amounts of information can be sent at the same power. Hence, to save energy, actual information systems often use a very small amount of energy to transmit a very large amount of information. This holds for communication systems and automatic control systems in engineering as well as for the living systems produced by nature itself. For example, entering a large number of instructions into an electronic computer consumes very little energy, and compared with the energy consumed by muscle activity itself, the energy the animal nervous system spends directing that activity is negligible.

An information system takes in, through an information input device (such as an antenna), a large amount of information carried by a small amount of energy, and uses it to control the transformation of the large amount of energy provided by the power supply, for example into an ordered image on a television screen. Automatic control systems can usually control even larger flows of energy. This is the principle of controlling large energy with small energy, the principle of the information amplifier.

For a thermodynamic dissipative structure, the internal ordering is caused by a single thermodynamic negative entropy flow, with no separate input of information flow; this is why Prigogine could avoid the concept of information. For the same reason, there is a definite proportionality between the input negative entropy and the input energy; in other words, a thermodynamic dissipative structure has no information amplification mechanism. In an information system, however, the thermodynamic negative entropy flow (e.g. from the power supply) is separated from the information-theoretic negative entropy flow (e.g. the information input through the antenna), and an information amplification mechanism appears. The degree of ordering in the system is caused by the input information flow and bears no causal or proportional relation to the negative entropy input from the power supply.

The essence of the Maxwell's demon problem is the use of information to control the transfer or transformation of energy. In 1929, L. Szilard published the paper on entropy that came to be regarded as a forerunner of Shannon's information theory; in it he proposed that if Maxwell's demon wanted to reduce the entropy of the system it controls, it must pay a price: its own entropy increases. In 1948, Wiener likewise pointed out in his book Cybernetics that "Maxwell's demon must receive information about the speed and position of the particles before they arrive." In the 1950s, Brillouin applied the information-theoretic view of entropy to the problem, pointing out that in order to distinguish the speeds of particles, Maxwell's demon must obtain information from its surroundings, which causes a greater entropy increase in the environment. In other words, Maxwell's demon must obtain negative entropy from the environment at the cost of a larger entropy increase there. In this way the Maxwell's demon problem was finally resolved.

The solution of the Maxwell's demon problem is not only the end of an old problem but the beginning of a new one. Wiener said: "It is easier to repel the question posed by the Maxwell demon than to answer it. Nothing is easier than to deny the possibility of such beings or structures. Strictly speaking, Maxwell's demon cannot exist; but if we accept this from the outset, without demonstration, we miss an admirable opportunity to learn something about entropy and about the possible significance of Maxwell's demon in physics, chemistry, and biology." If we regard ordered systems with negative entropy input as Maxwell's demons in a revised sense, we have a unified concept for studying all negative-entropy open systems, including dissipative structures, information systems, and living systems. The Maxwell's demon with its meaning thus revised does not violate the second law of thermodynamics: it is a rich and magnificent historical drama staged at the cost of the negative entropy supplied by the environment. The second law tells us only that every such drama must sooner or later end; our task is to study each of these dramas and to direct ones of a higher level, carrying more information! Schrödinger's concept of negative entropy, Wiener's cybernetics, Shannon's information theory, and Prigogine's theory of dissipative structures are all magnificent historical dramas on the stage of science. Perhaps a still more magnificent one is waiting for us to direct it!

VI. Negative entropy and cosmology

The Maxwell's demon problem has been solved; what of the heat death of the universe? Dissipative structures, and all other Maxwell's demons in the revised sense, rely on the negative entropy input of the environment to produce order, so this order comes at the cost of a greater entropy increase in the environment. If a dissipative structure and its environment are taken together as one system, that system still undergoes entropy increase. In fact, Prigogine himself never set the task of solving the heat death problem for the study of dissipative structures. Comrade Qian Xuesen said: "Prigogine's theory is very enlightening. It frees us from the suffocating atmosphere of classical thermodynamics; there is no need to summon a Maxwell demon to reduce entropy somewhere." But to read this sentence as saying that Prigogine's theory has solved the heat death problem would be wrong.

Engels said long ago: "Only by pointing out how the heat radiated into space can be used again can we finally solve this problem." Engels also clearly predicted: "The heat radiated into space must be able, in some way (the indication of which will one day be a task of natural science), to transform itself into another form in which it can once more be stored up and become active. The chief difficulty in the way of the transformation of dead suns back into incandescent nebulae thereby disappears." How can radiation scattered through the universe be concentrated again? Is there a mechanism that attracts radiation? Yes: the black hole! A black hole has such strong gravity, and its gravitational field curves the surrounding space so severely, that light cannot escape. Within the gravitational reach of a black hole everything, including radiation and the energy it carries, is accreted. Even the radiation emitted by stars, the 2.7 K cosmic microwave background, and any other waste heat can be absorbed by black holes. This leads to a high concentration of mass and energy in certain regions of the universe. Recent studies suggest that energy gathered in this way may be reactivated and released. For example, the British theoretical physicist S. W. Hawking, combining general relativity, thermodynamics, and quantum mechanics in the theory of black holes, proposed that black holes can emit particles through the quantum-mechanical "tunneling effect" and thereby "evaporate." In its final stage a black hole evaporates very rapidly and ends in a violent explosion. It has also been conjectured that the explosions of black holes may produce new stars and galaxies.

Perhaps black holes have still other ways of releasing energy. In short, the energy gathered in a black hole is not necessarily bound energy; it may be free energy that can be transformed and reactivated. A black hole can take in high-entropy mass-energy from outside (such as the diffuse radiation, or "waste heat," of the universe), while its accretion and mass-energy conversion may turn this into low-entropy mass-energy. In a certain sense, the black hole itself may produce negative entropy without needing a negative entropy flow from outside. The famous popular science writer I. Asimov said: "In a black hole the second law of thermodynamics is reversed, so that although most of the universe is running down, the black hole is gradually reviving."

Black holes can cause local contraction in the universe, but not enough to counter the expansion of the universe as a whole. The expansion of the universe began with the Big Bang, which is usually regarded as the arrow of cosmic time, the origin of entropy increase. Therefore, to settle the heat death problem finally, a mechanism of cosmic contraction must also be found. According to Einstein's theory of gravitation, the future universe may contract: the universe began with the Big Bang, expands to a maximum, and then contracts and even collapses. This prediction was later elaborated by many scholars (R. C. Tolman, A. Arvid, R. P. Geroch, S. W. Hawking, R. Penrose).

There are indications that neutrinos have rest mass. The Big Bang produced a billion times more neutrinos than the total number of particles of all other matter, so even if a neutrino has only a tiny mass, the total mass of neutrinos in the universe would greatly exceed the total mass of all other matter. It is said that, according to measured values of the neutrino rest mass, neutrinos would account for more than 90% of the total mass of the universe. Neutrinos may therefore be the key factor controlling the expansion and contraction of our universe, and some believe that their contribution to the density of the universe may cause the future universe to contract. In this way, the last stronghold of the heat death theory would be breached!