
A Brief Overview of the Second Principle of Thermodynamics

SECOND PRINCIPLE OF THERMODYNAMICS

The second law of thermodynamics offers a precise definition of a property called entropy. Entropy can be considered a measure of how close a system is to equilibrium; it can also be considered a measure of the (spatial and thermal) disorder of the system. The second law states that the entropy, that is, the disorder, of an isolated system can never decrease. Therefore, when an isolated system reaches a configuration of maximum entropy, it can no longer undergo changes: it has reached equilibrium. Nature seems to “prefer” disorder and chaos. It can be shown that the second principle implies that, if no work is done, it is impossible to transfer heat from a region of lower temperature to a region of higher temperature.

Entropy: It is a function of the state of the system, since it has a unique value for each state, regardless of how the system arrived at that state.

ΔS = ΔQ / T

Entropy is an intrinsic property of a thermodynamic system, primarily related to the measurable parameters that characterize it.

dS = dQ / T

dS: entropy variation of the thermodynamic system.

dQ: thermal energy exchanged between the surroundings and the system.

T: temperature at which that exchange of thermal energy is recorded.

This expression allows the calculation of entropy variations, but not the knowledge of absolute values.
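
To make the idea of computing a variation concrete, here is a minimal sketch in Python (the mass, specific heat and temperatures are assumed example values, not from the text): for reversible heating, integrating dS = dQ/T with dQ = m·c·dT gives ΔS = m·c·ln(Tf/Ti).

```python
import math

# Assumed example: 1 kg of water heated reversibly from 293.15 K to 353.15 K
m = 1.0                     # mass, kg
c = 4186.0                  # specific heat of water, J/(kg*K)
T_i, T_f = 293.15, 353.15   # initial and final temperatures, K

# Integrating dS = dQ/T with dQ = m*c*dT gives the variation m*c*ln(T_f/T_i);
# only this difference is defined, never an absolute entropy value
delta_S = m * c * math.log(T_f / T_i)
print(f"Entropy variation: {delta_S:.1f} J/K")  # ~779 J/K
```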

The entropy variation of any thermodynamic system and its environment, considered jointly, is positive, tending to zero in reversible processes.

ΔS total > 0 (irreversible process)

ΔS total = 0 (reversible process)

Entropy as a probability:

Imagine five molecules confined to one region of a container by a separator. After the separator is removed, it is far less likely to find all five molecules back in their region of origin than scattered throughout the container. The system will spend the greater part of its time in the scattered configuration, so state 2 (scattered) has greater entropy than state 1 (confined).
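
A rough numerical sketch of this idea in Python (assuming each molecule independently occupies either half of the container, as in the five-molecule example above):

```python
# Probability that all N molecules are found back on their original side,
# assuming each molecule independently occupies either half of the container
def prob_all_on_one_side(n_molecules: int) -> float:
    return 0.5 ** n_molecules

print(prob_all_on_one_side(5))    # 0.03125 -- rare but observable
print(prob_all_on_one_side(100))  # ~7.9e-31 -- effectively never happens
```

The scattered configuration corresponds to vastly more microstates, which is why the system spends almost all of its time there.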

The equilibrium state of an isolated system is the one with the greatest disorder, since it is the most probable.

The increase in entropy corresponds to an increase in molecular disorder.

The second principle imposes an additional condition on thermodynamic processes: it is not enough that energy is conserved, thereby complying with the first principle. A machine that performed work in violation of the second principle would be a “perpetual motion machine of the second kind,” since it could continuously obtain energy from a cold environment to perform work in a hot environment at no cost.

1st law (system and environment): ΔE system + ΔE environment = 0

1st law (isolated system): ΔE system = 0

2nd law (system and environment): ΔS system + ΔS environment ≥ 0

2nd law (isolated system): ΔS system ≥ 0

When ΔS system = 0, the system is in equilibrium and there are no transformations between the different types of energy.

When ΔS system > 0, the process is out of equilibrium and tending towards equilibrium, always with ΔE system = 0.

This is one of the most important laws of physics; even though it can be formulated in many ways, all formulations lead to the explanation of the concept of irreversibility and that of entropy. This last concept, when treated by other branches of physics, especially by statistical mechanics and information theory, is linked to the degree of disorder of the matter and energy of a system. Thermodynamics, meanwhile, does not offer a physical explanation of entropy, which it associates with the amount of unusable energy in a system. However, this purely phenomenological interpretation of entropy is fully consistent with its statistical interpretations. Thus, water will have more entropy in the gaseous state, with its molecules dispersed and far apart, than the same water in the liquid state, with its molecules closer together and more ordered.

The second principle of thermodynamics dictates that although matter and energy cannot be created or destroyed, they are transformed, and it establishes the direction in which such transformation occurs. However, the crucial point of the second principle is that, as with all thermodynamic theory, it refers only and exclusively to equilibrium states.

Any definition, corollary or concept extracted from it can only be applied to equilibrium states, so that, formally, parameters such as temperature or entropy itself are defined only for equilibrium states. Thus, according to the second principle, when a system passes from an equilibrium state A to another B, the amount of entropy in equilibrium state B will be the maximum possible, and inevitably greater than that of equilibrium state A. Obviously, the system will only do work during the transition from equilibrium state A to B, and not while it is in one of these states. However, if the system is closed, its energy and quantity of matter cannot vary; if the entropy must be maximized in each transition from one equilibrium state to another, and the internal disorder of the system must increase, a natural limit becomes clear: it will cost increasingly more to extract the same amount of work, because according to statistical mechanics the equivalent disorder must increase exponentially.

Applying this concept to a natural phenomenon such as the life of stars: by converting hydrogen, their main fuel, into helium, stars generate light and heat. By fusing the hydrogen nuclei in its interior, a star releases enough energy to shine at that intensity; however, when it comes to fusing helium nuclei, it cannot release the same amount of energy it obtained from fusing hydrogen. Each time the star fuses the nuclei of one element, it obtains another that is less useful for producing energy, and therefore the star eventually dies; in that order of ideas, the matter it leaves behind will no longer serve to form another such star. This is how the second principle of thermodynamics has been used to explain the end of the universe.

Axiomatic definition

The formal definition of the second principle of thermodynamics states that:

In an equilibrium state, the values taken by the characteristic parameters of a closed thermodynamic system are such that they maximize the value of a certain magnitude that is a function of those parameters, called entropy.

The entropy of a system is an abstract physical magnitude that statistical mechanics identifies with the degree of internal molecular disorder of a physical system. Classical thermodynamics, on the other hand, defines it through the relationship between the heat transferred and the temperature at which it is transferred. Axiomatic thermodynamics, in turn, defines entropy as a certain function, a priori of unknown form, which depends on the so-called “characteristic parameters” of the system, and which can only be defined for the equilibrium states of the system.

These characteristic parameters are established from a postulate derived from the first principle of thermodynamics, sometimes called the state principle. According to it, the equilibrium state of a system is completely defined by the internal energy of the system, its volume and its molar composition. Any other thermodynamic parameter, such as temperature or pressure, can be defined as a function of these parameters. Thus, entropy will also be a function of these parameters.

The second principle of thermodynamics states that this entropy can only be defined for states of thermodynamic equilibrium, and that, among all the possible equilibrium states defined by the characteristic parameters, the one actually realized is the one that maximizes the entropy.

The consequences of this statement are subtle: when considering a closed system tending towards equilibrium, the possible equilibrium states include all those that are compatible with the limits or boundaries of the system. Among them is, of course, the initial equilibrium state. If the system changes its equilibrium state from the initial one to another, this is because the entropy of the new state is greater than that of the initial state; if the system changes its equilibrium state, its entropy can only increase. Therefore, the entropy of a thermodynamically isolated system can only increase. Assuming that the universe started from an equilibrium state, that at every moment the universe does not stray too far from thermodynamic equilibrium, and that the universe is an isolated system, the second principle of thermodynamics can be formulated as follows:

The amount of entropy in the universe tends to increase over time.

However, axiomatic thermodynamics does not recognize time as a thermodynamic variable. Formally, entropy can only be defined for equilibrium states, and in the process that goes from one equilibrium state to another there are no equilibrium states, so entropy in these non-equilibrium states cannot be defined without incurring formal inconsistencies within thermodynamics itself. Thus, entropy cannot be a function of time, and talking about variations of it over time is formally incorrect.

When this is done, it is because it has been presupposed that the process from one equilibrium state to another passes through infinitely many intermediate equilibrium states, a procedure that allows time to be introduced as a parameter. As long as the final equilibrium state is the one of maximum possible entropy, no outright inconsistency will have been incurred, because those intermediate equilibrium states have not affected the only real one (the final one).

The classical formulation holds that the change in entropy S is always greater than or equal to (equality holding exclusively for reversible processes) the heat transfer δQ produced divided by the equilibrium temperature T of the system:

dS \geq \frac{\delta Q}{T}

General description

The axiomatic statement of the second principle immediately reveals its main characteristic: it is one of the few ontological laws of physics, in that it distinguishes, in a general way, those processes and physical states that are possible from those that are not; that is, the second principle allows one to determine the possibility of a process or state. In fact, in a historical sense, the second principle arose in the middle of the Industrial Revolution, in the context of thermal machines, as an empirical explanation of why they behaved in one way and not in another. Indeed, although it seems trivial, it was always observed, for example, that to heat a boiler it was necessary to use fuel burning at a higher temperature than that of the boiler; it was never observed that the boiler heated up by taking energy from its surroundings, which would cool down in turn. In fact, it could be reasoned that, by virtue of the first principle of thermodynamics, nothing prevents heat from spontaneously being extracted from a cold body, for example at 200 K, and transmitted to a hot one, for example at 1000 K: it is enough that the corresponding energy balance is satisfied, as a result of which the cold body would become even colder and the hot one even hotter. Yet all this is contrary to all experience; and although it seems commonplace and even trivial, it had an extraordinary impact on the machines of the Industrial Revolution: if it were not so, machines could operate without needing fuel, since the necessary energy could be transferred spontaneously from the rest of the environment. However, thermal machines seemed to obey a certain law, which materialized in the second principle: to produce mechanical work, it was necessary to supply additional energy (fuel), which in turn was always greater than the amount of work extracted. The concept of the thermal machine thus appears closely linked to the initial statement of the second principle.
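
The 200 K / 1000 K example above can be checked directly against the entropy balance. A small Python sketch (the amount of heat Q is an assumed value; energy is conserved in both directions, so only entropy decides which one occurs):

```python
def total_entropy_change(q: float, t_source: float, t_sink: float) -> float:
    """Total entropy change of two reservoirs when heat q flows from source to sink."""
    return -q / t_source + q / t_sink

Q = 1000.0  # joules transferred (assumed value; the first law allows either direction)

# Cold (200 K) -> hot (1000 K): total entropy would decrease -> forbidden
print(total_entropy_change(Q, t_source=200.0, t_sink=1000.0))   # -4.0 J/K

# Hot (1000 K) -> cold (200 K): total entropy increases -> allowed
print(total_entropy_change(Q, t_source=1000.0, t_sink=200.0))   # +4.0 J/K
```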

A thermal machine is one that provides useful work thanks to the difference in temperature between two bodies. Since any thermodynamic machine requires a temperature difference, it follows that no useful work can be extracted from an isolated system in thermal equilibrium; an external energy supply will be required. This empirical principle, extracted from the continuous observation of how the universe works, constitutes one of the first statements of the Second Principle of Thermodynamics: any cyclic process whose only result is the absorption of energy in the form of heat from a single thermal source (or reservoir, or thermal deposit) and the conversion of all this heat into work is impossible.
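
A quantitative consequence of this statement is the Carnot limit on any machine working between two reservoirs; here is a minimal Python sketch (the temperatures are illustrative, not from the text):

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Upper bound on the fraction of absorbed heat convertible into work."""
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(1000.0, 200.0))  # 0.8 -- large temperature difference
print(carnot_efficiency(300.0, 300.0))   # 0.0 -- at equilibrium, no work at all
```

The second line is the statement above in numbers: with no temperature difference, no heat can be converted into work.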

Classic statements

The second law of thermodynamics has been expressed in many different ways. Briefly, classical thermodynamics has expressed it like this:

“A process whose only result is the transfer of energy in the form of heat from a lower temperature body to a higher temperature one is impossible.”

Clausius statement.

“Any cyclic process whose only result is the absorption of energy in the form of heat from a single thermal source (or reservoir, or thermal deposit) and the conversion of all this heat into energy in the form of work is impossible.”

Kelvin-Planck statement.

“For any potentially cyclic system, a single heat transfer such that the process is reciprocal and eventually reversible is impossible.”

Statement of John De Saint.

Some corollaries of the principle, sometimes used as alternative statements, would be:

“No cyclic process is such that the system in which it occurs and its environment can both return simultaneously to the same state from which they started.”

“In an isolated system, no process can occur if a decrease in the total entropy of the system is associated with it.”

Corollary of the principle, due to Clausius.

Visually, the second principle can be expressed by imagining a steamship's boiler: it could not produce work if the steam were not at a high temperature and pressure compared with the surrounding environment.

Mathematically, it is expressed like this:

\frac{dS}{dt}\geq 0 \qquad \mbox{(1)}

where S is entropy, and equality holds only when entropy has reached its maximum value (at equilibrium).

Entropy in statistical mechanics

Thermodynamics offers no physical interpretation of what entropy is: it simply defines it as a mathematical function that takes its maximum value at each equilibrium state. The usual identification of entropy with molecular disorder comes from a very simplified interpretation of the results of statistical mechanics, specifically the so-called microcanonical formalism. It is important to emphasize that thermodynamics and statistical mechanics, although related, are separate branches of physics.

Microcanonical interpretation of entropy based on the second principle of thermodynamics

The fundamental equation of a closed thermodynamic system in equilibrium can be expressed as

S = S(U, V, N_1, N_2, \ldots, N_r)

where S represents the entropy of the system (from a thermodynamic point of view), U the internal energy of the system, V its volume, and N1, N2, etc. the number of moles of each component of the system. All these magnitudes are macroscopic, in the sense that they are expressed, measured and calculated without considering the microscopic nature of the system's constituents (that is, its atoms, molecules, etc.).

Intuitively, it may seem reasonable to assume that if the system is in equilibrium, then its most fundamental components, its atoms and molecules, are also in equilibrium. However, a fundamental result of quantum mechanics states that if the system is macroscopic, there can be many discrete quantum states of its atoms and molecules that, globally, are compatible with the values of U, V and N1, N2, … of the macroscopic system. In principle, however, even if the microscopic components of the system have that potential capacity to move from one quantum state to another, since the system is closed and in equilibrium it could be reasoned that such transitions are not going to occur.

In reality, however, there is no perfectly isolated system. Even if we are able to thermally isolate the system in an absolute manner, we cannot avoid the gravitational effects that the rest of the universe will continue to exert on the matter enclosed inside; nor can the system be perfectly shielded from all the electromagnetic fields that surround it, however weak they may be. In short, the system may be closed for macroscopic purposes, but the action of all kinds of force fields (gravitational, electrical, …) and the system's own interaction with the walls that enclose it mean that, at least from a microscopic point of view, the system is not in equilibrium: its atoms and molecules are subject to continuous transitions from one quantum state to another, whose causes are, for all practical purposes, merely random, so many and so indefinite can they be.

Statistical mechanics considers that a macroscopic system makes extremely fast and totally random transitions between the different quantum states that are possible, so that macroscopic measurements of parameters such as temperature, energy, or even volume are actually averages over myriads of quantum or microscopic states. And since such transitions are produced by essentially random processes, it is accepted as a principle that a macroscopic system visits all permissible microscopic states with equal probability. These permissible microscopic states are called microstates.

For each macroscopic equilibrium state, the number of allowed microstates is determined by the laws of physics. For example, if a macroscopic system has 1000 joules of energy, it is absurd to suppose that one of its microstates could have more than 1000 joules of energy.

If a macroscopic equilibrium state is considered, according to the second principle of thermodynamics it will be fully defined by the values of the thermodynamic variables U, V, N1, N2, etc. for which the entropy S takes its maximum value among all possible ones. Suppose we have a thermodynamic system in equilibrium that is defined by a fundamental limitation: the system is not allowed to have a volume greater than a specific one, and the amount of matter in the system is what was given at the beginning. For example, gas in a gas cylinder: it cannot have a volume greater than that of the cylinder, nor can there be more gas than was placed inside. Subject to this limitation of volume and mass, the system will acquire the values of U that maximize entropy, and then it will have reached macroscopic equilibrium. Associated with this macroscopic equilibrium state is a picture in terms of microstates: the molecules of the system may undergo random transitions between different microstates within the limits imposed by the system itself. They cannot, for example, move beyond the system's barriers, nor vibrate with an energy greater than the total energy of the macroscopic system, etc. That is, associated with the macroscopic equilibrium there is a limited, though possibly immense, number of microstates that the microscopic constituents of the system can visit with equal probability.

If we now remove a restriction from the macroscopic system, such as allowing the volume to be greater than before, two things will happen:

From the point of view of thermodynamics, that is, from the macroscopic point of view, the system's variables will evolve towards a state of greater entropy: the volume V is now greater than before, and although the amount of matter is the same, it can now occupy more volume. Thus, the internal energy of the system U will vary so that, in the new equilibrium state, the entropy S takes on the maximum possible value. This value is necessarily greater than that of the prior equilibrium state. In fact, we can conceive of the situation in which, although it could expand, the system remains at its previous volume, with the same internal energy and the same matter. In that case the entropy would not have changed, and that case is compatible with the system's limits. However, we know that nature does not operate like this: the system will tend to occupy the entire volume (even if it is a solid, in which case its vapor pressure will change, more solid will evaporate, etc.), and the equilibrium will shift. The entropy function is that mathematical function that takes its maximum value in that new equilibrium, and it must therefore be greater than in the previous equilibrium state.

From the microscopic point of view, the number of microstates compatible with the limits of the system has now increased. In effect, we still have the same ones as before, but new ones are added to them. For example, an atom can now move not only within the previous volume, but within the entire new volume.

Thus, as the entropy increases, so does the number of possible microstates. This suggests that entropy can be identified with the number of microstates consistent with the macroscopic limitations imposed on the system. Since the microstates are the product of chance, and since each of them is equally probable, it is natural to identify entropy with microscopic disorder.

There is only one problem: according to thermodynamics, entropy is additive. That is, the entropy of two equal systems is twice the individual entropy of each of them. However, the number of possible microstates is multiplicative: the number of microstates of two systems is the product of the number of microstates of each of them. For example, the number of “microstates” of two dice, if each die has 6 (each face of a die is a possible microstate), is 6 × 6 = 36 microstates (a “1” on the first and a “3” on the second; a “2” on the first and a “5” on the second, etc.). To interpret entropy, we need the number of microstates to comply with an additive rule.

The only solution is to identify entropy with the logarithm of the number of possible microstates. Calling Ω the number of microstates and S the entropy, we can write:

S = k_B \ln \Omega

where kB is the Boltzmann constant, which appears simply to set the scale of entropy, usually given as energy per unit temperature (J/K), although under this interpretation entropy could equally lack units.
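
The dice example and the product-to-sum property of the logarithm can be checked in a few lines of Python:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega: int) -> float:
    """S = k_B ln(Omega): entropy from the number of microstates."""
    return k_B * math.log(omega)

# Microstate counts multiply for independent systems (two dice: 6 * 6 = 36)...
omega_a, omega_b = 6, 6
# ...but entropies add, because ln(a * b) = ln(a) + ln(b)
assert math.isclose(boltzmann_entropy(omega_a * omega_b),
                    boltzmann_entropy(omega_a) + boltzmann_entropy(omega_b))
```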

Canonical interpretation

The microcanonical interpretation of entropy conceives of an isolated thermodynamic system, that is, a thermodynamic system that exchanges neither matter nor energy nor volume with the outside: the composition of the system, given by N1, N2, …, its internal energy U and its volume V do not change. The system par excellence that meets these conditions is the universe itself. However, in many cases one considers systems that do exchange energy, mass or volume with their surroundings.

For these cases, it is necessary to extend the statistical interpretations of entropy, although globally it is the microcanonical interpretation that endures. Indeed, if we consider a system that, for example, exchanges matter with its surroundings, we can conceive of a larger system that includes the initial system and its environment, so that the global system fits the microcanonical interpretation; in the limit, this global system is the universe itself. And it is precisely the entropy of the microcanonical system that is subject to the second principle of thermodynamics, that is, the one that must increase as the overall equilibrium of the system changes.

One could then think that any system, whatever its conditions of exchange with the environment, can be treated by conceiving the global system subject to the microcanonical interpretation. Indeed, in principle its equilibrium state could be obtained simply by counting the total number of microstates of the global system. This, however, can be very costly if not practically impossible to estimate in most circumstances: the combinatorial calculation of the number of ways in which the available energy can be distributed in a system usually defies all practical computation. It is to remedy these shortcomings that the other interpretations of entropy arise.

The canonical interpretation, sometimes called the canonical or Helmholtz formalism, considers a thermodynamic system capable of exchanging energy with a thermal reservoir or thermostat. Since the reservoir is in effect an infinite source of energy, every energy state, from the lowest to the highest, becomes conceivable for the system. However, in contrast to the microcanonical system, the probabilities of these states are not all equal: the system does not spend the same fraction of time in each of them. The central point of the canonical formalism is to determine the probability distribution of the microstates, and this problem is solved by noting that the global system formed by the thermostat and the system of interest is a closed system, which therefore fulfills the microcanonical condition that all of its global microstates are equally probable.

If the total energy of the global system is Etot, and the local system is in a microstate of energy Ej, the thermostat is inevitably left with energy Etot − Ej. The probability Pj that the global system is in a microstate in which the thermostat has energy Etot − Ej and the local system has energy Ej is then:

P_j = \frac{ \Omega_{\text{thermostat}} (E_{tot} - E_j) }{ \Omega_{tot}(E_{tot}) }

Following the definition of entropy according to Boltzmann, this equation can be written as:

P_j = \frac{ e^{ S_{\text{thermostat}}(E_{tot} - E_j) / k_B } }{ e^{ S_{tot}(E_{tot}) / k_B } }

The internal energy U will be the average energy of the local system, so, since entropy is additive, it can be written that:

S_{tot}(E_{tot}) = S(U) + S_{\text{thermostat}}(E_{tot} - U)

Expanding S_{\text{thermostat}}(E_{tot} - E_j) in a Taylor series around E_{tot} - U, we obtain:

S_{\text{thermostat}}(E_{tot} - E_j) = S_{\text{thermostat}}(E_{tot} - U + U - E_j) = S_{\text{thermostat}}(E_{tot} - U) + \frac{U - E_j}{T}

In this way, the probability can be expressed as:

P_j = \frac{ e^{ \frac{ U - T S(U) }{ k_B T } } }{ e^{ \frac{ E_j }{ k_B T } } }

And since F = U - T S(U) is the Helmholtz free energy, we can express this probability as:

P_j = e^{ \frac{ F - E_j }{ k_B T } }

The total probability of being in any of these states is unity, so:

\sum_j P_j = e^{ \frac{F}{k_B T} } \sum_j e^{ -\frac{E_j}{k_B T} } = 1

from which the quantity Z is defined. Z is the so-called canonical partition function, generally defined as:

Z = \sum_j e^{ -\frac{E_j}{k_B T} } = e^{ -\frac{F}{k_B T} }

If the partition function Z of a system in thermal equilibrium is known, the entropy can be calculated from the probabilities of the microstates as:

S = -k_B \sum_j P_j \ln P_j , \qquad P_j = \frac{ e^{ -E_j / (k_B T) } }{ Z }

where kB is the Boltzmann constant, T the temperature, and Pj the probabilities of the microstates.

This is the interpretation of entropy known as the canonical interpretation, or Helmholtz entropy.
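
As a minimal sketch of this formalism, assuming a two-level system with energies 0 and 10⁻²¹ J at 300 K (both values invented for illustration), the partition function yields the Boltzmann probabilities and, from them, the entropy:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed temperature, K
energies = [0.0, 1.0e-21]   # assumed two-level system, energies in joules

# Canonical partition function: Z = sum over microstates of exp(-E_j / (k_B T))
Z = sum(math.exp(-E / (k_B * T)) for E in energies)

# Boltzmann probabilities P_j = exp(-E_j / (k_B T)) / Z; they sum to one
probs = [math.exp(-E / (k_B * T)) / Z for E in energies]
assert math.isclose(sum(probs), 1.0)

# Entropy from the probabilities: S = -k_B * sum of P_j ln(P_j)
S = -k_B * sum(p * math.log(p) for p in probs)
print(f"Z = {Z:.4f}, S = {S:.3e} J/K")
```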

Von Neumann entropy in quantum mechanics

In the nineteenth century the concept of entropy was applied to systems formed by many particles that behave classically; at the beginning of the twentieth century, von Neumann generalized the concept of entropy to quantum particle systems, defining, for a mixed state characterized by a density matrix ρ, the von Neumann quantum entropy as the scalar magnitude:

S = -k_B \text{Tr}(\rho \ln \rho)
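
A small numerical sketch of this definition (Python with NumPy; the density matrices are assumed single-qubit examples, and entropy is reported in units of k_B): since ρ is Hermitian, S reduces to a sum of −λᵢ ln λᵢ over the eigenvalues λᵢ of ρ.

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho (units of k_B)."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]   # 0 * ln(0) contributes nothing
    return float(-np.sum(eigvals * np.log(eigvals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state: zero entropy
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit
print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # ln 2 ~ 0.693
```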

Generalized entropy in General Relativity

The attempt to extend conventional thermodynamic analysis to the entire universe led, in the early 1970s, to an examination of the thermodynamic behavior of structures such as black holes. The preliminary result of this analysis revealed something very interesting: the second law, as conventionally formulated for classical and quantum systems, could be violated in the presence of black holes. However, the work of Jacob D. Bekenstein on information theory and black holes suggested that the second law would remain valid if a generalized entropy (Sgen) was introduced, adding to the conventional entropy (Sconv) the entropy attributable to black holes, which depends on the total horizon area (A) of the black holes in the universe. Specifically, this generalized entropy is defined as:

S_{gen} = S_{conv} + \frac{k c^3}{4 G \hbar} A

where k is the Boltzmann constant, c is the speed of light, G is the universal gravitational constant and ħ is the reduced Planck constant.
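
To get a feel for the scale of the black-hole term, here is a sketch in Python (the solar mass value is an assumed input, and the horizon area is taken as that of a Schwarzschild black hole, A = 4πr_s² with r_s = 2GM/c²):

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3/(kg s^2)
hbar = 1.054571817e-34  # reduced Planck constant, J*s

def black_hole_entropy(mass_kg: float) -> float:
    """Black-hole term k c^3 A / (4 G hbar) for a Schwarzschild horizon."""
    r_s = 2 * G * mass_kg / c**2   # Schwarzschild radius
    area = 4 * math.pi * r_s**2    # horizon area A
    return k_B * c**3 * area / (4 * G * hbar)

M_sun = 1.989e30  # kg, assumed solar mass
print(f"{black_hole_entropy(M_sun):.2e} J/K")  # ~1.4e54 J/K, an enormous value
```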

THERMODYNAMICS AND THE CONCEPT OF ENTROPY

Definition of some concepts.

Thermodynamics, to define it in a very simple way, fixes its attention on the interior of physical systems and on the exchanges of energy in the form of heat that take place between one system and another. The macroscopic magnitudes that relate to the internal state of a system are called thermodynamic coordinates; these help us determine the internal energy of the system. In summary, the ultimate goal of thermodynamics is to find, among the thermodynamic coordinates, general relationships consistent with the basic principles of physics (recall the principle of energy conservation discussed in issue 3 of “Social Horizon”).

Thermodynamics bases its analysis on a few laws: the “Zero” Law, referring to the concept of temperature; the First Law of Thermodynamics, which tells us about the principle of conservation of energy; and the Second Law of Thermodynamics, which defines entropy. Next we will discuss each of these laws, emphasizing the second law and the concept of entropy.

The Zero Law

The Zero Law of Thermodynamics tells us that if we have two bodies, A and B, at different temperatures and we put them in contact, after a given time t they will reach the same temperature. If a third body, C, is then placed in contact with A and B, it will also reach that temperature and, therefore, A, B and C will all have the same temperature while in contact.

From this principle we can infer the concept of temperature, a condition of each body that man has learned to measure using arbitrary reference systems and scales (thermometric scales).

The First Law

The First Law of Thermodynamics refers to the concepts of internal energy, work and heat. It tells us that if work is done on a system with a certain internal energy through some process, the internal energy of the system will vary. The difference between the change in the internal energy of the system and the amount of work done is what we call heat: heat is the energy transferred to the system by non-mechanical means. Suppose our system is a metal container with water; we can raise the temperature of the water by friction with a spoon or by direct heating over a flame. In the first case we are doing work on the system, and in the second we are transmitting heat to it.

It should be clarified that the internal energy of a system, work and heat are nothing more than different manifestations of energy. That is why energy is neither created nor destroyed; during a process it only transforms between its various manifestations.
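
A trivial bookkeeping sketch of the first law in Python, using the metal container of water from the paragraph above (the sign convention assumed here is that heat added to the system and work done on the system both increase its internal energy; the joule figures are invented):

```python
def internal_energy_change(heat_in: float, work_on_system: float) -> float:
    """First law: dU = Q + W, heat added plus work done on the system (joules)."""
    return heat_in + work_on_system

# Heating the water over a flame: energy enters as heat, no mechanical work
print(internal_energy_change(heat_in=500.0, work_on_system=0.0))   # 500.0 J

# Stirring the same water with a spoon: energy enters as work, no heat
print(internal_energy_change(heat_in=0.0, work_on_system=500.0))   # 500.0 J
```

Either way the internal energy rises by the same amount: work and heat are just different manifestations of the energy transferred.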

The Second Law

Finally, let us look at the content of the second law of thermodynamics. In more or less simple terms, it says the following: “There is no process whose only result is the absorption of heat from a source and the complete transformation of this heat into work.” This principle (the Kelvin-Planck principle) was born from the study of the performance of machines and their technological improvement. If this principle were not true, a thermal power plant could be operated by taking heat from the environment; apparently there would be no contradiction, because the environment contains a certain amount of internal energy. But we must point out two things: first, the second law of thermodynamics is not a consequence of the first, but an independent law; second, the second law tells us about the restrictions that exist when using energy in different processes, in our case in a thermal power plant. There is no machine that can run on internal energy drawn from a single heat source.

The concept of entropy was first introduced by Rudolf Clausius in the mid-19th century. Clausius, a German physicist, also formulated a statement of the Second Law: “No process is possible whose only result is the transfer of heat from a cold body to a hotter body.” Based on this principle, Clausius introduced the concept of entropy, which is a measure of the number of restrictions that exist for a process to take place, and which also determines the direction of that process. Let us now discuss the three most important meanings of the word entropy.

Entropy, disorder and degree of organization

Let’s imagine that we have a box with three divisions; in each division there is a different type of marble: blue, yellow and red, respectively. The divisions are movable, so I decide to remove the first one, the one that separates the blue marbles from the yellow ones. From the point of view of entropy, what I am doing is removing a degree, or index, of restriction from my system. Before I removed the first division, the marbles were separated and arranged by color: blue in the first division, yellow in the second and red in the third; they were restricted to a certain order.

By removing the second division, I am removing another degree of restriction. The marbles have mixed with one another in such a way that I can no longer keep them ordered, because the barriers that restricted them have been removed.

The entropy of this system has increased as the restrictions were removed: initially there was an established order, and at the end of the process (the process here being the removal of the divisions from the box) there is no order within the box.
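
The increase can be made quantitative by counting arrangements in Python (assuming, as an idealization, nine positions with three marbles of each color, marbles of the same color being indistinguishable):

```python
from math import factorial, log

# With the divisions in place, each color is confined to its own compartment:
# indistinguishable same-colored marbles admit exactly one arrangement
omega_restricted = 1

# With the divisions removed, the 9 marbles (3 per color) can be arranged in
# any order: the multinomial coefficient 9! / (3! * 3! * 3!)
omega_unrestricted = factorial(9) // (factorial(3) ** 3)

print(omega_unrestricted)                           # 1680 arrangements
print(log(omega_unrestricted / omega_restricted))   # rises by ln(1680) ~ 7.4 (units of k_B)
```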

Entropy is, in this case, a measure of the order (or disorder) of a system, or of the lack of degrees of restriction. The way to use it is to measure it in our initial system, that is, before removing any restriction, and to measure it again at the end of the process that the system underwent.

It is important to note that entropy is not defined as an absolute quantity S (the symbol for entropy); what can be measured is the difference between the initial entropy of a system, Si, and its final entropy, Sf. It makes no sense to speak of entropy except in terms of a change in the conditions of a system.

Entropy, reversible processes and irreversible processes

Returning to the previous example of the box with separations and marbles, we will explain what a reversible process is and what a non-reversible process is.

We call a process reversible if it can be reversed, leaving our system in the same initial conditions. Considering our box without the separations, the marbles are mixed together, that is, without any order. If the process of removing the divisions were reversible, the marbles would have to spontaneously arrange themselves back into blue, yellow and red, according to the order of the divisions. This will not happen.

The process we carried out with our box of marbles was a non-reversible (irreversible) process: once finished, the order that existed in the initial conditions of the system will never be established again. The study of these types of processes is important because in nature all processes are irreversible.

Entropy and “spent” energy

In the principle enunciated by Clausius that we mentioned earlier, we can find the relationship between entropy and the energy released in a process. Think of an engine: it needs a source of energy to turn into work. If we think of a car, the gasoline, together with the engine's spark system, provides the combustion (chemical) energy capable of making the car move. What does entropy have to do with this?

The energy that the car “used” to perform work and move was “spent,” that is, released through a chemical process, and it is no longer usable for an engine to produce work.

This is one of the most difficult concepts of entropy to grasp, as it requires somewhat less trivial knowledge of the operation of engines, refrigerators and the Carnot cycle. But for our purposes this explanation is enough.

What is entropy for?

Entropy, as a measure of the degree of restriction, as a measure of the disorder of a system, or, in engineering, as an auxiliary concept in problems of the energy performance of machines, is one of the most important thermodynamic variables. Its relationship with chaos theory opens a new field of study and research around this “much-handled” concept.
