
Showing papers in "Entropy in 2003"


Journal ArticleDOI
03 Jul 2003-Entropy
TL;DR: Relations between chirality, symmetry, and other concepts such as similarity, disorder and entropy, are discussed.
Abstract: Many quantitative measures of the degree of chirality or symmetry of a set have been proposed in the literature. The main approaches from various areas are reviewed: chemistry, physics, mathematics, computer sciences, biology, and psychophysics. Relations between chirality, symmetry, and other concepts such as similarity, disorder and entropy, are discussed.

119 citations


Journal ArticleDOI
05 Feb 2003-Entropy
TL;DR: The Second Law of Thermodynamics (the law of "increasing disorder") is fundamental to all the sciences and is considered inviolable by the general scientific community.
Abstract: The Second Law of Thermodynamics (the law of "increasing disorder") is fundamental to all the sciences and is considered inviolable by the general scientific community.[...]

55 citations


Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: It is shown that this extensive generalized statistics can be useful for correlated electron systems in the weak-coupling regime.
Abstract: Statistical mechanics is generalized on the basis of an additive information theory for incomplete probability distributions. The incomplete normalization is used to obtain a generalized entropy. The concomitant incomplete statistical mechanics is applied to some physical systems in order to show the effect of the incompleteness of information. It is shown that this extensive generalized statistics can be useful for correlated electron systems in the weak-coupling regime.

49 citations


Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: The entropy generation during transient laminar natural convection in a square enclosure is numerically investigated, and it is found that the active sites in the completely heated case are at the left bottom corner of the heated wall and the right top corner of the cooled wall, at the same magnitudes.
Abstract: The entropy generation during transient laminar natural convection in a square enclosure is numerically investigated. Two different cases are considered. The enclosure is heated either completely or partially from the left side wall and cooled from the opposite side wall. The bottom and the top of the enclosure are assumed to be insulated. The Boussinesq approximation is used in the natural convection modelling. The solutions start from quiescent conditions and proceed through the transient up to the steady state. The calculations are made for Prandtl numbers of 0.01 and 1.0 and Rayleigh numbers between 10^2 and 10^8. The entropy generation and the active sites triggering the entropy generation are obtained for each case after the flow and thermal characteristics are determined. It is found that the active sites in the completely heated case are at the left bottom corner of the heated wall and the right top corner of the cooled wall, at the same magnitudes. In the case of partial heating, however, the active site is observed at the top corner of the heated section, especially at lower Pr and Ra values.
Keywords: Enclosure, laminar natural convection, entropy generation

47 citations


Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: A two-dimensional numerical analysis of entropy generation during transient convective heat transfer for laminar flow between two parallel plates is presented, indicating that the entropy generation number has its highest value at the highest Reynolds and Br/Ω values, which is obtained for counter motion of the lower plate.
Abstract: A two-dimensional numerical analysis of entropy generation during transient convective heat transfer for laminar flow between two parallel plates has been carried out. The fluid is incompressible and Newtonian, and the flow is hydrodynamically and thermally developing. The plates are held at constant equal temperatures higher than that of the fluid. The bottom plate moves either parallel or counter to the flow direction. The governing equations of the transient convective heat transfer are written in two-dimensional Cartesian coordinates and solved by the finite volume method with the SIMPLE algorithm. The solutions are carried out for Reynolds numbers of 10^2, 5×10^2 and 10^3 and a Prandtl number of 1. After the flow field and the temperature distributions are obtained, the entropy values and the sites initiating the entropy generation are investigated. The results indicate that the entropy generation number has its highest value at the highest Reynolds and Br/Ω values, which is obtained for counter motion of the lower plate. The lowest average entropy generation number on the bottom plate is obtained for parallel motion. The corners of the channel plates at the entrance play the role of active sites where the generation of entropy is triggered.

46 citations


Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: A numerical solution to the entropy generation in a circular pipe is made by integrating over the various cross-sections as well as over the entire volume of the fluid as it flows through the pipe.
Abstract: A numerical solution for the entropy generation in a circular pipe is presented. Radial and axial variations are considered. The Navier-Stokes equations in cylindrical coordinates are used to solve for the velocity and temperature fields. A uniform wall heat flux is taken as the thermal boundary condition. The distribution of the entropy generation rate is investigated throughout the volume of the fluid as it flows through the pipe. Engine oil is selected as the working fluid. In addition, water and Freon are used in a parametric study. The total entropy generation rate is calculated by integration over the various cross-sections as well as over the entire volume.

45 citations


Journal ArticleDOI
26 Jun 2003-Entropy
TL;DR: How to tighten the relative terms in the characterizations of 'information system' and 'informational hierarchy' is addressed, how to distinguish between components of an information system combining to form more complex informational modules and the expression of information, and why information expression in such systems is not sufficient for functional information.
Abstract: A system of a number of relatively stable units that can combine more or less freely to form somewhat less stable structures has a capacity to carry information in a more or less arbitrary way. I call such a system a physical information system if its properties are dynamically specified. All physical information systems have certain general dynamical properties. DNA can form such a system, but so can, to a lesser degree, RNA, proteins, cells and cellular subsystems, various immune system elements, organisms in populations and in ecosystems, as well as other higher-level phenomena. These systems are hierarchical structures with respect to the expression of lower level information at higher levels. This allows a distinction between macro and microstates within the system, with resulting statistical (entropy driven) dynamics, including the possibility of self-organization, system bifurcation, and the formation of higher levels of information expression. Although lower-level information is expressed in an information hierarchy, this in itself is not sufficient for reference, function, or meaning. Nonetheless, the expression of information is central to the realization of all of these. 'Biological information' is thus ambiguous between syntactic information in a hierarchical modular system, and functional information. However, the dynamics of hierarchical physical information systems is of interest to the study of how functional information might be embodied physically. 
I will address 1) how to tighten the relative terms in the characterizations of 'information system' and 'informational hierarchy' above, 2) how to distinguish between components of an information system combining to form more complex informational modules and the expression of information, 3) some aspects of the dynamics of such systems that are of biological interest, 4) why information expression in such systems is not sufficient for functional information, and 5) what further might be required for functional information.

40 citations


Journal ArticleDOI
05 Feb 2003-Entropy
TL;DR: The meaning of “information” is introduced in a way that is detached from human technological systems and related algorithms and semantics, and that is not based on any mathematical formula.
Abstract: In this article we address some fundamental questions concerning information: Can the existing laws of physics adequately deal with the most striking property of information, namely to cause specific changes in the structure and energy flows of a complex system, without the information in itself representing fields, forces or energy in any of their characteristic forms? Or is information irreducible to the laws of physics and chemistry? Are information and complexity related concepts? Does the Universe, in its evolution, constantly generate new information? Or are information and information-processing exclusive attributes of living systems, related to the very definition of life? If that were the case, what happens with the physical meanings of entropy in statistical mechanics or wave function in quantum mechanics? How many distinct classes of information and information processing do exist in the biological world? How does information appear in Darwinian evolution? Does the human brain have unique properties or capabilities in terms of information processing? In what ways does information processing bring about human self-consciousness? We shall introduce the meaning of "information" in a way that is detached from human technological systems and related algorithms and semantics, and that is not based on any mathematical formula. To accomplish this we turn to the concept of interaction as the basic departing point, and identify two fundamentally different classes, with information and information-processing appearing as the key discriminator: force-field driven interactions between elementary particles and ensembles of particles in the macroscopic physical domain, and information-based interactions between certain kinds of complex systems that form the biological domain. 
We shall show that in an abiotic world, information plays no role; physical interactions just happen, they are driven by energy exchange between the interacting parts and do not require any operations of information processing. Information only enters the non-living physical world when a living thing interacts with it, and when a scientist extracts information through observation and measurement. But for living organisms, information is the very essence of their existence: to maintain a long-term state of unstable thermodynamic equilibrium with its surroundings, to consistently increase its organization and to reproduce, an organism has to rely on information-based interactions in which form or pattern, not energy, is the controlling factor. This latter class comprises biomolecular information processes controlling the metabolism, growth, multiplication and differentiation of cells, and neural information processes controlling animal behavior and intelligence. The only way new information can appear is through the process of biological evolution and, in the short term, through sensory acquisition and the manipulation of images in the nervous system. Non-living informational systems such as books, computers, AI systems and other artifacts, as well as living organisms that are the result of breeding or cloning, are planned by human beings and will not be considered here.

39 citations


Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: It is proposed here to clarify some of the relations existing between information and meaning by showing how meaningful information can be generated by a system submitted to a constraint.
Abstract: We propose here to clarify some of the relations existing between information and meaning by showing how meaningful information can be generated by a system submitted to a constraint. We build up definitions and properties for meaningful information, a meaning generator system and the domain of efficiency of a meaning (to cover cases of meaningful information transmission). Basic notions of information processing are used.

37 citations


Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new information approach, which is called the general theory of information.
Abstract: A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new information approach, which is called the general theory of information. The main achievement of the general theory of information is the explication of a relevant and adequate definition of information. This theory is built as a system of two classes of principles (ontological and axiological) and their consequences. Axiological principles, which explain how to measure and evaluate information and information processes, are presented in the second section of this paper. These principles systematize and unify different approaches, existing as well as possible, to the construction and utilization of information measures. Examples of such measures are given by Shannon's quantity of information, the algorithmic quantity of information, and the volume of information. It is demonstrated that all other known directions of information theory may be treated within the general theory of information as particular cases.

31 citations


Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: An ecological optimization along with a detailed parametric study of an irreversible regenerative Brayton heat engine with isothermal heat addition and the effects of turbine efficiency are found to be more than those of the compressor efficiency on all the performance parameters of the cycle.
Abstract: An ecological optimization, along with a detailed parametric study, of an irreversible regenerative Brayton heat engine with isothermal heat addition has been carried out with external as well as internal irreversibilities. The ecological function is defined as the power output minus the power loss (irreversibility), which is the ambient temperature times the entropy generation rate. The external irreversibility is due to the finite temperature difference between the heat engine and the external reservoirs, while the internal irreversibilities are due to the nonisentropic compression and expansion processes in the compressor and the turbine, respectively, and the regenerative heat loss. The ecological function is found to be an increasing function of the isothermal-, sink- and regenerative-side effectiveness, isothermal-side inlet temperature, component efficiencies and sink-side temperature, while it is found to be a decreasing function of the isobaric-side temperature and effectiveness and the working fluid heat capacitance rate. The effects of the isobaric-side effectiveness are found to be greater than those of the other parameters, and the effects of turbine efficiency are found to be greater than those of the compressor efficiency on all the performance parameters of the cycle.

Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: This article studies laughter just as an auditive signal (a 'neutral' information carrier), compares its structure with the regular traits of linguistic signals, and numerically gauges the disorder content of laughter frequencies.
Abstract: Laughter is one of the most characteristic -and enigmatic- communicational traits of human individuals. Its analysis has to take into account a variety of emotional, social, cognitive, and communicational factors that are densely interconnected. In this article we study laughter just as an auditive signal (as a 'neutral' information carrier), and we compare its structure with the regular traits of linguistic signals. In the experimental records of human laughter that we have performed, the most noticeable trait is the disorder content of frequencies. Whereas the information content of vowel sonograms appears as a characteristic, regular function of the first vibration modes of the dynamic system formed, for each vowel, by the vocal cords and the accompanying resonance of the vocalization apparatus, the sonograms of laughter are highly irregular. The episodes of laughter show a highly random content in frequencies, which is why laughter cannot be considered a genuine codification of patterned information as in linguistic signals. In order to numerically gauge the disorder content of laughter frequencies, we have performed several "entropy" measures of the spectra, trying to unambiguously distinguish spontaneous laughter from "faked", articulated laughter. Interestingly, Shannon's entropy (the most natural candidate) performs rather poorly.
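As a rough illustration of the kind of "entropy" measure described above, here is a minimal sketch of Shannon's entropy computed over a normalized power spectrum; the spectra below are hypothetical stand-ins, not the paper's data:

```python
import math

def shannon_entropy(power_spectrum):
    """Shannon entropy (bits) of a power spectrum, normalized to a
    probability distribution over frequency bins."""
    total = sum(power_spectrum)
    probs = [p / total for p in power_spectrum if p > 0]
    return -sum(p * math.log2(p) for p in probs)

# A sharply peaked spectrum (vowel-like, energy in a few vibration
# modes) yields low entropy ...
vowel_like = [0.01, 0.9, 0.05, 0.02, 0.01, 0.01]
# ... while a flat, disordered spectrum (laughter-like) approaches
# the maximum log2(N) ~ 2.585 bits for N = 6 bins.
laughter_like = [1 / 6] * 6

print(shannon_entropy(vowel_like))
print(shannon_entropy(laughter_like))
```

A flat spectrum maximizes this measure, which is why a highly random frequency content scores near log2(N) while a patterned linguistic signal scores well below it.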

Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful.
Abstract: There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information, but pseudo-information, that is not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates the important implications of the revised definition for the analysis of the deflationary theories of truth, the standard definition of knowledge and the classic, quantitative theory of semantic information.

Journal ArticleDOI
Feng Wu, Chih Wu, Fangzhong Guo, Qing Li, Lingen Chen
31 Dec 2003-Entropy
TL;DR: A generalized heat transfer model using a complex heat transfer exponent is presented, and the optimization zone for the performance of the thermoacoustic heat engine is obtained.
Abstract: Heat transfer between a thermoacoustic engine and its surrounding heat reservoirs can be out of phase with the oscillating working gas temperature. The paper presents a generalized heat transfer model using a complex heat transfer exponent. Both the real part and the imaginary part of the heat transfer exponent change the power versus efficiency relationship quantitatively. When the real part of the heat transfer exponent is fixed, the power output P decreases and the efficiency increases with increasing imaginary part. The optimization zone for the performance of the thermoacoustic heat engine is obtained. The results will be helpful for further understanding and for selecting the optimal operating mode of the thermoacoustic heat engine.
Keywords: Thermoacoustic engine, Complex heat transfer exponent, Optimization zone
Introduction: A thermoacoustic engine (prime mover or refrigerator) [1-4] offers the advantages of high reliability, low noise, simple construction, no moving parts, no pollution, and the ability to self-start. It can utilize a wide variety of energy resources: solar, geothermal, industrial waste heat and marsh gas. This has great significance for environmental protection and for moderating the pressing petroleum needs of the world. With this great potential, the thermoacoustic engine has captivated many engineers [...]

Journal ArticleDOI
05 Feb 2003-Entropy
TL;DR: It is argued that the individual is a social, self-conscious, creative, reflective, cultural, symbol- and language-using, active natural, producing, labouring, objective, corporeal, living, real, sensuous, visionary, imaginative, designing, co-operative being that makes its own history and can strive towards freedom and autonomy.
Abstract: The aim of this paper is to point out which role the individual plays in the generation of information in social systems. First, it is argued that the individual is a social, self-conscious, creative, reflective, cultural, symbol- and language-using, active natural, producing, labouring, objective, corporeal, living, real, sensuous, visionary, imaginative, designing, co-operative being that makes its own history and can strive towards freedom and autonomy. Based on these assumptions the re-creation/self-organisation of social systems is described as a dialectic of actions and social structures and as a dialectic of individual information and social information. The individual enters economic, political and cultural relationships that result in the emergence and differentiation of social (i.e. economic, political and cultural) information which enables and constrains individual actions and thinking. Individuals as actors in social systems are indispensable for social self-organisation.

Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: The measure-theoretic entropy of the additive one-dimensional cellular automaton f_∞ with respect to the uniform Bernoulli measure μ is shown to be h_μ(f_∞) = 2k log r, where k ≥ 1 and r-1 ∈ S.
Abstract: We show that for an additive one-dimensional cellular automaton on the space of all doubly infinite sequences with values in a finite set S = {0, 1, 2, ..., r-1}, determined by an additive automaton rule f(x_{n-k}, ..., x_{n+k}) = Σ_{i=-k}^{k} λ_i x_{n+i} (mod r), and an f_∞-invariant uniform Bernoulli measure μ, the measure-theoretic entropy of the additive one-dimensional cellular automaton with respect to μ is equal to h_μ(f_∞) = 2k log r, where k ≥ 1 and r-1 ∈ S. We also show that the uniform Bernoulli measure is a measure of maximal entropy for the additive one-dimensional cellular automaton f_∞.
Keywords: Cellular Automata, Measure Entropy, Topological Entropy.
Introduction: Although additive cellular automata theory and the entropy of additive cellular automata have grown up somewhat independently, there are strong connections between entropy theory and cellular automata theory. We give an introduction to additive cellular automata theory and then discuss the entropy of additive cellular automata. In [1] an algorithm was given for computing the topological entropy of positively expansive cellular automata. For a definition and some properties of additive one-dimensional cellular automata we refer to [2] (see also [1-6]). The study of the endomorphisms and the automorphisms (i.e. continuous shift-commuting maps, invertible or non-invertible) of the full shift and its subshifts was initiated by Hedlund and coworkers in [3].
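To make the additive rule concrete, here is a minimal sketch of one step of such an automaton over Z_r; the coefficients and lattice size are illustrative, and a finite circular lattice stands in for the doubly infinite sequence space:

```python
import math

def additive_ca_step(config, coeffs, r):
    """One step of an additive 1-D cellular automaton: each new cell is
    the coefficient-weighted sum of its radius-k neighborhood, mod r.
    coeffs has length 2k+1; the lattice is treated as circular."""
    n = len(config)
    k = (len(coeffs) - 1) // 2
    return [
        sum(coeffs[j] * config[(i + j - k) % n] for j in range(len(coeffs))) % r
        for i in range(n)
    ]

def measure_entropy(k, r):
    """The paper's value for the measure-theoretic entropy with respect
    to the uniform Bernoulli measure: h = 2k log r (natural log)."""
    return 2 * k * math.log(r)

# Radius-1 rule over Z_2 with coefficients (1, 0, 1), i.e.
# new cell = x_{n-1} + x_{n+1} (mod 2):
config = [0, 1, 1, 0, 1, 0, 0, 1]
print(additive_ca_step(config, [1, 0, 1], 2))
print(measure_entropy(1, 2))  # 2 log 2
```

For k = 1 and r = 2 the stated entropy is 2 log 2, independent of the particular coefficients (subject to the paper's conditions on the rule).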

Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: The nature of heat transfer and entropy generation for natural convection in a two-dimensional circular section enclosure is investigated in terms of Nusselt number, entropy generation number, and Bejan number.
Abstract: We investigate the nature of heat transfer and entropy generation for natural convection in a two-dimensional circular section enclosure. The enclosure is assumed to be filled with porous media. The Darcy momentum equation is used to model the porous media. The full governing differential equations are simplified with the Boussinesq approximation and solved by a finite volume method. The Prandtl number Pr is fixed at 1.0. Results are presented in terms of Nusselt number, entropy generation number, and Bejan number.
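The Bejan number reported in such studies is conventionally the ratio of heat-transfer entropy generation to total entropy generation; a minimal sketch, with hypothetical local values rather than this paper's results:

```python
def bejan_number(s_gen_heat, s_gen_friction):
    """Be = S_gen,heat / (S_gen,heat + S_gen,friction).
    Be -> 1: heat-transfer irreversibility dominates;
    Be -> 0: fluid-friction irreversibility dominates."""
    return s_gen_heat / (s_gen_heat + s_gen_friction)

# Hypothetical local entropy generation rates (same units, e.g. W/(m^3 K)):
print(bejan_number(0.8, 0.2))  # heat-transfer irreversibility dominates
```

In Darcy-model porous convection at Pr = 1 the friction term is often small, so reported Bejan numbers tend toward 1 in conduction-dominated regimes.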

Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: It is argued that a true transdisciplinary information science going from physical information to phenomenological understanding needs a metaphysical framework and that C. S. Peirce's triadic semiotic philosophy combined with a cybernetic and systemic view, like N. Luhmann's, could create the framework I call Cybersemiotics.
Abstract: It is argued that a true transdisciplinary information science going from physical information to phenomenological understanding needs a metaphysical framework. Three different kinds of causality are implied: efficient, formal and final. And at least five different levels of existence are needed: 1. The quantum vacuum fields with entangled causation. 2. The physical level with its energy- and force-based efficient causation. 3. The informational-chemical level with its formal causation based on pattern fitting. 4. The biological-semiotic level with its non-conscious final causation and 5. The social-linguistic level of self-consciousness with its conscious goal-oriented final causation. To integrate these consistently in an evolutionary theory as emergent levels, neither mechanical determinism nor complexity theory is sufficient, because they cannot be a foundation for a theory of lived meaning. C. S. Peirce's triadic semiotic philosophy combined with a cybernetic and systemic view, like N. Luhmann's, could create the framework I call Cybersemiotics.


Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: Based on a model of a two-heat-reservoir heat engine with a finite high-temperature source and bypass heat leak, the optimal configuration of the cycle is found for the fixed cycle period with another linear heat transfer law.
Abstract: Based on a model of a two-heat-reservoir heat engine with a finite high-temperature source and bypass heat leak, the optimal configuration of the cycle is found for the fixed cycle period with another linear heat transfer law [...]

Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: The nature of entropy generation for natural convection in a two-dimensional square section enclosure vibrating sinusoidally perpendicular to the applied temperature gradient in a zero-gravity field is investigated.
Abstract: We investigate the nature of entropy generation for natural convection in a two-dimensional square section enclosure vibrating sinusoidally perpendicular to the applied temperature gradient in a zero-gravity field. The enclosure is assumed to be filled with porous media. The Darcy momentum equation is used to model the porous media. The full governing differential equations are simplified with the Boussinesq approximation and solved by a finite volume method. The Prandtl number Pr is fixed at 1.0. Results are presented in terms of average Nusselt number (Nu_av), entropy generation number (Ns_av), Bejan number (Be_av), and kinetic energy (KE_av).
Keywords: vibrational convection, entropy generation, porous media, Bejan number, square cavity.
Introduction: Free convection heat transfer inside a square cavity (porous or non-porous) is a well-established problem. During the last two decades, a vast number of articles have been published in this field. For a comprehensive reference see Bejan [1]. Free convection inside a square cavity with gravity oscillation is a special class of the above-mentioned problem. In low gravity or microgravity environments, we can expect that reduction or elimination of natural convection may enhance the properties and performance [...]

Journal ArticleDOI
15 Nov 2003-Entropy
TL;DR: The concept of minimum entropy as a paradigm of both Nash's equilibrium (maximum utility MU) and Hayek equilibrium (minimum entropy ME) is introduced.
Abstract: This paper introduces Hermite's polynomials in the description of quantum games. Hermite's polynomials are associated with the Gaussian probability density, which represents minimum dispersion. I introduce the concept of minimum entropy as a paradigm of both Nash's equilibrium (maximum utility, MU) and Hayek equilibrium (minimum entropy, ME). The ME concept is related to Quantum Games. Some questions arise after carrying out this exercise: i) What does Heisenberg's uncertainty principle represent in Game Theory and Time Series? and ii) What do the postulates of Quantum Mechanics indicate in Game Theory and Economics?
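The link between the Gaussian density, minimum dispersion, and minimum entropy can be pictured with the textbook differential entropy of a Gaussian, h = ½ ln(2πeσ²), which shrinks as the dispersion σ shrinks; this is a standard formula used as an illustration here, not code from the paper:

```python
import math

def gaussian_differential_entropy(sigma):
    """Differential entropy (nats) of N(mu, sigma^2): 0.5*ln(2*pi*e*sigma^2).
    Among all densities with variance sigma^2 the Gaussian maximizes this,
    and for a Gaussian it decreases monotonically with the dispersion."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Smaller dispersion -> smaller entropy, matching the 'minimum entropy'
# reading of minimum-dispersion (Gaussian) equilibrium states.
print(gaussian_differential_entropy(1.0))
print(gaussian_differential_entropy(0.1))
```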

Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: The phase space volume Ω occupied by a nonextensive system of N classical particles described by an equilibrium distribution function, which slightly deviates from Maxwell-Boltzmann (MB) distribution in the high energy tail is calculated.
Abstract: We calculate the phase space volume Ω occupied by a nonextensive system of N classical particles described by an equilibrium (or steady-state, or long-term stationary state of a nonequilibrium system) distribution function, which slightly deviates from the Maxwell-Boltzmann (MB) distribution in the high energy tail. We explicitly require that the number of accessible microstates does not change with respect to the extensive MB case. We also derive, within a classical scheme, an analytical expression for the elementary cell, which can be seen as a macrocell different from the third power of the Planck constant. Thermodynamic quantities like the entropy, chemical potential and free energy of a classical ideal gas, which depend on the elementary cell, are evaluated. Considering the fractional deviation from the MB distribution, we can deduce a physical meaning of the nonextensive parameter q of the Tsallis nonextensive thermostatistics in terms of particle correlation functions (valid at least in the case, discussed in this work, of small deviations from the standard MB case).
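One way to picture a "slight deviation in the high energy tail" is the Tsallis q-exponential, which reduces to the ordinary exponential as q → 1; this is the standard textbook form, with an illustrative q value, not the paper's specific distribution:

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential: exp_q(x) = [1 + (1-q)x]^(1/(1-q)),
    reducing to exp(x) as q -> 1; set to 0 where the base is negative."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# High-energy tail: a q slightly above 1 decays more slowly than the
# Maxwell-Boltzmann factor exp(-E/kT), a small nonextensive correction.
for energy in (1.0, 5.0, 10.0):
    mb = q_exponential(-energy, 1.0)   # standard MB factor
    ts = q_exponential(-energy, 1.05)  # illustrative nonextensive q
    print(energy, mb, ts)
```

The gap between the two factors grows with energy, which is exactly where the abstract locates the deviation from the MB distribution.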

Journal ArticleDOI
30 Jan 2003-Entropy
TL;DR: This Introduction will briefly survey the problems around the concept of information, will present the central ideas of the FIS initiative, and will contrast some of the basic differences between information and mechanics (reductionism).
Abstract: The accompanying papers in the first issue of Entropy, volume 5, 2003 were presented at the electronic conference on Foundations of Information Science FIS 2002 (http://www.mdpi.net/fis2002/). The running title of this FIS e-conference was THE NATURE OF INFORMATION: CONCEPTIONS, MISCONCEPTIONS, AND PARADOXES. It was held on the Internet from 6 to 10 May 2002, and was followed by a series of discussions –structured as focused sessions– which took place in the net from 10 May 2002 until 31 January 2003 (more than 400 messages were exchanged, see: http://fis.iguw.tuwien.ac.at/mailings/). This Introduction will briefly survey the problems around the concept of information, will present the central ideas of the FIS initiative, and will contrast some of the basic differences between information and mechanics (reductionism).

Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: A new model for the droplet size distribution has been developed based on the thermodynamically consistent concept of maximization of entropy generation during the liquid atomization process, which can be used to predict the initial droplet size distribution in sprays.
Abstract: The maximum entropy principle (MEP), which has been popular in the modeling of droplet size and velocity distribution in sprays, is, strictly speaking, only applicable for isolated systems in thermodynamic equilibrium; whereas the spray formation processes are irreversible and non-isolated with interaction between the atomizing liquid and its surrounding gas medium. In this study, a new model for the droplet size distribution has been developed based on the thermodynamically consistent concept - the maximization of entropy generation during the liquid atomization process. The model prediction compares favorably with the experimentally measured size distribution for droplets, near the liquid bulk breakup region, produced by an air-blast annular nozzle and a practical gas turbine nozzle. Therefore, the present model can be used to predict the initial droplet size distribution in sprays.

Journal ArticleDOI
16 Jul 2003-Entropy
TL;DR: In this paper, it was shown that the Clausius analysis leading to the law of increasing entropy does not follow from the given axioms but it can be proved that for irreversible transitions, the total entropy change of the system and thermal reservoirs (the universe) is not negative, even for the case when the reservoirs are not at the same temperature as the system during heat transfer.
Abstract: Recently, there have appeared interesting correctives or challenges [ Entropy 1999, 1 , 111-147] to the Second Law formulations, especially in the interpretation of the Clausius equivalent transformations, closely related to extensions of the Clausius principle to irreversible processes [ Chem. Phys. Lett . 1988, 143 ( 1 ), 65-70]. Since the traditional formulations are central to science, a brief analysis of some of these newer theories along traditional lines is attempted, based on well-attested axioms which have formed the basis of equilibrium thermodynamics. It is deduced that the Clausius analysis leading to the law of increasing entropy does not follow from the given axioms, but it can be proved that, for irreversible transitions, the total entropy change of the system and thermal reservoirs (the “Universe”) is non-negative, even when the reservoirs are not at the same temperature as the system during heat transfer. On the basis of two new simple theorems and three corollaries derived for the correlation between irreversible and reversible pathways and the traditional axiomatics, it is shown that a sequence of reversible states can never be used to describe a corresponding sequence of irreversible states, at least for closed systems, thereby restricting the principle of local equilibrium. It is further shown that some of the newer irreversible entropy forms exhibit paradoxical properties relative to the standard axiomatics. It is deduced that any reconciliation between the traditional approach and the novel theories lies in creating a well-defined set of axioms on which all theoretical developments should be based, unless an axiom is proven not to be useful, in which case there should be consensus on removing it from the theory. Clausius’ theory of equivalent transformations does not contradict the traditional axiomatics.
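For reference, the standard argument behind the abstract's central claim can be stated compactly. For a closed system exchanging heat with a reservoir at temperature T_res (not necessarily equal to the system temperature), the Clausius inequality gives:

```latex
% Clausius inequality for a process between states 1 and 2:
\Delta S_{\mathrm{sys}} \;\geq\; \int_{1}^{2} \frac{\delta Q}{T_{\mathrm{res}}},
\qquad
\Delta S_{\mathrm{res}} \;=\; -\int_{1}^{2} \frac{\delta Q}{T_{\mathrm{res}}},
```

so that the total entropy change satisfies

```latex
\Delta S_{\mathrm{univ}} \;=\; \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{res}} \;\geq\; 0,
```

with equality only when the heat transfer is reversible. Note that the reservoir temperature, not the system temperature, appears in the integrals, which is why the result holds even when the two differ during the transfer.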

Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: It is found that increasing pipe wall temperature and Reynolds number increases the entropy production rate, in which case, entropy generation due to heat transfer dominates over that corresponding to fluid friction.
Abstract: In the present study, heat transfer and entropy analysis for flow through a pipe system is considered. The effects of the Reynolds number and the pipe wall temperature on the entropy distribution and the total entropy generation in the pipe are investigated. A numerical scheme employing a control-volume approach is used to solve the governing equations. Steel is selected as the pipe material, while water is used as the fluid. It is found that increasing the pipe wall temperature and the Reynolds number increases the entropy production rate, in which case entropy generation due to heat transfer dominates over that due to fluid friction.
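A back-of-the-envelope sketch of the trade-off the abstract describes, using Bejan's per-unit-length entropy generation rate for a smooth pipe, which splits into a heat-transfer term and a fluid-friction term. This is not the paper's control-volume model; the correlations (Dittus-Boelter, Blasius) and the water properties and operating conditions in the usage example are assumptions for illustration only.

```python
import math

def pipe_entropy_gen(m_dot, q_len, D, T, rho, mu, k, Pr):
    """Entropy generation rate per unit pipe length, W/(m*K).

    s_heat: contribution of the finite wall-fluid temperature
            difference sustaining the heat flux q_len (W/m).
    s_fric: contribution of viscous pressure-drop dissipation.
    """
    Re = 4.0 * m_dot / (math.pi * D * mu)
    Nu = 0.023 * Re ** 0.8 * Pr ** 0.4   # Dittus-Boelter (turbulent flow)
    f = 0.316 * Re ** -0.25              # Blasius (Darcy friction factor)
    s_heat = q_len ** 2 / (math.pi * k * Nu * T ** 2)
    s_fric = 8.0 * f * m_dot ** 3 / (rho ** 2 * math.pi ** 2 * D ** 5 * T)
    return s_heat, s_fric
```

For moderate heat fluxes in turbulent water flow (e.g. `m_dot=0.1`, `q_len=2000.0`, `D=0.02`, `T=320.0` with room-temperature water properties), the heat-transfer term exceeds the friction term by several orders of magnitude, consistent with the abstract's finding that heat transfer dominates.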

Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: Calculations show that the cost allocation by using the reduced exergy model is more rational and practical than those by using existing models in terms of embodying the physical meaning.
Abstract: Although the cost allocation method does not change the total benefits of CHP, the use of different cost allocation methods generally results in significant differences in the costs allocated to CHP products. In order to overcome the inadequacy of existing cost allocation methods in theory and in practice, and according to the different roles of anergy and exergy in the heat supply process of a CHP plant, the reduced exergy method for cost allocation is formulated by introducing the concepts of available anergy and reduced exergy. The contribution of the available anergy is expressed with a user factor, which can reflect different utilization under different practical conditions. Several practical conditions for typical CHP units are computed and compared with existing methods. The calculations show that cost allocation using the reduced exergy model is more rational and practical than allocation using existing models in terms of embodying the physical meaning.
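The paper's reduced-exergy model itself is not reproduced here, but the conventional exergy method it generalizes is easy to state: fuel cost is split between power and heat in proportion to exergy content, where electricity counts as pure exergy and heat delivered at temperature T_supply carries exergy weighted by the Carnot factor 1 - T0/T_supply. All function names and numbers below are illustrative assumptions.

```python
def exergy_cost_split(fuel_cost, W_el, Q_heat, T_supply, T0=298.15):
    """Allocate fuel cost between electricity and heat by exergy content.

    W_el: electrical output (pure exergy).
    Q_heat: heat output delivered at absolute temperature T_supply;
            its exergy is Q_heat * (1 - T0/T_supply), with T0 the
            dead-state (ambient) temperature.
    """
    ex_heat = Q_heat * (1.0 - T0 / T_supply)
    total_ex = W_el + ex_heat
    cost_el = fuel_cost * W_el / total_ex
    cost_heat = fuel_cost * ex_heat / total_ex
    return cost_el, cost_heat
```

Because the Carnot factor is below one, the method charges each unit of electricity more than each unit of heat, which is precisely the behavior the reduced-exergy model adjusts via its user factor.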

Journal ArticleDOI
30 Jun 2003-Entropy
TL;DR: In this framework, enzymes are molecular automata of the extremal quantum computer, the set of which maintains stable highly ordered coherent state, and genome represents a concatenation of error-correcting codes into a single reflective set.
Abstract: The smallest details of living systems are molecular devices that operate between the classical and quantum levels, i.e. between the potential dimension (microscale) and the actual three-dimensional space (macroscale). They realize non-demolition quantum measurements in which time appears as a mesoscale dimension separating contradictory statements in the course of actualization. These smaller devices form larger devices (macromolecular complexes), up to the living body. The quantum device possesses its own potential internal quantum state (IQS), which is maintained for prolonged time via error-correction, being a reflection over this state. The decoherence-free IQS can exhibit itself by a creative generation of iteration limits in the real world. To avoid a collapse of the quantum information in the process of correcting errors, it is possible to make a partial measurement that extracts only the error-information and leaves the encoded state untouched. In natural quantum computers, which are living systems, the error-correction is internal. It is a result of reflection, given as a sort of a subjective process allotting optimal limits of iteration. The IQS resembles the properties of a quasi-particle, which interacts with the surround, applying decoherence commands to it. In this framework, enzymes are molecular automata of the extremal quantum computer, the set of which maintains a stable, highly ordered coherent state, and the genome represents a concatenation of error-correcting codes into a single reflective set. Biological systems, being autopoietic in physical space, control quantum measurements in the physical universe. The biological evolution is really a functional evolution of measurement constraints in which limits of iteration are established, possessing criteria of perfection and having selective values.
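The "partial measurement that extracts only the error-information" has a simple classical analogue in the three-bit repetition code: parity checks (the syndrome) reveal where a flip occurred without revealing the encoded value itself. This toy sketch is an illustration of that principle, not code from the paper.

```python
def encode(bit):
    """Encode one logical bit as a three-bit repetition codeword."""
    return [bit, bit, bit]

def syndrome(code):
    """Parity checks: reveal which position (if any) was flipped,
    but carry no information about the encoded logical value."""
    return (code[0] ^ code[1], code[1] ^ code[2])

def correct(code):
    """Use only the syndrome to locate and undo a single bit flip,
    leaving the encoded information otherwise untouched."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(code))
    if flip is not None:
        code[flip] ^= 1
    return code
```

Note that flipping the first bit of `encode(0)` and of `encode(1)` produces the same syndrome `(1, 0)`: the measurement localizes the error while the logical content stays hidden, which is the classical shadow of syndrome extraction in quantum error correction.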

Journal ArticleDOI
31 Dec 2003-Entropy
TL;DR: The analytical solutions for the temperature variation of two streams in parallel flow, counter flow and cross-flow heat exchangers and related entropy generation due to heat exchange between the streams are presented.
Abstract: The analytical solutions for the temperature variation of two streams in parallel flow, counter flow and cross-flow heat exchangers and related entropy generation due to heat exchange between the streams are presented. The analysis of limiting cases for the relative entropy generation is performed, and corresponding analytical expressions are given. The obtained results may be included in a more general procedure concerning optimal heat exchanger design.
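A minimal numerical sketch of the counter-flow case discussed in the abstract: outlet temperatures follow the standard effectiveness-NTU relation, after which the entropy generation is evaluated from the two streams' capacity rates. The capacity rates, NTU, and inlet temperatures in the usage example are illustrative assumptions, not values from the paper.

```python
import math

def counterflow_entropy_gen(C_hot, C_cold, NTU, T_hot_in, T_cold_in):
    """Entropy generation rate (W/K) of a counter-flow heat exchanger.

    C_hot, C_cold: capacity rates m_dot * c_p (W/K); temperatures in K.
    S_gen = C_h ln(T_h,out/T_h,in) + C_c ln(T_c,out/T_c,in),
    treating each stream as an ideal constant-c_p fluid.
    """
    C_min, C_max = min(C_hot, C_cold), max(C_hot, C_cold)
    Cr = C_min / C_max
    if abs(Cr - 1.0) < 1e-12:                 # balanced-stream limit
        eff = NTU / (1.0 + NTU)
    else:
        e = math.exp(-NTU * (1.0 - Cr))
        eff = (1.0 - e) / (1.0 - Cr * e)
    Q = eff * C_min * (T_hot_in - T_cold_in)  # heat duty, W
    T_hot_out = T_hot_in - Q / C_hot
    T_cold_out = T_cold_in + Q / C_cold
    S_gen = (C_hot * math.log(T_hot_out / T_hot_in)
             + C_cold * math.log(T_cold_out / T_cold_in))
    return S_gen, T_hot_out, T_cold_out
```

For any positive NTU and distinct inlet temperatures the result is strictly positive, in line with the Second Law; S_gen vanishes only in the limit of zero heat exchange or matched stream temperatures.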