People tend to think of capitalism only in economic terms. Karl Marx argued that capitalism is a political and economic system that turns human productive labor into profit and income for those who own the means of production. Defenders of capitalism counter that it is an economic system that promotes free markets and individual freedom. Both opponents and defenders of capitalism typically measure its impact in terms of wealth and income, wages and prices, supply and demand.
However, the human economy is a complex biophysical system that interacts with the wider natural world, and nothing can be fully understood in isolation from its material conditions. By studying a few fundamental concepts from physics, we can better understand the economic system as a whole, including the causes of capitalism's harmful effects on humanity and the planet.
In this article we explain how the fundamental features of our natural and economic existence depend on the principles of thermodynamics, the study of the relationships between quantities such as energy, work, and heat. Understanding how capitalism works at the physical level will help us see why our future economic system must be more environmentally sound, giving priority to long-term stability and compatibility with the global ecosphere that sustains humanity.
To follow the argument, you will need a few basic concepts from physics: energy, entropy, dissipation, and the various laws of nature that bind them together. The central features of our existence as living organisms and as humans arise from collective interactions described by these basic physical laws. Although these concepts can be difficult to define without reference to specific models and theories, their common features can be outlined and analyzed to reveal the powerful intersection between physics and economics.
The exchange of energy between different systems has a decisive influence on the phase, order, and stability of physical matter. Energy can be defined as any persistent physical property capable of producing change, such as work done on, or heat exchanged between, different systems. Kinetic energy and potential energy are two of the most important forms of stored energy, and their sum is known as mechanical energy. A truck speeding down the highway carries kinetic energy, the energy associated with motion. A boulder poised on the edge of a cliff holds potential energy, the energy associated with position. Give it a small push, and its potential energy is converted into kinetic energy under the influence of gravity. When physical systems interact, energy is converted into many different forms, but the total amount always remains constant. Conservation of energy means that the sum of all energy flows and conversions coming out of a process must equal the total input.
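The boulder example can be sketched in a few lines of code. This is a minimal illustration of conservation of mechanical energy for a body dropped from rest, ignoring air resistance; the mass and cliff height are hypothetical values chosen for the example.

```python
G = 9.81      # gravitational acceleration, m/s^2
MASS = 500.0  # kg (hypothetical boulder)
H0 = 80.0     # initial height of the cliff, m (hypothetical)

def energies(h):
    """Potential and kinetic energy (in joules) at height h for a
    boulder dropped from rest at H0, ignoring air resistance."""
    potential = MASS * G * h
    kinetic = MASS * G * (H0 - h)  # all of the lost PE becomes KE
    return potential, kinetic

# At every height during the fall, the sum PE + KE equals the
# mechanical energy the boulder started with at the top.
total_at_top = MASS * G * H0
for h in (80.0, 60.0, 40.0, 20.0, 0.0):
    pe, ke = energies(h)
    assert abs((pe + ke) - total_at_top) < 1e-9
```

Energy shifts between the potential and kinetic forms as the boulder falls, but the total never changes, which is exactly the bookkeeping that conservation of energy demands.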
The exchange of energy between different systems is a kind of cosmic engine, running everywhere, though we hardly notice it. Heat naturally flows from warmer bodies to cooler ones, which is why our morning coffee grows cold. Particles move from regions of high pressure to regions of low pressure, which is why the wind howls. Water moves from regions of high potential energy to regions of low potential energy, which is why rivers flow. Electric charges move from regions of high voltage to regions of low voltage, which is why current passes through conductors. The flow of energy through a physical system is one of the most common features of nature, and, as these examples show, energy flows require a gradient: a difference in temperature, pressure, density, or some other quantity. Without such gradients there would be no exchange; all physical systems would sit in equilibrium, and the world would be inert and very boring. Energy flows also matter because they can generate mechanical work, that is, any macroscopic motion produced in response to a force. Lifting a weight or kicking a ball are examples of mechanical work acting on another system. A key achievement of classical physics was to equate the work done on a physical system with the change in its mechanical energy, establishing the relationship between these two quantities.
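The cooling coffee illustrates how a gradient drives a flow and how the flow dies away as equilibrium approaches. The sketch below uses Newton's law of cooling, in which heat flow is proportional to the temperature difference; the cooling coefficient and temperatures are illustrative values, not measurements.

```python
def cool(coffee_temp, room_temp, k=0.05, steps=200):
    """Step the coffee temperature toward room temperature.
    Each step removes heat in proportion to the remaining
    temperature gradient (Newton's law of cooling)."""
    for _ in range(steps):
        gradient = coffee_temp - room_temp
        coffee_temp -= k * gradient  # flow proportional to gradient
    return coffee_temp

# Hot coffee in a 20 °C room: the gradient shrinks with every step,
# so the flow of heat slows and the cup settles near room temperature.
final = cool(90.0, 20.0)
```

Notice that if the coffee starts at room temperature there is no gradient, no heat flows, and nothing happens, which is the inert, "boring" equilibrium described above.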
Although energy flows can produce mechanical work, they rarely do so efficiently. Large macroscopic systems, such as trucks or planets, usually lose or gain mechanical energy through their interactions with the outside world. The key process here is dissipation: a process that partially reduces or completely eliminates the mechanical energy of a physical system, converting it into heat or radiation. As they interact with their surroundings, physical systems tend to lose mechanical energy over time through friction, diffusion, turbulence, vibration, collision, and similar dissipative effects, all of which stand in the way of completely converting an energy source into mechanical work. A simple example of dissipation is the heat generated when we briskly rub our hands together. In the natural world, macroscopic energy flows are almost always accompanied by dissipative losses of one kind or another. A physical system that can dissipate energy is capable of rich and complex interactions, which makes dissipation a central feature of the natural order. A world without dissipation, and without the interactions that make dissipation possible, is hard to imagine. If friction suddenly vanished, people would simply slide everywhere. Our machines would be useless, since wheels and other mechanical devices would have no grip on the ground or any other surface. We could never hold hands or swing a child by the arms. Our bodies would quickly lose their internal structure. It would be a different world.
Dissipation is closely linked to entropy, one of the most important concepts in thermodynamics. While energy measures the capacity of physical systems to produce motion, entropy tracks how that energy is distributed in the natural world. Entropy has several definitions in physics, all essentially equivalent. One popular definition from classical thermodynamics states that entropy represents the amount of thermal energy per unit temperature that is unavailable for mechanical work in a thermodynamic process. Another important definition comes from statistical physics, which connects the microscopic parts of nature to macroscopic outcomes. In this statistical version, entropy is a measure of the number of ways the microscopic state of a large system can be rearranged without changing its macroscopic appearance. For a concrete example, compare a typical gas with a typical solid, both in equilibrium. Energy is distributed very differently in these two phases of matter. A gas has higher entropy than a solid because its particles admit far more possible energy configurations than the atoms fixed at lattice sites in solids and crystals, which allow only a small range of configurations that preserve their fundamental structure. We should emphasize that entropy does not describe one particular configuration of a macroscopic substance; rather, it characterizes the number of possible configurations available to a macroscopic system in equilibrium.
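The statistical definition can be made concrete by actually counting configurations. The sketch below uses the standard stars-and-bars formula for the number of ways to distribute indistinguishable energy quanta among a set of sites (the counting behind the Einstein-solid model) and Boltzmann's relation S = k ln Ω; the specific site and quanta counts are illustrative, not drawn from the article.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(n_sites, q_quanta):
    """Number of ways to distribute q indistinguishable energy
    quanta among n sites (stars-and-bars counting)."""
    return math.comb(q_quanta + n_sites - 1, q_quanta)

def entropy(n_sites, q_quanta):
    """Boltzmann entropy S = k * ln(Omega)."""
    return K_B * math.log(multiplicity(n_sites, q_quanta))

# With the same amount of energy, a loosely constrained "gas-like"
# system with many accessible sites has vastly more microstates,
# and hence higher entropy, than a rigid "solid-like" one.
loose = entropy(n_sites=100, q_quanta=50)
rigid = entropy(n_sites=10, q_quanta=50)
assert loose > rigid
```

The entropy difference here comes purely from counting: nothing about any individual configuration changes, only the number of configurations the macroscopic state leaves open.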
Entropy connects deeply to dissipation through one of the most important laws of thermodynamics: heat flows can never be completely converted into work. Dissipative interactions ensure that physical systems always lose some energy as heat in any natural thermodynamic process involving friction or similar effects. Everyday examples of such thermodynamic losses include the exhaust of automobile engines, electric currents meeting resistance, and the interacting layers of any fluid with viscosity. In thermodynamics, these phenomena are considered irreversible. The continuous production of thermal energy by irreversible phenomena gradually depletes the supply of mechanical energy that a physical system can exploit. By the definition of entropy, this exhaustion of useful mechanical energy generally involves an increase in entropy. Stated formally, the most important consequence of any irreversible process is an increase in the combined entropy of a physical system and its environment. For an isolated system, entropy keeps increasing until it reaches a maximum value, at which point the system settles into equilibrium. To picture this, imagine a red gas and a blue gas separated by a partition inside a sealed container. Removing the partition allows the two gases to mix. The result is a purple-looking gas whose equilibrium configuration represents the state of maximum entropy. We can also tie dissipation to the statistical concept of entropy. As thermal energy spreads through a physical system, the motion of its molecules becomes more random and dispersed, increasing the number of microstates compatible with the system's macroscopic properties. In a broad sense, entropy can be seen as nature's tendency to reconfigure energy states into distributions that dissipate mechanical energy.
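The red-and-blue-gas thought experiment has a textbook formula behind it: for ideal gases, the entropy of mixing is ΔS = −R Σᵢ nᵢ ln xᵢ, which is strictly positive whenever two distinct gases mix. A minimal sketch, with the mole numbers chosen purely for illustration:

```python
import math

R = 8.314  # ideal gas constant, J/(mol*K)

def mixing_entropy(n_a, n_b):
    """Entropy of mixing (J/K) for two distinct ideal gases,
    from Delta S = -R * sum(n_i * ln(x_i))."""
    n = n_a + n_b
    x_a, x_b = n_a / n, n_b / n
    return -R * (n_a * math.log(x_a) + n_b * math.log(x_b))

# One mole of "red" gas and one mole of "blue" gas: removing the
# partition always raises the entropy, which is why the purple
# mixture never spontaneously un-mixes.
delta_s = mixing_entropy(1.0, 1.0)
assert delta_s > 0
```

For equal amounts the formula reduces to ΔS = 2R ln 2, about 11.5 J/K per two moles: a small number, but always positive, which is the arrow of irreversibility in miniature.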
The traditional description of entropy given above applies in the regime of equilibrium thermodynamics. But in the real world, physical systems rarely exist at a fixed temperature, in a perfect state of balance, or in complete isolation from the rest of the universe. The field of non-equilibrium thermodynamics studies systems that are far from equilibrium, such as living organisms or exploding bombs. Non-equilibrium systems are a vital foundation of the universe; they make the world dynamic and unpredictable. Modern non-equilibrium thermodynamics is still under development, but it is already being used successfully to study a wide variety of phenomena, including heat flow, interacting quantum gases, dissipative structures, and even the global climate. There is no universally accepted definition of entropy under non-equilibrium conditions, but physicists have proposed several candidates. These allow us to analyze thermodynamic interactions and determine not only whether entropy goes up or down, but also how quickly or slowly a physical system can change on its way to equilibrium. The principles of modern thermodynamics are therefore needed to help us understand the behavior of real systems, including life itself.
The central physical aim of all life is to avoid thermodynamic equilibrium with its environment by continuously dissipating energy. This characterization goes back to physicist Erwin Schrödinger, who in the 1940s used non-equilibrium thermodynamics to examine the key features of biology. We can call this vital goal the entropic imperative. All living organisms consume energy from their environment, use it to fuel vital biochemical processes and interactions, and then scatter a large part of the consumed energy back into the environment. Dissipating energy into the environment allows organisms to maintain the order and stability of their biochemical systems. The essential functions of life, including digestion, respiration, cell division, and protein synthesis, depend critically on this entropic stability. What makes life a unique physical system is the enormous variety of dissipation methods it has evolved, including heat production, gaseous emissions, and the excretion of waste. This extensive capacity to dissipate energy helps life sustain the entropic imperative. Indeed, physicist Jeremy England has argued that physical systems in a heat bath driven by a large flux of energy may tend to restructure themselves so as to dissipate more energy. This "dissipative adaptation" can lead to the spontaneous emergence of order, replication, and self-assembly among microscopic units of matter, offering a potential clue to the dynamics of the origin of life. Organisms also use the energy they consume to perform mechanical work, for example walking, running, climbing, or typing on a keyboard. Organisms with access to more sources of energy can do more work and dissipate more energy, satisfying the central condition of life.
The thermodynamic relationships between energy, entropy, and dissipation also impose powerful constraints on the behavior and evolution of economic systems. Economies are dynamic, emergent systems that must operate in particular ways because of their social and environmental conditions. In this context, an economy is a non-equilibrium system able to rapidly dissipate energy into its environment. All dynamic systems draw strength from some reservoir of energy, reach peak intensity while consuming a steady amount of energy, and then decline through internal and external changes that either disrupt their vital energy flows or prevent them from continuing to dissipate more energy. They may even undergo prolonged fluctuations, growing for a time, then shrinking, then growing again before they are finally destroyed. Interactions between dynamic systems can produce chaotic results, but energy consumption and dissipation are core features of them all. All the energy an economic system consumes is either converted into mechanical work, or into the physical products derived from that work, or is simply lost and dissipated into the environment. We can define the collective efficiency of an economic system as the proportion of all consumed energy that goes into mechanical work and electrical energy. Economies that increase the amount of mechanical work they produce can generate more goods and services. But however important it may be, mechanical work represents a relatively small proportion of the total energy used in any economy. The vast majority of the energy an economy consumes flows steadily back into the environment through waste, dissipation, and other energy losses.
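The collective-efficiency definition above is just an energy ledger, and a toy version makes the accounting explicit. All the figures below are hypothetical, chosen only to illustrate the claim that most consumed energy returns to the environment rather than becoming useful work.

```python
def collective_efficiency(energy_in, useful_work):
    """Fraction of all consumed energy converted into mechanical
    work and electricity; the remainder is dissipated as waste."""
    return useful_work / energy_in

# Hypothetical annual energy ledger for an economy (arbitrary units).
energy_consumed = 100.0  # total primary energy taken in
useful_work = 30.0       # work and electricity actually delivered
dissipated = energy_consumed - useful_work

eff = collective_efficiency(energy_consumed, useful_work)
# Even in this generous toy ledger, the dissipated share exceeds
# the useful share, echoing the point that mechanical work is a
# minority of any economy's total energy use.
assert dissipated > useful_work
```

Raising the efficiency ratio lets an economy produce more goods and services from the same energy input, but the second law guarantees the dissipated remainder can never be driven to zero.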
Translated by Alexander Romanov
To be continued…
© 2018, z-news.xyz. All rights reserved