As a concept, matter comes earliest in life. Energy requires more acculturation, maybe a physics section of a textbook for a class in grammar school. I believe Information comes later still to our minds. For a physicist like me, it arrives around the time one considers thermodynamics, more precisely when one studies the Maxwell-Boltzmann distribution in Statistical Mechanics.
Engineers and mathematicians take another step with Shannon's insight on the entropic character of Information.
There is a way to axiomatize Thermodynamics with Information Theory. It comes through the so-called Fundamental Postulate.
Here is my take on this route.
All statistical formulations assume that we don't know everything. Starting from the Information we have, we try to advance to a state of higher Information content.
In our lifetimes, then, Information increases, unlike energy, which for a closed system remains constant. Of course we also forget, so we have to keep some kind of Information balance sheet: we write and erase Information, just as on a Turing tape.
On the other hand, in a closed system Entropy increases until the system reaches thermodynamic equilibrium. Some energy is lost as heat, and what remains is the free energy (F), which is less than the energy (E):
F = E - TS (1)
We also have the so-called Partition Function:

Z = Σ_i e^(-E_i/kT) (2)

These two are related:

F = -kT ln Z (3)
There is more to say about these relations; what I want to emphasize here is the minus sign in the free energy formula (1).
The partition function tells us all we know about the system. We can only get average values, because we are not able to follow individual particles. The free energy is the part that does not become heat, the part we can still use.
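To make this concrete, here is a minimal numeric sketch of my own (a toy two-level system with made-up numbers, not any canonical example): we compute Z from formula (2), the average energy from the Boltzmann probabilities, the free energy from formula (3), and then the Entropy from formula (1).

```python
import math

# Toy two-level system: energies 0 and eps, in units where k = 1.
# These numbers are purely illustrative.
eps = 1.0          # energy gap between the two states
T = 0.5            # temperature (with k = 1)
energies = [0.0, eps]

# Partition function (2): Z = sum over states of exp(-E_i / kT)
Z = sum(math.exp(-E / T) for E in energies)

# Boltzmann probabilities: all we can say about individual states
probs = [math.exp(-E / T) / Z for E in energies]

# Average energy: the kind of "average value" the partition function gives us
E_avg = sum(p * E for p, E in zip(probs, energies))

# Free energy (3): F = -kT ln Z, then Entropy from (1): S = (E - F) / T
F = -T * math.log(Z)
S = (E_avg - F) / T

print(f"Z = {Z:.4f}, <E> = {E_avg:.4f}, F = {F:.4f}, S = {S:.4f}")
```

Note that F comes out below the average energy, which is exactly the minus sign of formula (1) at work.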
If we have a closed box with all the particles initially on one side, then when the closed system reaches equilibrium we find half of them on one side of the box and half on the other. All the energy has become heat, the free energy is zero, and the Entropy is at its maximum. There is no way to get useful work from this thermodynamically "dead" state.
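One can count this directly. As a sketch, with an invented particle number: for N distinguishable particles, the number of microstates with n of them on the left side is the binomial coefficient, and S = k ln Ω peaks at the half-and-half split, the equilibrium state.

```python
import math

# Count microstates for N particles split between two halves of a box.
# S = k ln(Omega); we use k = 1 for illustration.
N = 20  # small, made-up particle number

for n_left in (0, 5, 10, 15, 20):
    omega = math.comb(N, n_left)   # ways to choose which particles sit on the left
    S = math.log(omega)            # entropy in units of k
    print(f"{n_left:2d} on the left: Omega = {omega:6d}, S = {S:.3f}")
```

The all-on-one-side state has a single microstate and zero Entropy; the even split has the most microstates and the maximum Entropy.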
Information enters here, then, to guide us in this description of a partially known system. We can actually measure entropy, and thus Information; it is a valid thermodynamic variable, known to scientists even before Boltzmann connected it with the concept of probability.
Now we turn to the 20th century. Shannon related this probability concept to human telephone conversations. If a beautiful woman tells us that she loves us, and we didn't expect it, we are really surprised: we get a lot of Information.
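In Shannon's terms, the surprise of a message is its surprisal, -log2(p), and the less probable the message, the more bits of Information it carries. A small sketch, with probabilities invented purely for illustration:

```python
import math

# Surprisal of a message with probability p: I = -log2(p), measured in bits.
# The probabilities below are invented for illustration only.
def surprisal(p):
    return -math.log2(p)

p_expected = 0.9      # a message we fully expected to hear
p_unexpected = 0.001  # a declaration of love we did not expect

print(f"expected message:   {surprisal(p_expected):.3f} bits")
print(f"unexpected message: {surprisal(p_unexpected):.3f} bits")
```

The expected message carries a fraction of a bit; the unexpected one carries about ten bits, which is the quantitative version of our surprise.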
My problem now is to connect this wholly human notion to non-living pieces of stuff.
I have not done that, but I am encouraged by the concern of Jacob Bekenstein, Verlinde, and now Gao with this elusive concept in Physics.