This note may help me collect my thoughts. Recently I have been thinking about what my contribution in life will be before I die. One of my deep interests has been the nature of entropy. I do not give references here, but I have read the work of several thinkers, some very successful and some less so.
I see two intimately related aspects of the concept of entropy: one is the entropy of the system, the other is my perception of that entropy. Objects are mental constructs; we choose where something begins and where it ends. When we count the number of objects, we are using a filter. The filter is not part of what is outside. Our filter introduces randomness, and therefore increases the entropy, as another observer watching us try to count the objects may himself perceive. Among human beings we use peer review to agree on how many objects we all count. But I feel that either another type of intelligence, or maybe the thing itself, whatever that means, would arrive at a similar number of parts.
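To make the claim about the filter concrete, here is a minimal sketch in Python. It assumes, purely for illustration, that the "filter" can be modelled as a symmetric miscounting matrix (a doubly stochastic channel); the counts and probabilities are invented, not taken from any source. Under that assumption the perceived distribution is always at least as uncertain as the true one, so the filter can only add entropy.

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a probability vector."""
    return sum(-p * math.log2(p) for p in dist if p > 0)

# Hypothetical "true" distribution over how many parts a scene contains (1..4 parts).
true_counts = [0.80, 0.10, 0.05, 0.05]

# A symmetric (doubly stochastic) "filter": with some probability the observer
# miscounts, reporting a neighbouring value. Rows: true count, columns: reported count.
noise = 0.15
filter_matrix = [
    [1 - noise, noise,       0.0,         0.0],
    [noise,     1 - 2*noise, noise,       0.0],
    [0.0,       noise,       1 - 2*noise, noise],
    [0.0,       0.0,         noise,       1 - noise],
]

# Distribution of counts as perceived through the filter.
perceived = [sum(true_counts[i] * filter_matrix[i][j] for i in range(4))
             for j in range(4)]

# For a symmetric filter of this kind, the perceived entropy can only grow.
print(entropy(true_counts))  # ~1.02 bits
print(entropy(perceived))    # ~1.28 bits: the filter has added uncertainty
```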
Shannon defined information taking both aspects into consideration: a message is not something in itself; it depends on who receives it. The recipient of information has expectations, and Shannon's information incorporates those expectations in an intrinsic way.
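A small sketch of this point, using Shannon's self-information -log2 p(x): the same message carries a different number of bits depending on the probability the receiver had assigned to it beforehand. The two receivers and their probabilities below are hypothetical, chosen only to show the contrast.

```python
import math

def surprisal(p):
    """Shannon self-information (in bits) of an event the receiver assigns probability p."""
    return -math.log2(p)

def entropy(dist):
    """Expected surprisal (Shannon entropy, in bits) over a distribution of messages."""
    return sum(-p * math.log2(p) for p in dist.values() if p > 0)

# Two hypothetical receivers with different expectations about a four-symbol source.
confident_receiver = {"a": 0.7, "b": 0.1, "c": 0.1, "d": 0.1}
uninformed_receiver = {"a": 0.25, "b": 0.25, "c": 0.25, "d": 0.25}

# The same message "a" carries different amounts of information for each receiver.
print(surprisal(confident_receiver["a"]))   # ~0.51 bits: "a" was largely expected
print(surprisal(uninformed_receiver["a"]))  # 2.00 bits: "a" was one of four equally likely

# On average, the uninformed receiver gains more information per message.
print(entropy(confident_receiver))   # ~1.36 bits
print(entropy(uninformed_receiver))  # 2.00 bits
```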
Now Quantum Information. The wave function of the Universe counts all the ways the Universe could have been when we ask a question. Both the question and the Universe define the probabilities we assign, say, to the Inflationary Universe; according to Hawking they are much bigger than zero. If I understand what Hawking is saying correctly, Alan Guth has to be given a Nobel Prize.