From every source I have seen, the proof that the entropy of the Universe cannot decrease is only statistical: an ordered arrangement is merely one of the many ways things can be (with the sole exception of entropy defined via energy/temperature, where the rule is clear). So my first question is: does the rule that entropy always increases hold for anything other than entropy defined as the balance of energy in the Universe? The only way out, I think, is to define information as a physical quantity; then we would know by how much the entropy increased.
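To make the "statistical" part concrete, here is a toy sketch (my own illustration, assuming a simple two-state coin-flip model of particles) of why ordered macrostates are overwhelmingly outnumbered, using Boltzmann's S = k_B ln W:

```python
from math import comb, log

# Boltzmann: S = k_B * ln(W), where W is the number of microstates
# realizing a given macrostate.
# Toy model: 100 two-state "particles" (coins). The fully ordered
# macrostate (all heads) has W = 1; the balanced 50/50 macrostate
# has W = C(100, 50) ~ 1e29.
k_B = 1.380649e-23  # Boltzmann constant, J/K

W_ordered = comb(100, 0)  # exactly 1 microstate
W_mixed = comb(100, 50)   # about 1.0e29 microstates

# The balanced macrostate outnumbers the ordered one ~10^29 : 1,
# which is why a random evolution almost never finds the ordered state.
print(W_mixed // W_ordered)
print(k_B * log(W_mixed))  # entropy of the balanced macrostate, in J/K
```

Nothing forbids all 100 coins landing heads; it is just one configuration out of 2^100, which is the statistical content of the second law.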
I have read this answer, which defines information as the minimum number of YES/NO questions you have to ask in order to specify the object carrying the information completely. But this would mean that everything (including every subset or superset, which is impossible, as the picture shows) carries the same amount of information. For example, if the only describable physical quantities were position and weight, my single question about anything could be: "Is it true that it is there and weighs that much?"
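For reference, the yes/no-question definition is just Shannon information measured in bits: each question at best halves the set of remaining possibilities, so distinguishing one of N equally likely configurations takes ceil(log2(N)) questions. A minimal sketch (my own, not from the linked answer):

```python
import math

def bits_to_specify(num_states: int) -> int:
    """Minimum number of yes/no questions needed to single out one of
    `num_states` equally likely configurations (binary search)."""
    return math.ceil(math.log2(num_states))

# Each question halves the remaining possibilities:
print(bits_to_specify(8))     # 8 states -> 3 questions
print(bits_to_specify(1000))  # 1000 states -> 10 questions
```

Note that the count depends on the set of alternatives you allow, which is exactly where the "one question specifies everything" objection comes from: if the state space is taken to have one element, zero questions suffice.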
Also, following this definition of information, it would be subjective what has more entropy: if I order my books alphabetically, have I increased the entropy more through the change in the energy balance of the room than I decreased it by the ordering?
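One way people try to make "orderedness" observer-independent is algorithmic (Kolmogorov) complexity: a sorted shelf has a short description, a shuffled one does not. A rough, hypothetical proxy for this (my own sketch, approximating description length by zlib-compressed size):

```python
import random
import zlib

# Hypothetical bookshelf: the same 200 titles, sorted vs shuffled.
books = [f"book{i:03d}" for i in range(200)]
shuffled = books[:]
random.seed(0)  # fixed seed so the comparison is reproducible
random.shuffle(shuffled)

sorted_size = len(zlib.compress("".join(books).encode()))
shuffled_size = len(zlib.compress("".join(shuffled).encode()))

# The sorted shelf is highly regular, so it compresses to far fewer
# bytes than the shuffled one: a crude observer-independent measure
# of how much "order" the alphabetizing created.
print(sorted_size, shuffled_size)
```

This does not settle which physical entropy change dominates, but it shows the informational side of the question need not be purely subjective.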
So how should information be defined correctly? (By the way, this blew my mind: if the particles in a system had no spin, polarisation, or local imbalance (say, an electron with more charge on one side), I would have no idea how to describe their positions in an otherwise empty universe other than: "It's here.")