> Information is neither a substance nor a property of some substance.
> It is soft code...
> Essentially, information is a measure of a system’s randomness, as we will discuss in
> more detail later.
I completely disagree with such a reversed, twisted definition! The straightforward definition is
exactly the opposite: information is a measure of the organization of a system. In other words, the more information we have about any particular system, the better it is known...
> Therefore, these assertions that information can exist in some abstract way independent
> of a material medium or that it is conserved (like energy in physics) seem to be dubious
> propositions.
No, it isn't! Conservation of information in terms of energy transfer is one of the hottest topics of discussion and debate in many advanced circles. Even the simple fact that you are desperately trying to stick with certain definitions suggests that I am correct; otherwise, the fact that I may seem more "random" (less precise) than you would strongly suggest, by your own definitions, that my information content is higher. See, Bernard, you
cannot have it both ways!
> Information can be unearthed, identified, sent, transmitted, or received, and generally
> handled in whichever way, only if it is recorded in the structure of a material medium
> (including electromagnetic waves)...
...but this is exactly the key! Since it is "encoded" in the EM waves themselves, it is never lost as long as the waves (that is, the Universe!) are there...
> Information has no relation either to the semantic contents of the message or to the
> particular appearance of the symbols used to record it. It is essential for our
> discussion to note that the more random the transmitted text, the larger the amount of
> information it carries...
I completely disagree with that! Again, if that were the case, we couldn't even be communicating here over the Internet! Sending random strings of information back and forth would NOT do it!!!
> p. 66. [message will always mean the meaningful contents of information, as distinct
> from information as defined in information theory]. One of the measures of information,
> according to information theory, is a quantity named entropy.
Well, some will prefer to continue to see "entropy" as disorder rather than as energy transfer. I cannot change that mindset via any possible argument! It is like the argument over whether the glass is half empty or half full. Both positions are correct at the precise moment of looking at the glass, but if the glass is getting less and less water, or more and more water, after that initial observation, then only
one of the initial observations was correct. This is an essential concept...
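For reference, the entropy being discussed here is Shannon's measure, H = -sum(p_i * log2(p_i)) over the symbol probabilities p_i. A minimal Python sketch (my own illustration, not code from the book) that estimates it from a string's character frequencies:

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Estimate the specific entropy of `text` in bits per
    character, using the observed character frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```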
> To all intents and purposes, it behaves like its namesake in thermodynamics. The entropy
> of a text quantitatively characterizes the level of disorder in that text.
Again, I see the exact opposite...
> The total entropy of a text as a whole is proportional to the text’s length and is
> therefore an extensive quantity. A more interesting quantity is the specific entropy,
> which is the entropy of a unit of text, and therefore is an intensive quantity.
Well, if the unit is binary, then there are only two states! A one or a zero. Nothing can be more precise than that 'bit' unit of text. So, what is the entropy of that? A byte, then, has 8 bits, and so on...
> Usually it is expressed as entropy per character and measured in bits per character.
That depends on the base of the system. Binary systems are the simplest...
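To make the base dependence concrete: an alphabet of N equally likely symbols carries at most log2(N) bits per symbol, so binary tops out at exactly 1 bit, while the 27-symbol alphabet discussed below tops out near the 4.76 figure quoted. A quick check in Python:

```python
import math

# Maximum entropy per symbol for an N-symbol alphabet is log2(N).
for n in (2, 27):
    print(f"{n}-symbol alphabet: at most {math.log2(n):.3f} bits/symbol")
# 2-symbol alphabet: at most 1.000 bits/symbol
# 27-symbol alphabet: at most 4.755 bits/symbol
```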
> In the following discussion, unless indicated otherwise, the term entropy will mean the
> specific entropy. There exists a hierarchy of texts in regard to their entropy.
> For example, consider a string of the same letter (like A) repeated, say, a million
> times: AAAAAAA... etc. This meaningless text is perfectly ordered.
Academic argument!
> The entropy of the text is practically zero. Now consider a text obtained, for example,
> by what we call the urn technique [completely random]. Let a text in an "urn
> language" be, say, a million letters long. This string is almost always gibberish
> (there is some extremely small probability that a string of an urn language happens
> to be a piece of a meaningful message).
> If, as is overwhelmingly the case, this string is gibberish, in an overwhelming majority
> of situations there is no or very little order in that string.
Obviously, you are trying to ignore or deny the structure of complex infinite strings like phi, pi, etc...
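For concreteness, the urn procedure quoted above is easy to reproduce. A minimal sketch, assuming the 27-symbol alphabet of 26 letters plus space (again my own illustration, not the book's code):

```python
import random
import string

ALPHABET = string.ascii_uppercase + " "  # 26 letters plus space

def urn_text(length: int = 1_000_000) -> str:
    """Draw each character independently and uniformly at random,
    as in the quoted urn procedure."""
    return "".join(random.choices(ALPHABET, k=length))
```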
> We call it a random string. The entropy of that meaningless random string is large,
> and so is the information carried by that string. Meaningful texts are located
> somewhere in the middle of the entropy scale, their entropy being much larger than in
> perfectly ordered texts of very low entropy (like AAAAA...) but much smaller than
> in the meaningless random texts. Here are some typical numbers. The entropy of a normal
> meaningful text in English (as was estimated already by Shannon) is about 1 bit per
> character. On the other hand, the entropy of a text written in urn language, that is
> the entropy of a randomized sequence of 27 symbols (26 letters plus space), may
> be as high as 4.76 bits per character.
Well, the signal (information/data) to noise ratio is much more of an interference and power issue than a low or high entropy issue! But even if one looks at it as an entropy issue, I still disagree with your entire premise, since I view all of this as the exact opposite of what you are suggesting...
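For what it's worth, the quoted figures themselves check out numerically. Using the two sketches from earlier in this post:

```python
print(shannon_entropy("A" * 1_000_000))  # ~0.0 bits/char: perfectly ordered
print(shannon_entropy(urn_text()))       # ~4.75 bits/char: close to log2(27)
```

Note that Shannon's roughly 1 bit per character for English accounts for dependencies between characters; a raw per-character frequency count like the one above gives closer to 4 bits per character for English text.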
Bringing order out of chaos is always where the essence is...
-wirelessguru1
The Invisible Universe