If you go to the internet and google "Shannon" + "communication theory"-- Shannon being the father of modern information theory AND the theory Dembski claims to be using for ID-- you get things like this from (http://)www.exploratorium.edu/complexity/CompLexicon/complexity.html
"Complexity", as a character of natural processes, has two distinct and almost opposite meanings.
The first, and probably the oldest mathematically, goes back to Andrei Kolmogorov's attempt to give an algorithmic foundation to notions of randomness and probability and to Claude Shannon's study of communication channels via his notion of information. In both cases, complexity is synonymous with disorder and the lack of structure. The more random a process is, the more complex it is. An ideal gas, with the molecules bouncing around in complete disarray, is complex as far as Kolmogorov and Shannon are concerned. Thus, this sense of "complexity" refers to degrees of complication.
The second sense of "complexity" refers instead to how structured, intricate, hierarchical, and sophisticated a natural process is. That is, in this sense, "complexity" is an indicator of how many layers of order or how many internal symmetries are embedded in a process. The human brain is complex in this sense due to the high degree of structure in its neural architecture, in the many different scales of information processing from perception to interpretation of stimuli, and in the intricate social behaviors it supports in human groups.
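To make the first sense concrete (my own illustration, not the Exploratorium's): Shannon's measure can be computed directly from a symbol distribution using the standard entropy formula. The particular strings below are just made-up examples.

```python
import math
import random
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy, in bits per symbol, of the character distribution of s."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A highly ordered string: only two symbols.
ordered = "ab" * 500

# A disordered string drawn uniformly from 16 symbols (seeded for repeatability).
random.seed(0)
disordered = "".join(random.choice("abcdefghijklmnop") for _ in range(1000))

print(shannon_entropy(ordered))     # exactly 1.0 bit/symbol
print(shannon_entropy(disordered))  # close to the 4-bit maximum for 16 symbols
```

The disordered string carries more Shannon information per symbol than the ordered one-- which is exactly the sense in which "more random" means "more complex" here.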
%%%%
Since Kolmogorov was brought up here, I'll provide Perakh's take on Dembski's misuse of the theory:
pp. 60- As mentioned in the preceding section, probability theory, with all of its variations and competing concepts, seems to be much better substantiated than Dembski’s quasi-rigorous exercise. There is an analogous situation in the case of complexity as well. Although complexity theory is not yet developed to the same extent as probability theory, it has to its credit a number of substantial achievements. In particular, a theory of complexity was developed in the sixties under the name of Algorithmic Theory of Probability/Randomness (ATP). Although some facets of this theory have not yet been fully clarified, it is powerful and largely consistent. . . Dembski discusses this theory in The Design Inference (pp. 167-169), but seems to ignore it when approaching complexity from his own standpoint.
Algorithmic Theory of Probability/Randomness defines complexity (often referred to as Kolmogorov complexity) in connection with the concept of randomness. This theory is often discussed in terms of strings of binary digits, but it is actually applicable to a wide variety of situations. In ATP complexity is assigned a number. Kolmogorov complexity depends on the degree of randomness of a system. The more random the system, the longer in binary representation becomes the algorithm (or program) which describes the system. The fully random system cannot be described by an algorithm (or program) which is shorter than the system itself. In other words, a fully random system and the algorithm that describes it are almost identical. The less random a system is (i.e., the more its structure is determined by a rule) the shorter the algorithm (or program) describing that system can be. Consequently, ATP defines complexity of a system as the minimal size of an algorithm (or program) which can “describe” the system. As can be seen, unlike Dembski’s exercise in complexity, ATP does not relate that concept to the difficulty of solving the problem.
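To put the ATP definition in operational terms (my illustration, not Perakh's): true Kolmogorov complexity is uncomputable, but any real compressor gives an upper bound on it, since the compressed size is the length of one particular "program" that regenerates the string. A rule-determined string compresses to almost nothing; a random one barely compresses at all. A quick Python sketch:

```python
import random
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length of the zlib-compressed form: a crude upper bound on Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

# Fully rule-determined: "repeat the byte '0' ten thousand times" is a short program.
structured = b"0" * 10_000

# Effectively random bytes (seeded for repeatability): no much shorter description exists.
random.seed(0)
disordered = bytes(random.getrandbits(8) for _ in range(10_000))

print(complexity_upper_bound(structured))   # a few dozen bytes
print(complexity_upper_bound(disordered))   # roughly the full 10,000 bytes
```

Note that nothing here depends on how hard any "problem" was to solve-- the measure is a property of the system's description alone, which is Perakh's point.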
.. .
[many classes of events for which Dembski’s scheme is not only inadequate, but in which the relation between complexity, difficulty, and probability is opposite to that assumed by Dembski]. . . First there are situations in which we are not interested at all in solving any problem but may be quite interested in estimating the complexity of a system. In such situations Dembski’s definition of complexity (which equates it with difficulty of solving a problem) is irrelevant. Such situations are common in psychology, geography, economics, crystallography, and in many areas of knowledge. . . [example- electrochemically depositing a layer of molybdenum, which is very difficult to do with pure molybdenum. However, with 98% molybdenum, the rest being metallic nickel and a small percentage of hydrogen, “Such an alloy can be deposited under a rather wide range of conditions and its properties are reasonably close to those of pure molybdenum, thus solving the problem at hand. However, the alloy in question has a much more complex structure than a pure molybdenum layer has. It contains a conglomerate of various phases, such as NiMo, at least three phases of Ni, hydrides of both Ni and Mo, etc. In this case the difficulty of solving the problem at hand in no way matches the complexity of the system. . . For example, a steep slope covered by a smooth layer of ice has a much simpler relief than a rocky slope of a very complex shape. However, the difficulty of climbing over the smooth and steep ice may substantially exceed the difficulty of climbing over a rocky slope (which may actually be quite easy if the rocks, as is often the case, provide multiple cracks and ledges that immensely facilitate the climber’s ascent). In computer programming, the most efficient program (or algorithm) is the one that requires the minimal number of steps. Such an algorithm is defined in computer science as having the minimal complexity.
However, there is no parallelism between the complexity in the above sense and the complexity of the algorithm’s structure. Often the computational algorithm that is the most efficient (i.e. has the minimal complexity in the sense of computer science) is much more complex in its structure than the less-efficient algorithms.] The situations in the above examples, while contradicting Dembski’s scheme, are fully compatible with the concept of complexity in ATP.
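Perakh's last point-- that the most step-efficient algorithm is often the most structurally intricate-- is easy to see side by side. A Python sketch of my own (not Perakh's example): naive exponentiation has a trivially simple structure but needs n multiplications, while square-and-multiply needs only about log2(n) multiplications at the cost of more complicated control flow.

```python
def pow_naive(base: int, exp: int) -> int:
    """Structurally trivial: one loop, exp multiplications."""
    result = 1
    for _ in range(exp):
        result *= base
    return result

def pow_fast(base: int, exp: int) -> int:
    """Square-and-multiply: ~log2(exp) squarings, but branching and bit manipulation."""
    result = 1
    while exp > 0:
        if exp & 1:           # fold in the current power only when the bit is set
            result *= base
        base *= base          # square for the next binary digit of exp
        exp >>= 1
    return result

assert pow_naive(3, 20) == pow_fast(3, 20) == 3 ** 20
```

The "efficient" version performs far fewer steps, yet its text is the more complicated of the two: efficiency-complexity and structural complexity pull in opposite directions, just as in Perakh's examples.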
%%%
Now WG, please provide a QUOTE (not your fevered imagination's idea) from an authoritative source that disproves what I have posted about the definition of "information" and "complexity" according to Shannon and Kolmogorov. Remember, the discussion is whether Dembski (what you think is not relevant) was misusing Shannon's and Kolmogorov's definitions, which are accepted by the field of information theory.
Bernard