From Cybernetics & Human Knowing, vol. 1, no. 4, 1993

The Controversy over the Concept of Information.

An Overview and a Selected and Annotated Bibliography

Lars Qvortrup

Centre for Cultural Studies, Odense University, Denmark


In the present article, I will summarize the current controversy over the concept of information by reconstructing the discussion, beginning in the 1940s, of this seemingly innocent but significant word. It seems innocent, because everybody talks about information, information technology and an information society. It is, however, significant, because its definition implies a whole theory of knowledge. My main aim is to reconstruct the implicit structure of this discussion by identifying a number of characteristic definitions of information, thus providing an overview of the field while trying to avoid simplistic notions of the concept of information. The paper is intended as a tool for continuing the discussion; therefore, a selected and annotated bibliography is added as the final section. (1)


Information - A Tricky Concept

What, then, is the problem regarding the concept of information? What makes it so tricky? The basic problem seems to be that since the 1940s it has aimed at becoming a natural science concept, an objective thing, a substance, a "Ding an sich". But every time it has come close to becoming a decent, objective concept, it has been caught up by its fate: information is a concept which implies a human subject. Information isn't just information in itself; it only becomes information when it is information to somebody, i.e. as a mental construction.

Thus, there are many potential definitions of information. At one end, information may be defined as a thing. At the other end, information may be defined as a psychic construction. Within this continuum, logically speaking, it should be possible to identify four concepts of information:

Firstly, information may be defined as something (a thing or a substance) existing in the external world like heat, electricity, etc. Thus, information may be defined as something identifiable in the real world, i.e. as a difference in reality.


Secondly, information may be defined as something in the external world which causes a change in the psychic system. Here, information may be defined as a difference which makes a difference, i.e. a difference in reality which causes a mental difference.

Thirdly, information may be defined as a change in the psychic system which has been stimulated by a change in the external world. Compared with the second definition, the logical order of the external world and the psychic system has been reversed, and here information may be defined as a difference which finds a difference, i.e. a conceptual difference which finds, or which is confirmed (or triggered) by, something in the outer world.

Fourthly, information may be defined as something only in the human mind, a concept or an idea. Here, again, information may be defined as a difference, now however as a cognitive difference which brings forth (an idea about) an external world.

Even though the discussion about the definition of information is relatively new, the inherent dilemma is as old as the discussion of human understanding.

In our century, the discussion reappeared during the 1940s, and since then all four of the above definitions have been articulated and discussed. In the sections to follow I will summarize some characteristic examples and list typical literature.

Information as a Difference (in the External World)

Information and Observer-Dependency: An Inherent Contradiction in Shannon's Theory

The logically first position in the controversy over the concept of information is to define information as difference. The fact that something is different from something else makes this something information. If everything were equal, no difference could be observed. A blackboard with nothing on it contains no information. But as soon as a mark appears, information appears. A chalk mark on the blackboard is information. A bright spot in the night sky is taken as information and given a name: a star.

What does this concept of information imply? It implies that information is some sort of substance, or at least "something" existing in the material world, independently of an observer.

It seems that a spontaneous image of information is an image of a flowing substance: "something" which can be sent in pipelines from one place to another. This metaphor constituted the basis of the communication model which Claude E. Shannon presented in 1948 and which was later published in the famous book The Mathematical Theory of Communication together with Warren Weaver (Shannon and Weaver 1949). The metaphorical image is that information is something which is sent from an information source through a channel or pipeline to a receiver: "A basic idea in communication theory is that information can be treated very much like a physical quantity such as mass or energy", Shannon later wrote, when he was asked to summarize his theory in Encyclopedia Britannica (Shannon 1972 p. 246B).

Shannon's argument is that if information is similar to a physical quantity, then a measure can be set up comparing the rate at which information is produced, R, by a given information source, and the rate at which it can be transmitted through a channel, C. "By a suitable coding or modulation system, the information can be transmitted over the channel if and only if the rate of production R is not greater than the capacity C." (ibid. p. 247) However, in order to set up this measure, the "nature" of information must be considered. "The significant aspect of information from the transmission standpoint is the fact that one particular message is chosen from a set of possible messages. What must be transmitted is a specification of the particular message which was chosen by the information source. The original message can be reconstructed at the receiving point only if such an unambiguous specification is transmitted. Thus in information theory, information is thought of as a choice of one message from a set of possible messages." (Ibid.)

Thus, actually two conflicting metaphors are being used: The well-known metaphor of information as a quantity, like water in the water pipe, is at work, but so is a second metaphor, that of information as choice - a choice made by an information provider, and a forced choice made by an information receiver. Actually, the second metaphor implies that the information sent isn't necessarily equal to the information received, because any choice implies a comparison with a list of possibilities, i.e. a list of possible meanings. Here, meaning is involved, thus spoiling the idea of information as a pure "Ding an sich". Thus, much of the confusion regarding the concept of information seems to be related to the basic confusion of metaphors in Shannon's theory: is information an autonomous quantity, or is information always per se information to an observer? Actually, I don't think that Shannon himself chose between the two definitions. Logically speaking, his theory implied that information is a subjective phenomenon. But this had such wide-ranging epistemological implications that Shannon does not seem to have fully realized this logical fact. Consequently, he continued to use metaphors about information as if it were an objective substance. This is the basic, inherent contradiction in Shannon's information theory. 2
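
To make the "information as choice" reading concrete, the following minimal sketch (my own illustration, not Shannon's notation; the function names are invented for the purpose) computes the amount of information associated with choosing one message from a finite set, first when all messages are equally probable and then for a weighted set of possibilities.

```python
import math

def information_equiprobable(n_messages):
    """Bits needed to specify one choice among n equally probable messages."""
    return math.log2(n_messages)

def source_entropy(probabilities):
    """Average information in bits of a source choosing messages with the
    given probabilities: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(information_equiprobable(26))        # one of 26 letters: ~4.70 bits
print(source_entropy([0.9, 0.05, 0.05]))   # a predictable source: ~0.57 bits
print(source_entropy([1/3, 1/3, 1/3]))     # a maximally uncertain source: ~1.58 bits
```

In Shannon's terms, the rate R at which such a source produces information can then be compared with the channel capacity C, transmission being possible as long as R does not exceed C.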

Acquisition of information

Now, if information is regarded as a substance of some kind, it seems natural to conclude that information is closely related to energy. Either because it costs energy to acquire information, or because it costs energy to throw away - to erase - information. Both views can be found in the scientific literature.

Shannon's theory built on earlier work by, among others, Leo Szilard (1929) and R.V.L. Hartley (1928). 3 In 1929, in a discussion of Maxwell's demon, Szilard emphasized that it costs energy to measure the position of molecules. This again clearly implies a materialistic concept of information as something which takes energy to acquire. However, the point did not become widely appreciated until it was repeated by Leon Brillouin in his Science and Information Theory (1956).

Information Erasure

The basic argument in the "information acquisition" position is that it costs energy to acquire information. The implicit assumption is that our aim is to collect as much information as possible.

But the contrary standpoint might be true as well. The problem isn't to acquire information, but to throw away unnecessary information. In his book, Tor Norretranders (1991) uses an example from a supermarket. At the cash register all prices are registered and added. Here, a basic paradox of information theory becomes obvious. According to the theory there is more information in the long list of prices than in the final result. Still, however, we are interested in the result, not in the individual prices. Consequently, the problem of information isn't to collect as much information as possible, but to get rid of information, to erase information, the reason being that information is defined by the observer, here e.g. the customer in the supermarket.

Thus, information is, as Shannon said, closely related to entropy. However, the problem is not to reconstruct the full message, but to throw away unnecessary information. Much information is equal to complexity, chaos, while "reduced" information, complexity reduction, is equal to order. Thus, we create order by erasing information, or by reducing complexity.
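
Norretranders' cash-register example can be illustrated with a small sketch of my own (the prices are invented): summing a list of prices is a many-to-one operation, so the total alone no longer determines the individual items - information has been erased, and that is exactly what the customer wants.

```python
# Two different baskets of goods...
basket_a = [12.50, 3.75, 8.00]
basket_b = [10.00, 10.00, 4.25]

total_a = sum(basket_a)
total_b = sum(basket_b)

# ...produce the same total, so the mapping "price list -> sum" is not
# invertible: knowing only the total, nobody can reconstruct the prices.
print(total_a, total_b)      # 24.25 24.25
print(total_a == total_b)    # True: the summation has erased information
```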


This idea that information costs energy - not because it must be collected, but because it implies an erasure of information - was introduced by Rolf Landauer in 1961 (Landauer 1961; cf. Leff and Rex 1991, p. 13 and p. 21). In 1973, C.H. Bennett summarized and developed the findings: "The usual digital computer program frequently performs operations that seem to throw away information about the computer's history, leaving the machine in a state whose immediate predecessor is ambiguous." (Bennett 1973, p. 525) Bennett (1988) later summarized the historical development of the discussion.
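
Landauer's point can be put into numbers. The sketch below is my own illustration, using the standard value of Boltzmann's constant: it computes the minimum energy dissipation k·T·ln 2 which Landauer's argument associates with the erasure of a single bit at a given temperature.

```python
import math

BOLTZMANN_CONSTANT = 1.380649e-23  # joules per kelvin

def landauer_limit(temperature_kelvin):
    """Minimum energy in joules dissipated when one bit is erased,
    according to Landauer's principle: E = k * T * ln(2)."""
    return BOLTZMANN_CONSTANT * temperature_kelvin * math.log(2)

print(landauer_limit(300))   # at room temperature: about 2.9e-21 joules per bit
```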

As can be seen, Landauer and Bennett are on the right track: information must be erased, the implicit reason being that information is information to somebody, an observer. Still, however, their metaphors are misleading, because they continue to regard information as a thing or a substance. Consequently, they don't see the contradiction as an inherent contradiction, but as a contradiction between two closed but interrelated systems: the information system and the energy system. Only because information is defined as a material thing does it demand energy in order to be collected or "destructed", "erased", "thrown away", as Bennett puts it.

Information - the Inevitability of the Observer

Information and Entropy

According to Shannon, information is a choice among a finite set of known possibilities. The amount of information is equivalent to the surprise value of the message. However, surprise value is constituted by a number of factors. The first is the total number of possibilities. The letter A, for example, represents a certain amount of information, because it has been chosen as one of 26 potential letters. If the alphabet had fewer than 26 letters, A would represent a smaller amount of information. A second aspect is probability. E represents less information than J, because E is used more often than J, thus being less surprising than J. A third aspect is context. Unless the combination of letters is totally random, probability depends on context. For example, between two consonants there is a high probability of finding a vowel. Thus, from the context we know that here A isn't one of 26, but one of 6 possibilities, i.e. one of the vowels.
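
These three factors can be made explicit in a small sketch of my own (the letter probabilities are rough illustrative figures, not measured frequencies): the amount of information carried by a symbol is -log2 of its probability, so a rare letter like J carries more bits than a common one like E, and narrowing the context from 26 letters to 6 vowels lowers the amount of information accordingly.

```python
import math

def self_information(probability):
    """Information in bits carried by an outcome with the given probability."""
    return -math.log2(probability)

# Factors 1 and 3: the size of the set of possibilities, with and without context.
print(self_information(1 / 26))   # one of 26 letters: ~4.70 bits
print(self_information(1 / 6))    # one of 6 vowels, context known: ~2.58 bits

# Factor 2: probability (illustrative values only).
print(self_information(0.127))    # a common letter such as E: ~2.98 bits
print(self_information(0.002))    # a rare letter such as J: ~8.97 bits
```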

From this it seems that there is a basic difference between the two concepts of information. The numbers on the sales ticket were independent of context, while the signs in a telephone conversation or the letters in a book are context dependent. Still, however, they both relate to the same basic idea: the less the order, the more the information. This explains why, according to the mathematical theory of information, the sales ticket contains more information than the letters in the book. In both cases information is related to entropy, and the less order, the more information is provided. "The formula for the amount of information is identical in form with equations representing entropy in statistical mechanics, and suggests that there may be deep-lying connections between thermodynamics and information theory." (Shannon 1972 p. 247)
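
The formal identity Shannon alludes to can be written out; the two standard textbook expressions below are my own addition, not quoted from his article. They have the same structure and differ only in the constant and in the base of the logarithm:

```latex
H = -\sum_{i} p_i \log_2 p_i    % Shannon's measure of information (in bits)
S = -k_B \sum_{i} p_i \ln p_i   % Gibbs entropy in statistical mechanics
```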

Information and meaning

Here, the inherent contradiction becomes evident. According to Shannon, in his theory the concept of information has nothing to do with meaning. "The signals or messages (in his communication model, LQ) need not be meaningful in any ordinary sense." (Shannon 1972 p. 246B) This position has been widely accepted, for example by Umberto Eco. In his book Semiotics and the Philosophy of Language he states that "...the true concern of the information theorist is not the correlation between signals (as if they were expressions of something) and their correlated content. The specific concern of the theory is the most economic way of sending a message so that it does not produce ambiguity." Thus, "...the real problem of the theory is the internal syntax of the system of 1's and 0's, not the fact that the strings generated by this syntax can be associated to another sequence (for instance of alphabetic letters) so to correlate them (as expressions) to a 'meaning'." (Eco 1986 p. 169)

On the other hand Shannon says that information is related to probability and redundancy. However, meaning and probability are related concepts. Firstly, probability is probability to somebody. Thus, an interpretant is implied. Secondly, probability is probability in relation to a set of expectations, that is a semantics - a system of meanings.

To be a bit more specific, one could say that probability is related to expectation. If the amount of information related to A is small, the explanation is that the receiver expected an A. If, on the contrary, I didn't expect an A, the amount of information in its appearance is large. Actually, information isn't just information, but is information in relation to a specific expectation. By elaborating on the mathematical definition of information we realize that information cannot be defined as "something" in the external world. Information is "something" in the outer world only insofar as it is related to an expectation, i.e. to something in the human mind. Here, we are very close to Bateson's definition of information as a difference which makes a difference.

Actually, Bateson has presented the same argument in his critique of Shannon's theory of information. According to Bateson, engineers and mathematicians have tried to avoid the problem regarding meaning: "The engineers and mathematicians have concentrated their attention rigorously upon the internal structure of message material." By doing this, "...the engineers believe that they can avoid the complexities and difficulties introduced into communication theory by the concept of 'Meaning'." However, this is fruitless. When Shannon says that information is inversely proportional to probability, then information is related to meaning, as meaning is a certain way of reducing complexity, of establishing a pattern, a system. 4 Bateson writes: "I would argue (...) that the concept 'redundancy' is at least a partial synonym of 'meaning'. As I see it, if the receiver can guess at missing parts of the message, then those parts which are received must, in fact, carry a meaning which refers to the missing parts and is information about those parts." (Bateson 1972 p. 413 and p. 414) 5

Information and negentropy

In his book Cybernetics from 1948, Wiener asserts that he and Shannon developed the same theory of information independently of each other. Wiener writes that "...we had to develop a statistical theory of the amount of information, in which the unit of information was that transmitted as a single decision between equally probable alternatives. This idea occurred at about the same time to several writers, among them the statistician R.A. Fisher, Dr. Shannon at the Bell Telephone Laboratories, and the author." In reality, however, Wiener's theory of information is not the same as, but the opposite of, Shannon's theory. While to Shannon information is inversely proportional to probability, to Wiener it is directly proportional to probability. To Shannon, information and order are opposed; to Wiener, they are closely related. "Just as the amount of information in a system is a measure of its degree of organization, so the entropy of a system is a measure of its degree of disorganization; and the one is simply the negative of the other." (Norbert Wiener 1961 (first edition 1948), p. 10 and p. 11) On p. 64 in the same book the point is made a bit more formally: "...amount of information, being the negative logarithm of a quantity which we may consider as a probability, is essentially a negative entropy."

In his more popular presentation of cybernetics in The Human Use of Human Beings his definition is the same: "Amount of information is a measure of the degree of order which is peculiarly associated with those patterns which are distributed as messages in time." (Wiener 1950 p. 21) A few sentences later he says that just as the amount of information of a system is a measure of its degree of organization, so the entropy of a system is a measure of its degree of disorganization.

This concept of information is the one which has been carried on by Leon Brillouin, and is directly opposite to the concept of information in Shannon's theory. 6 Where Shannon defines information as chaos, entropy, Wiener and Brillouin define information as an equivalent to order, i.e. negentropy.
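
The opposition can be stated compactly; the following rendering is my own summary in standard notation, not a quotation from either author. Both start from the same statistical expression, but Shannon identifies the amount of information with the entropy-like quantity itself, while Wiener and Brillouin identify it with the negative of the entropy, the negentropy:

```latex
H = -\sum_{i} p_i \log p_i ,
\qquad I_{\mathrm{Shannon}} = H ,
\qquad I_{\mathrm{Wiener/Brillouin}} = -S \;(\text{negentropy}) .
```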

Recently, the theory of information as a material reality has been reintroduced by the English biologist Tom Stonier from Bradford University. According to Stonier, information exists as a material reality independently of human perception: "Information exists. It does not need to be perceived to exist. It does not need to be understood to exist. It requires no intelligence to interpret it. It does not have to have meaning to exist. It exists." (Stonier 1990 p. 21)

The idea that information is something real, a difference in the outer world, is similar to Szilard's, Hartley's and Shannon's ideas of information. However, his definition of information is the opposite of Shannon's. Rather, it is related to Norbert Wiener's cybernetically based concept, even though Stonier doesn't refer to Wiener. Stonier defines information as the ability to organize a system or to maintain a system's organization. "Information is defined as the capacity to organise a system - or to maintain it in an organised state." (Ibid. p. 26) More specifically the definition goes as follows: "Information is an inverse exponential function of entropy. (...) As entropy S increases, information I decreases. As entropy approaches infinity, information approaches zero." (Ibid. p. 38 and p. 57)
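
Stonier does not spell the relation out in the passages quoted here, but one reading consistent with his phrase "an inverse exponential function of entropy" would be the following, where the reference value I0 and the constant c are my own placeholders:

```latex
I(S) = I_0 \, e^{-S/c} , \qquad \lim_{S \to \infty} I(S) = 0 .
```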

But why then distinguish between information and organization? Because information, according to Stonier, possesses qualities which cannot be found in organization. "Information is a quantity which may be altered from one form to another. Information is a quantity which may be transferred from one system to another." (Ibid. p. 26) Here information is related to energy, but different from more concrete concepts like order, organization, pattern or system. 7

Stonier is absolutely aware that his information theory is the opposite of Shannon's information theory. Still, however, to me the real challenge doesn't lie in the opposition between Shannon and Wiener, but in their shared problem that information is information only in relation (be it directly or inversely proportional) to an observer's idea of order, organization, etc. Here, Stonier represents a step back, because he doesn't elaborate on this important and potentially fruitful contradiction; instead he closes his eyes, insisting on the objectivity and autonomy of information.

Critique

It is often said that the main problem of Shannon's theory of information is that it has been widely misused. Among others, Heinz von Foerster (1980, p. 20-21) has emphasized that "...when we look more closely at these theories, it becomes transparently clear that they are not really concerned with information but rather with signals and the reliable transmission of signals over unreliable channels..."

But is Shannon innocent? Only partly. On the one hand, in the famous exposition of his theory (Shannon and Weaver 1949) he emphasizes that it has nothing to do with communication contents or semantics: "The fundamental problem of communication is that of reproducing at one point exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that they are selected from a set of possible meanings." (Shannon and Weaver 1969)

On the other hand in this very same section the inherent contradiction is evident. In one sentence Shannon says that the semantic aspects are irrelevant. In the next he says that the significant aspect is that the messages are selected from a set of possible meanings.

Thus, it is absolutely relevant to criticize Shannon's information theory, as Bateson did (cf. above). The problem is, however, that most criticism implies that Shannon's theory was only a substance-oriented information theory. Consequently, the criticism tends to reduce Shannon's theory instead of enriching it. For example, in his paper "Perception of the future and the future of perception" Heinz von Foerster observes that today both "information" and "knowledge" are "...persistently taken as commodities, that is as substance", while lectures, books, etc. should not be understood as containers of information, but as "vehicles for potential information". (von Foerster 1984, p. 193 and p. 194) Similar statements can be found on p. 216 (in the paper "Technology: What Will It Mean to Librarians" from 1970) and on p. 237 (in the paper "Thoughts and Notes on Cognition" from 1969), both related to the metaphors derived from the use of computers. In the latter paper, "information" is taken as an example of "pathological semantics": "This poor thing (information, LQ) is nowadays 'processed', 'stored', 'retrieved', 'compressed', 'chopped', etc., as if it were hamburger meat. Since the case story of this modern disease may easily fill an entire volume, I only shall pick on the so-called 'information storage and retrieval system' (...). Of course, these systems do not store information."

Even though this is an absolutely correct summary of most applied information theory, in relation to the original theory it is a reduction. Rather, we should try to complete the work which Shannon started in the 1940s, partly by developing an adequate set of information and communication metaphors. Here, Niklas Luhmann (cf. Luhmann 1984 p. 193f) has given us three good initial suggestions:

Firstly, the substance metaphor suggests that the sender gives away something which is received by the receiver. But the sender doesn't lose anything, not even a single bit, by sending information.

Secondly, it suggests that the information which has been sent is identical to the information received. Normally, this isn't true. What I wrote is not necessarily what you read. What you said isn't necessarily what I heard.

Thirdly, it suggests that communication is a two-step and thus a one-way process: the sender sends, and the receiver receives. Again, this isn't true. Just try to phone somebody who doesn't answer.

So, how do we change the metaphor in a productive way? In my opinion the concepts of autopoiesis and self-reference take us in the right direction.

The Theory of Autopoiesis and the Concept of Information

Autopoiesis

In the early 1970s the biologists Humberto Maturana and Francisco Varela carried through a scientific revolution which, among other things, deeply affected the notion of information.

Also according to Maturana and Varela, information is not a "thing" or a "substance" in an observed system. "Notions such as coding and transmission of information do not enter in the realization of a concrete autopoietic system because they do not refer to actual processes in it. (...) The notion of coding is a cognitive notion which represents the interactions of the observer, not a phenomenon operative in the observed domain." (Maturana and Varela 1980, p. 90; first published 1973)
