Information, information theory: A character or character combination carries information when it is clear to the recipient that it appears instead of another possible character or character combination. The set of possible characters partly determines the probability with which a character from this set occurs. In addition, the expected probability of a character's appearance can be raised by previously experienced regularities. The amount of information transmitted by a character depends on the improbability of its occurrence.
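The dependence of information on improbability is captured by Shannon's self-information (surprisal), I(x) = -log2 p(x). A minimal sketch in Python; the probabilities are illustrative assumptions:

```python
import math

def surprisal(p: float) -> float:
    """Shannon self-information in bits: the less probable
    a character, the more information it carries."""
    return -math.log2(p)

# A rare character carries more information than a common one.
print(surprisal(0.5))    # one of two equally likely characters -> 1.0 bit
print(surprisal(0.125))  # one of eight equally likely characters -> 3.0 bits
```

Since 0.5 and 0.125 are exact powers of two, the results here are exactly 1 and 3 bits.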
Information/Chalmers (drawing on Shannon 1948 (1)): information in this sense is not always about something; rather, it is a matter of selection among possibilities. Choosing a point from a 3-dimensional space carries more information than choosing a point from a 2-dimensional one. >Complexity.
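The claim that a choice from a higher-dimensional space carries more information can be sketched by counting the bits needed to single out one point in a discretized space; the grid resolution used here is an illustrative assumption:

```python
import math

def bits_to_locate(n_per_axis: int, dimensions: int) -> float:
    """Bits needed to single out one point in a grid
    with n_per_axis possible values per axis."""
    return dimensions * math.log2(n_per_axis)

# At the same resolution, locating a point in 3-D requires
# more bits than locating a point in 2-D.
print(bits_to_locate(256, 2))  # 2 * 8 = 16.0 bits
print(bits_to_locate(256, 3))  # 3 * 8 = 24.0 bits
```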
Information state: an information state can be viewed as a waveform, or as some other function with a continuous range of values.
Information space: the information space has two types of structure: each complex state will have an internal structure (the combinatorial structure) and each element in this state will belong to a subspace with its own topological structure (the relational structure).
Information space/Chalmers: the information space (unlike in Dretske 1981 (2) and Barwise/Perry 1983 (3)) is independent of further considerations regarding semantic content.
Information: we can find information in the physical as well as the phenomenal world.
The structure of the information space will correspond to a structure of the effect space.
Transferability principle: Physically realized information is information only if it can be processed (see MacKay 1969 (4)). This corresponds to transmittability in Shannon.
Message/Shannon: sets of information must be distinguishable to count as distinct messages. When two physical states of a system are mapped onto the same signal, they count as the same message.
Phenomenal Information/Chalmers: there are natural patterns of differences between phenomenal states. These provide the difference structure for an information space. Therefore, we can assume that phenomenal states realize information states. Since every experience has natural similarity and distinction relations, we will always find suitable information spaces.
1. C. E. Shannon, "A mathematical theory of communication", Bell System Technical Journal 27, 1948, pp. 379-423.
2. F. Dretske, Knowledge and the Flow of Information, Cambridge 1981.
3. J. Barwise and J. Perry, Situations and Attitudes, Cambridge 1983.
4. D. M. MacKay, Information, Mechanism and Meaning, Cambridge 1969.
Chalmers, D. The Conscious Mind, Oxford/New York 1996.
Chalmers, D. Constructing the World, Oxford 2014.