Information theory


Information theory emerged from the attempt to express the information content of a message in a precise mathematical form. The first measure of information is due to the electrical engineer Ralph Hartley: essentially, the number of yes/no decisions required to select a particular message from a set of equiprobable messages. Claude E. Shannon later adopted this measure and extended it to the general case of messages with different probabilities of occurrence.
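The two measures can be sketched in a few lines of Python (the function names are illustrative, not standard terminology): Hartley's measure is log2 of the number of equiprobable messages, and Shannon's entropy averages the information content over unequal message probabilities.

```python
import math

def hartley_information(n_messages: int) -> float:
    """Hartley's measure: the number of yes/no decisions (bits)
    needed to select one message out of n equiprobable messages."""
    return math.log2(n_messages)

def shannon_entropy(probabilities: list[float]) -> float:
    """Shannon's generalisation: the average information content (bits)
    of a source whose messages occur with the given probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# 8 equiprobable messages need log2(8) = 3 yes/no decisions.
print(hartley_information(8))                       # 3.0
# With unequal probabilities the average drops below the Hartley value.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75
```

Note that when all probabilities are equal (each 1/n), Shannon's formula reduces exactly to Hartley's log2(n), which is why the latter is the special case.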

The concept of information, which originated in the technical problems of telecommunication, ultimately led to the development of a general theory of information transmission, data compression, coding and so on. Whereas the measure of information content in classical information theory was always based on the source of the information, “algorithmic information theory” (as developed by Andrei N. Kolmogorov, Ray Solomonoff and Gregory Chaitin) relates the information content of a message solely to its complexity.
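The algorithmic complexity of a message (the length of the shortest program that produces it) is not computable in general, but the length of a losslessly compressed message gives a crude, computable upper bound on it. A minimal Python sketch of that idea, using the standard-library zlib compressor:

```python
import random
import zlib

def compressed_length(message: bytes) -> int:
    """Length in bytes of the zlib-compressed message: a crude upper
    bound on the message's algorithmic information content."""
    return len(zlib.compress(message, level=9))

# A highly regular message: a very short program could generate it,
# so its algorithmic information content is low and it compresses well.
regular = b"ab" * 500                                    # 1000 bytes

# A pseudo-random message of the same length (seeded for repeatability):
# it has no obvious structure for the compressor to exploit.
random.seed(0)
irregular = bytes(random.randrange(256) for _ in range(1000))

print(compressed_length(regular))    # far below 1000
print(compressed_length(irregular))  # close to (or above) 1000
```

The comparison illustrates the central point: two messages of identical length from the same source can carry very different amounts of algorithmic information, depending only on their internal structure.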

Furthermore, the mathematical and technical problems that arise in connection with computer-aided information-processing have initiated a new, independent discipline – that of informatics. Informatics and information theory together make up a powerful instrument for the investigation of complex phenomena in Nature, in technology and in society.


Copyright © 2010 COMPASS. All rights reserved.
