Encyclopaedia Britannica says that "the chief concern of information theory is to discover mathematical laws governing systems designed to communicate or manipulate information. It sets up quantitative measures of information and of the capacity of various systems to transmit, store, and otherwise process information. Some of the problems treated are related to finding the best methods of using various available communication systems and the best methods for separating the wanted information, or signal, from the extraneous information, or noise".
The study of information theory is also concerned with defining the notion of information in a general sense, and with a unified information theory intended to subsume the statements of the existing information theories.
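The quoted definition turns on "quantitative measures of information"; the canonical such measure is Shannon's entropy, H = -Σ p·log₂p, in bits per symbol. A minimal sketch in Python (the function name is our own):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy H = -sum(p * log2(p)) over the symbol
    frequencies, in bits per symbol."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# A fair coin carries one bit of information per toss:
print(shannon_entropy("HTHT"))  # 1.0
# A perfectly predictable source carries none:
assert shannon_entropy("AAAA") == 0.0
```

Maximally uncertain sources have maximal entropy; a constant source has zero, which is the quantitative sense in which it conveys no information.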

### Sites

Algorithmic Information Theory

Open problems and links to software.

Basics of Information Theory

A brief introduction to information theory by David S. Touretzky of Carnegie Mellon University.

The Classic Definition of Information

From the Web Dictionary of Cybernetics and Systems.

A Discipline Independent Definition of Information

An article on aspects of information theory by Robert M. Losee, with a link to a PDF version of the article.

Entropy on the World Wide Web

A collection of links to online expository articles on entropy and information theory, maintained by Roland Gunesch (University of Hamburg).

Information Theory

Wikipedia article on this branch of applied mathematics and electrical engineering involving the quantification of information.

Information Theory and Music

An article applying information theory to music, covering the definition of information theory, probability, and Markov chains.

Information Theory Resources

A small collection of links on information theory and Claude Shannon.

Introduction To Algorithmic Information Theory

An introduction to the synthesis of computation and information theory by Nick Szabo.

Linear Complexity: A Literature Survey

The linear complexity (LC) of a sequence is the size in bits of the shortest linear feedback shift register (LFSR) that can produce that sequence. The measure speaks to the difficulty of generating, and perhaps analyzing, a particular sequence.
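The definition above can be made concrete: the Berlekamp-Massey algorithm computes the linear complexity of a binary sequence directly. A sketch in Python (not drawn from the surveyed literature; the test sequence is an m-sequence generated by the recurrence for the primitive polynomial x³ + x + 1):

```python
def berlekamp_massey(bits):
    """Return the linear complexity of a binary sequence: the
    length of the shortest LFSR that generates it (GF(2))."""
    n = len(bits)
    c = [0] * n  # current connection polynomial
    b = [0] * n  # previous connection polynomial
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        # discrepancy between the LFSR's prediction and bits[i]
        d = bits[i]
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t = c[:]
            for j in range(n - i + m):
                c[i - m + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L

# An m-sequence from a 3-bit LFSR (x^3 + x + 1, i.e. the
# recurrence s[i] = s[i-2] XOR s[i-3]) has linear complexity 3:
s = [1, 0, 0]
for i in range(3, 14):
    s.append(s[i - 2] ^ s[i - 3])
print(berlekamp_massey(s))  # 3
```

Note the asymmetry the survey's definition implies: a long sequence can still have small LC, so LC measures structure rather than length.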

Organs of Computation

A Talk With Steven Pinker.

Remembering Claude Shannon

Describes the development of information theory and Claude Shannon's contribution to the field.

A Short Course in Information Theory

Eight lectures on information theory by David MacKay.

Last update:

March 21, 2016 at 1:48:21 UTC
Copyright © 1998-2016 AOL Inc.
