Information theory is studied from three viewpoints: (1) the theory of entropy as the amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties are examined; the latter is extended to a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as are AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the setting of real and functional analysis as well as probability theory. Ergodic channels are characterized in various ways. Mixing and AMS channels are also treated in detail, with some illustrations. A few further aspects of information channels, including measurability, approximation and noncommutative extensions, are discussed as well.

Contents:
Entropy
Information Sources
Information Channels
Special Topics

Readership: Probabilists, analysts and communication engineers.

Key Features:
Provides a comprehensive and unified presentation of the general theory of information
Suitable for students and professionals who wish to delve further into the subject and explore the research literature, and also for non-experts in information theory who wish to understand what information is and how it is modeled in science
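For orientation, the two entropies mentioned above admit standard definitions, recalled here in common notation as a reference sketch; the book's own conventions may differ. For a discrete probability distribution $p = (p_1, p_2, \dots)$, and for a measure-preserving transformation $T$ of a probability space $(X, \mathfrak{X}, \mu)$ with finite measurable partitions $\mathcal{A}$:

\[
H(p) = -\sum_i p_i \log p_i,
\qquad
h_\mu(T) = \sup_{\mathcal{A}} \lim_{n \to \infty} \frac{1}{n}\, H_\mu\!\left( \bigvee_{k=0}^{n-1} T^{-k}\mathcal{A} \right),
\quad\text{where}\quad
H_\mu(\mathcal{A}) = -\sum_{A \in \mathcal{A}} \mu(A) \log \mu(A).
\]

The limit defining $h_\mu(T)$ exists by subadditivity of $H_\mu$, and the supremum runs over all finite measurable partitions of $X$.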