- USD 49.99
Publisher's Description
Information theory is studied from the following points of view: (1) the theory of entropy as a measure of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov–Sinai entropy are defined and their basic properties are examined, and the latter entropy is extended to a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as are AMS (asymptotically mean stationary) sources.
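For readers new to these notions, the Shannon entropy of a finite probability distribution can be computed in a few lines. This is a minimal illustrative sketch, not material from the book:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i) of a finite distribution."""
    # Terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased source is more predictable, hence has lower entropy.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

The Kolmogorov–Sinai entropy studied in the book refines this idea from single distributions to measure-preserving dynamical systems.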
The main purpose of this book is to present information channels in the setting of functional analysis and operator theory as well as probability theory. Ergodic, mixing, and AMS channels are considered in detail with some illustrations. In this second edition, channel operators, which generalize ordinary channels, are studied from many aspects. Gaussian channels are also treated in detail, together with Gaussian measures on a Hilbert space. The Special Topics chapter deals with features such as generalized capacity, channels with an intermediate noncommutative system, and the von Neumann algebra method for channels. Finally, quantum (noncommutative) information channels are examined in an independent chapter, which may be regarded as an introduction to quantum information theory. Von Neumann entropy is introduced and its generalization to a C*-algebra setting is given. Basic results on quantum channels and entropy transmission are also considered.
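The von Neumann entropy mentioned above, S(ρ) = -Tr(ρ log ρ), reduces to the Shannon entropy of the eigenvalues of the density matrix ρ. A minimal sketch (assuming NumPy; not code from the book):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    eigs = np.linalg.eigvalsh(rho)     # rho is Hermitian, so eigvalsh applies
    eigs = eigs[eigs > 1e-12]          # drop zeros: 0 * log 0 = 0 by convention
    return float(-np.sum(eigs * np.log2(eigs)))

# A pure state has zero entropy; the maximally mixed qubit state has 1 bit.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
mixed = np.eye(2) / 2
print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # 1.0
```

Quantum channels, as completely positive maps on density matrices, transform such states, and the book studies how entropy behaves under that transmission.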
Contents:
- Entropy
- Information Sources
- Information Channels
- Channel Operators
- Gaussian Channels
- Special Topics
- Quantum Channels
- References
- Glossaries of Axioms
- Indices
Readership: Graduate students and researchers in mathematics and communication engineering.