New Foundations for Information Theory
SpringerBriefs in Philosophy

New Foundations for Information Theory

Logical Entropy and Shannon Entropy

    • $54.99

Publisher Description

This monograph offers a new foundation for information theory based on the notion of information-as-distinctions, which is directly measured by logical entropy, and on its re-quantification as Shannon entropy, the fundamental concept for the theory of coding and communications.
Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image partition of a random variable, so they represent the pre-probability notion of information. Logical entropy is then a probability measure on the information sets: the probability that, on two independent trials, a distinction or “dit” of the partition will be obtained.
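For concreteness, this definition corresponds to the standard formula (this rendering is mine, with $p_1, \dots, p_n$ denoting the block probabilities of the partition):

$$
h(\pi) = \sum_{i \neq j} p_i\, p_j = 1 - \sum_{i=1}^{n} p_i^{2},
$$

the probability that two independent draws land in distinct blocks.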
The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. Because logical entropy is a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits—so the Shannon entropy is the average number of binary distinctions or bits necessary to make all the distinctions of the random variable. And, using a linearization method, all the set concepts in this logical information theory naturally extend to vector spaces in general—and to Hilbert spaces in particular—for quantum logical information theory, which provides the natural measure of the distinctions made in quantum measurement.
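To make the dit-to-bit transform concrete, here is a small numerical sketch (my own illustration in Python, not taken from the book; the function names are mine). Both entropies appear as probability-weighted averages, with the per-outcome dit count 1 − p_i replaced by the bit count log2(1/p_i):

from math import log2

def logical_entropy(p):
    # h(p) = 1 - sum(p_i^2): the probability that two independent draws
    # from p fall in different blocks, i.e., yield a distinction ("dit").
    return 1.0 - sum(pi ** 2 for pi in p)

def shannon_entropy(p):
    # H(p) = sum(p_i * log2(1/p_i)), in bits.
    return sum(pi * log2(1.0 / pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]

# The same quantities written term by term: the dit-to-bit transform
# replaces the dit count (1 - p_i) with the bit count log2(1/p_i).
h_as_average = sum(pi * (1.0 - pi) for pi in p)          # equals logical_entropy(p) = 0.625
H_as_average = sum(pi * log2(1.0 / pi) for pi in p)      # equals shannon_entropy(p) = 1.5

print(logical_entropy(p), h_as_average)   # 0.625 0.625
print(shannon_entropy(p), H_as_average)   # 1.5 1.5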
Relatively short but dense in content, this work can serve as a reference for researchers and graduate students doing investigations in information theory and in maximum entropy methods in physics, engineering, and statistics, and for all those with a special interest in a new approach to quantum information theory.

GENRE
Nonfiction
RELEASED
October 30, 2021
LANGUAGE
English
LENGTH
126 Pages
PUBLISHER
Springer International Publishing
SELLER
Springer Nature B.V.
SIZE
4.8 MB
Dynamics, Information and Complexity in Quantum Systems
2009
Convexity and Concentration
2017
Genealogies of Interacting Particle Systems
2020
Geometric Science of Information
2021
Mathematical Foundations of Quantum Information and Computation and Its Applications to Nano- and Bio-systems
2011
White Noise Analysis And Quantum Information
2017
The Uses of Diversity
2020
Partitions, Objective Indefiniteness, and Quantum Reality
2024
The Democratic Worker-Owned Firm (Routledge Revivals)
2021
Neo-Abolitionism
2021
Helping People Help Themselves
2009
Evaluating Evidence of Mechanisms in Medicine
2018
A Primer to Causal Reasoning About a Complex World
2024
The Limits of Art
2020
Scientific Explanation
2014
Paradoxes in Probability Theory
2012
Human Development and Human Life
2016