Monotonicity of Entropy

A Rigorous Proof of an Entropic Monotonicity Theorem

Bachelor Thesis (2025)
Author(s)

T.C.T. van Baar (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Martijn Caspers – Mentor (TU Delft - Analysis)

R.J. Fokkink – Mentor (TU Delft - Applied Probability)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
27-06-2025
Awarding Institution
Delft University of Technology
Programme
Applied Mathematics
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

This thesis explains in detail a proof of a theorem about entropy presented in a scientific article by Artstein, Ball, Barthe and Naor [1]. The original proof is dense, especially for bachelor-level students. The goal of this thesis is to break that proof down, fill in the mathematical details, and make the ideas accessible to students at this level. The theorem concerns mathematical entropy, a concept from probability theory that measures the uncertainty or chaos in an event or outcome. For instance, consider a coin toss: the outcome is uncertain, and this uncertainty is measured by entropy. The greater the uncertainty, the higher the entropy. There is also a notion of entropy in physics, where it describes the uncertainty or randomness with which systems evolve. One of the goals of this theorem is to demonstrate that these two types of entropy, though defined in different contexts, exhibit analogous behaviour. In particular, it shows that mathematical entropy obeys an analogue of the second law of thermodynamics, which states that the entropy of an isolated system increases over time. As an illustrative example, when a glass of water is spilled on a table, the water gradually spreads out, increasing the disorder of the system. The theorem states that the entropy of a normalised sum of independent random variables increases with the number of variables being summed. Here a normalised sum is obtained by summing independent, identically distributed random variables and dividing by the square root of their count, so that the variance stays fixed. This seems plausible at first sight, since the uncertainty of, say, two combined events seems larger than that of one. Nevertheless, the proof is intricate and requires advanced analysis.
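As a small numerical illustration of the coin-toss example above (this snippet is not part of the thesis; the function name is illustrative), the discrete Shannon entropy of a coin toss can be computed directly, showing that a fair coin carries more uncertainty than a biased one:

```python
import math

def shannon_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a coin toss that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # the outcome is certain, so there is no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A fair coin is maximally uncertain: exactly 1 bit of entropy.
print(shannon_entropy(0.5))  # 1.0
# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy(0.9))
```

The same intuition, with differential entropy in place of discrete entropy, is what the theorem makes precise for normalised sums.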

The theorem is thus about the monotonicity of entropy of normalised sums. This thesis connects entropy to Fisher information, which enjoys nicer analytical properties. The idea is simple: entropy measures uncertainty, while Fisher information quantifies the amount of information a random variable carries. The thesis therefore first connects Fisher information and entropy and shows that proving an increase of entropy reduces to proving a decrease of Fisher information. The proof of the decrease of Fisher information relies on a further theorem, which links Fisher information to the world of analysis. This requires some advanced tools, such as the Gauss–Green theorem on unbounded surfaces and integrating the divergence of vector fields over hyperplanes. With this connection in place, the decrease of Fisher information finally follows with the help of a lemma about commuting orthogonal projections in Hilbert spaces. All supporting theorems and lemmas are also proved in detail in this thesis. In conclusion, this thesis analyses the behaviour of entropy under normalised sums of independent random variables, shows that it increases monotonically, and thereby demonstrates that mathematical entropy behaves like the second law of thermodynamics.
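In standard notation (the abstract itself fixes no symbols, so the following is a sketch of the usual formulation), the quantities involved and their connection can be written as:

```latex
% Differential entropy and Fisher information of a random variable X
% with a smooth density f on \mathbb{R}:
H(X) = -\int_{\mathbb{R}} f(x)\,\ln f(x)\,dx,
\qquad
J(X) = \int_{\mathbb{R}} \frac{f'(x)^2}{f(x)}\,dx.

% Monotonicity theorem (Artstein, Ball, Barthe, Naor): for i.i.d.
% X_1, X_2, \dots with finite variance, the entropy of the normalised sums
S_n = \frac{X_1 + \dots + X_n}{\sqrt{n}}
% is non-decreasing in n:
H(S_{n+1}) \ge H(S_n) \quad \text{for all } n \ge 1.

% De Bruijn's identity is the standard bridge between the two quantities:
% a decrease of Fisher information along the sequence translates into
% an increase of entropy.
\frac{d}{dt}\, H\bigl(X + \sqrt{t}\,Z\bigr)
  = \tfrac{1}{2}\, J\bigl(X + \sqrt{t}\,Z\bigr),
\qquad Z \sim \mathcal{N}(0,1) \text{ independent of } X.
```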

Files

BEP_TCTvanBaar.pdf
(pdf | 1.07 Mb)