Entropy and the Central Limit Theorem

08 Aug 2016

Statistics

I’ve read through various explanations of entropy before, but I’ve mostly only understood the concept in the domain of information theory, where entropy measures the average information content of a source, that is, how many bits you need on average to represent its output. When it comes to the physics concept, I could only regurgitate basic statements like “The entropy of the universe always increases” without much deep understanding.

I recently came across Aatish Bhatia’s great explanation of entropy, and all that changed. Not only did the article keep my attention with a nice humorous touch, but it also did a great job of explaining entropy in a physical sense. What struck me most is how closely the concept of entropy in physics is related to the Central Limit Theorem in statistics. I highly recommend reading Bhatia’s article for an accessible explanation of the physics definition of entropy.
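To make the connection a little more concrete, here is a minimal sketch of my own (not from Bhatia’s article, and assuming NumPy is available): sums of independent uniform variables, once standardized, look more and more Gaussian as the number of summands grows, and the Gaussian is exactly the maximum-entropy distribution among all distributions with a fixed variance. A rough histogram-based entropy estimate of the standardized sums creeps up toward the Gaussian value.

```python
import numpy as np

def entropy_estimate(samples, bins=100):
    """Rough histogram-based estimate of differential entropy, in nats."""
    density, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mass = density * widths          # probability mass in each bin
    nonzero = mass > 0
    # H ~ -sum over bins of (probability mass) * log(density)
    return -np.sum(mass[nonzero] * np.log(density[nonzero]))

rng = np.random.default_rng(0)
n_samples = 200_000

for n in (1, 2, 5, 30):
    # Sum of n i.i.d. Uniform(0, 1) draws, standardized to zero mean and unit variance.
    sums = rng.uniform(size=(n_samples, n)).sum(axis=1)
    standardized = (sums - sums.mean()) / sums.std()
    print(f"n = {n:2d}: estimated entropy ~ {entropy_estimate(standardized):.3f} nats")

# Differential entropy of a standard normal: 0.5 * ln(2 * pi * e) ~ 1.419 nats,
# the maximum possible for any unit-variance distribution.
print(f"standard normal: {0.5 * np.log(2 * np.pi * np.e):.3f} nats")
```

The entropy estimates increase with the number of summands and level off near the Gaussian bound, which is one way of seeing the CLT as an entropy-maximizing process.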
