In the last of my Oxford talks I explain how entropy and relative entropy can be understood using certain categories related to probability theory… and how these categories also let us understand Bayesian networks!
The first two parts of the talk explain these papers:
• John Baez, Tobias Fritz and Tom Leinster, A characterization of entropy in terms of information loss.
• John Baez and Tobias Fritz, A Bayesian characterization of relative entropy.
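As a small illustrative aside (not part of the talk itself): the first paper characterizes Shannon entropy via the "information loss" of a measure-preserving map between finite probability spaces, and the second characterizes relative entropy (Kullback–Leibler divergence). Here is a minimal sketch of both quantities in plain Python; the example distributions and the map `f` are made up for illustration.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def pushforward(p, f, n_out):
    """Push a distribution p on {0,...,len(p)-1} forward along f to {0,...,n_out-1}."""
    q = [0.0] * n_out
    for i, pi in enumerate(p):
        q[f(i)] += pi
    return q

def information_loss(p, f, n_out):
    """Information lost by the measure-preserving map f: H(p) - H(f_* p)."""
    return entropy(p) - entropy(pushforward(p, f, n_out))

def relative_entropy(p, q):
    """Relative entropy (Kullback-Leibler divergence) D(p || q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: merge a uniform 4-outcome distribution down to 2 outcomes.
p = [0.25, 0.25, 0.25, 0.25]
f = lambda i: i // 2  # sends outcomes {0,1} -> 0 and {2,3} -> 1
print(information_loss(p, f, 2))                  # log 4 - log 2 = log 2 ≈ 0.693
print(relative_entropy([0.7, 0.3], [0.5, 0.5]))   # D(p || q) for two coin distributions
```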
Somewhere around here the talk was interrupted by...
Published on March 16, 2014 14:42