A structural introduction to probability, statistics and causal networks, Part 1
This somewhat experimental introduction presents probability through a new paradigm for formalizing it mathematically: the theory of Markov categories. I expect this approach to be more intuitive than the standard one based on measure theory. It can also incorporate non-standard variants of probability theory more flexibly, and the similarity of its graphical calculus to neural networks suggests possible connections with mathematical modelling in neuroscience. The topics to be covered will be roughly the following:
1. The compositional structure of random processes and its formalization in Markov categories
2. Conditionals, Bayesian inference and conditional independence
3. Sufficient statistics
4. Causal networks and causal inference
No background beyond elementary probability theory will be needed.
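As a small concrete taste of the compositional viewpoint of topic 1, here is a minimal sketch, under the assumption that we restrict to finite sets: there, a random process can be represented as a Markov kernel, i.e. a row-stochastic matrix, and composing two processes amounts to summing over the intermediate variable. The function names below are illustrative only and not part of the series.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_kernel(n_in, n_out):
    """Sample an arbitrary Markov kernel between finite sets, represented as a
    row-stochastic matrix: entry [i, j] is the probability of output j given input i."""
    m = rng.random((n_in, n_out))
    return m / m.sum(axis=1, keepdims=True)

def compose(f, g):
    """Compose kernels f: X -> Y and g: Y -> Z by summing over the intermediate
    variable (the Chapman-Kolmogorov equation); for matrices this is matrix multiplication."""
    return f @ g

f = random_kernel(2, 3)   # a random process X -> Y
g = random_kernel(3, 4)   # a random process Y -> Z
h = compose(f, g)         # the composite process X -> Z

assert np.allclose(h.sum(axis=1), 1.0)  # the composite is again a Markov kernel
```

Markov categories axiomatize exactly this kind of composition (together with copying and discarding of variables) without committing to any particular representation of the kernels.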