Curriculum Roadmap

A planned map of content areas. Published topics are linked; everything else is on the roadmap.

Topology & TDA (2 published)

Topological Data Analysis — from simplices to persistence diagrams. The geometric heart of the site.
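As a small taste of where this thread starts, here is a minimal sketch (assuming NumPy) that computes the Betti numbers of the hollow triangle, the boundary of a 2-simplex, directly from the rank of its boundary matrix. Rational coefficients are used for simplicity rather than the field coefficients typical in persistence computations.

```python
import numpy as np

# Hollow triangle (boundary of a 2-simplex), topologically a circle.
# Vertices: 0, 1, 2.  Oriented edges: (0,1), (0,2), (1,2).  No 2-simplices.
# Boundary map d1: edges -> vertices (rows = vertices, cols = edges).
d1 = np.array([
    [-1, -1,  0],   # vertex 0
    [ 1,  0, -1],   # vertex 1
    [ 0,  1,  1],   # vertex 2
])

rank_d1 = np.linalg.matrix_rank(d1)
b0 = 3 - rank_d1          # dim C0 - rank d1 (d0 = 0)
b1 = (3 - rank_d1) - 0    # dim ker d1 - rank d2 (no 2-simplices)
print(b0, b1)  # 1 1  -> one connected component, one loop
```

The answer matches the circle: one connected component and one independent loop.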

Linear Algebra

Spectral theory, matrix decompositions, and the algebraic backbone of nearly every ML method.

  • Spectral Theorem (planned)
  • Singular Value Decomposition (planned)
  • PCA & Low-Rank Approximation (planned)
  • Tensor Decompositions (planned)
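The SVD and low-rank topics above meet in the Eckart–Young theorem: the best rank-k approximation in spectral norm comes from truncating the SVD, with error equal to the next singular value. A minimal numerical check, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

# Thin SVD: A = U @ diag(s) @ Vt, singular values sorted descending.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Best rank-1 approximation (Eckart-Young): keep the top singular triple.
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])

# Spectral-norm error of the best rank-1 approximation equals sigma_2.
err = np.linalg.norm(A - A1, ord=2)
print(np.isclose(err, s[1]))  # True
```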

Probability & Statistics

Measure-theoretic foundations through PAC learning and concentration inequalities.

  • Measure-Theoretic Probability (planned)
  • Concentration Inequalities (planned)
  • PAC Learning Framework (planned)
  • Bayesian Nonparametrics (planned)
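Concentration inequalities like Hoeffding's bound are easy to check empirically. The sketch below (assuming NumPy; the parameter choices are illustrative) simulates Bernoulli sample means and compares the empirical tail probability against the bound P(|mean − p| ≥ t) ≤ 2·exp(−2nt²):

```python
import numpy as np

rng = np.random.default_rng(42)
p, n, t, trials = 0.5, 100, 0.1, 10_000

samples = rng.random((trials, n)) < p        # Bernoulli(p) draws
means = samples.mean(axis=1)
empirical = np.mean(np.abs(means - p) >= t)  # empirical tail probability
bound = 2 * np.exp(-2 * n * t**2)            # Hoeffding bound

print(empirical, bound, empirical <= bound)
```

The bound is loose here (roughly 0.27 against an empirical tail near 0.05), which is typical: concentration bounds trade tightness for generality.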

Optimization

Convex analysis, gradient methods, and the theoretical tools behind modern training algorithms.

  • Convex Analysis (planned)
  • Gradient Descent & Convergence (planned)
  • Proximal Methods (planned)
  • Lagrangian Duality & KKT (planned)
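The convergence story previewed above can be seen on the simplest possible objective. A sketch (assuming NumPy) of gradient descent on a strongly convex quadratic f(x) = ½xᵀAx with step size 1/L, where L is the largest eigenvalue; the iterates contract linearly at a rate governed by the condition number L/μ:

```python
import numpy as np

A = np.diag([1.0, 10.0])        # eigenvalues mu = 1, L = 10
step = 1.0 / 10.0               # step size 1/L

x = np.array([1.0, 1.0])
for _ in range(200):
    x = x - step * (A @ x)      # gradient of f(x) = 0.5 x^T A x is A x

print(np.linalg.norm(x) < 1e-8)  # True: linear convergence to the minimizer 0
```

The slow direction shrinks by a factor (1 − μ/L) = 0.9 per step, so 200 iterations drive it below 1e-9 while the stiff direction is eliminated immediately.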

Differential Geometry

Smooth manifolds, Riemannian metrics, and information geometry for probabilistic models.

  • Smooth Manifolds (planned)
  • Riemannian Geometry (planned)
  • Geodesics & Curvature (planned)
  • Information Geometry & Fisher Metric (planned)
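The Fisher metric that closes this list is concrete enough to compute. For the Gaussian family N(μ, σ²) in (μ, σ) coordinates, the metric is diag(1/σ², 2/σ²); the sketch below (assuming NumPy, with a simple Riemann-sum quadrature) recovers it by integrating the outer product of the score against the density:

```python
import numpy as np

mu, sigma = 0.0, 2.0
x = np.linspace(mu - 12 * sigma, mu + 12 * sigma, 200_001)
dx = x[1] - x[0]
p = np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Score: gradient of log-density with respect to (mu, sigma).
s_mu = (x - mu) / sigma**2
s_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3

integ = lambda f: float(np.sum(f * p) * dx)   # E[f(X)] by quadrature
g = np.array([
    [integ(s_mu * s_mu),    integ(s_mu * s_sigma)],
    [integ(s_sigma * s_mu), integ(s_sigma * s_sigma)],
])
print(np.round(g, 4))  # approx [[0.25, 0], [0, 0.5]] = diag(1/sigma^2, 2/sigma^2)
```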

Information Theory

Entropy, divergences, and the theoretical limits that underpin compression, coding, and learning.

  • Shannon Entropy & Mutual Information (planned)
  • KL Divergence & f-Divergences (planned)
  • Rate-Distortion Theory (planned)
  • Minimum Description Length (planned)
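The KL divergence is the workhorse of this thread, and its two defining quirks, nonnegativity and asymmetry, are visible in a few lines (a sketch assuming NumPy; the distributions are arbitrary examples):

```python
import numpy as np

def kl(p, q):
    """D(p || q) = sum_i p_i * log(p_i / q_i), with 0 * log 0 taken as 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.25, 0.25]
q = [1/3, 1/3, 1/3]

print(kl(p, q) >= 0)          # True: Gibbs' inequality
print(kl(p, q) == kl(q, p))   # False: KL is not symmetric, hence not a metric
print(kl(p, p) == 0.0)        # True: zero iff the distributions agree
```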

Graph Theory

Spectral graph theory and random walks — the mathematical foundation of graph neural networks.

  • Graph Laplacians & Spectrum (planned)
  • Random Walks & Mixing (planned)
  • Expander Graphs (planned)
  • Message Passing & GNNs (planned)
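The first fact in any spectral graph theory course is that the multiplicity of eigenvalue 0 of the Laplacian L = D − A counts connected components. A minimal check, assuming NumPy:

```python
import numpy as np

def laplacian(edges, n):
    """Combinatorial Laplacian L = D - A of an undirected graph on n nodes."""
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

# Two components: a triangle {0, 1, 2} and a single edge {3, 4}.
L = laplacian([(0, 1), (1, 2), (0, 2), (3, 4)], n=5)
eigvals = np.linalg.eigvalsh(L)

components = int(np.sum(np.abs(eigvals) < 1e-9))
print(components)  # 2
```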

Category Theory

Functors, adjunctions, and monads — the abstract language that unifies disparate ML structures.

  • Categories & Functors (planned)
  • Natural Transformations (planned)
  • Adjunctions (planned)
  • Monads & Comonads (planned)
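Even the monad laws can be poked at concretely. The sketch below uses an informal Option/Maybe encoding in Python (None for failure, a bare value for success; this conflates nested options and is illustrative only) and checks the three laws on small Kleisli arrows:

```python
# Option/Maybe as None-or-value; unit is the identity in this encoding.
def unit(x):
    return x

def bind(m, f):
    """Sequence a possibly-failing computation: propagate None, else apply f."""
    return None if m is None else f(m)

# Kleisli arrows: partial functions that may fail.
safe_inv = lambda x: None if x == 0 else 1 / x
safe_dec = lambda x: None if x < 1 else x - 1

# Left identity:  bind(unit(a), f) == f(a)
print(bind(unit(4), safe_inv) == safe_inv(4))            # True
# Right identity: bind(m, unit) == m
print(bind(0.25, unit) == 0.25)                          # True
# Associativity:  bind(bind(m, f), g) == bind(m, lambda x: bind(f(x), g))
lhs = bind(bind(2, safe_dec), safe_inv)
rhs = bind(2, lambda x: bind(safe_dec(x), safe_inv))
print(lhs == rhs)                                        # True
```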