Deep generative models have found a plethora of applications in machine learning and in various other scientific and applied fields, where they are used to sample from complex, high-dimensional distributions and are leveraged in downstream analyses involving such distributions. This course focuses on the foundations, applications, and frontier challenges of diffusion-based generative models, which in recent years have become the predominant approach to generative modeling across a wide range of data modalities and form the backbone of industry-scale systems such as AlphaFold 3, DALL-E, and Stable Diffusion. Topics include mathematical aspects of diffusion-based models (forward and reverse diffusion processes, Fokker-Planck equations, and the computational and statistical complexity of score estimation), the use of diffusion models in downstream analysis tasks (such as solving inverse problems), extensions of diffusion models (rectified flows, stochastic interpolants, and Schrödinger bridges), and frontier challenges motivated by practical considerations (consistency models, guidance, and training with noisy data).