Automatic differentiation

Date:

Most people are familiar with two common ways of calculating derivatives: symbolic differentiation and numerical approximation (e.g. finite differences). There is also a third method called “automatic differentiation” (AD). It is a way of calculating exact derivatives of computer code by decomposing it into a sequence of simple functions and applying the chain rule. AD is one of the key technologies that enabled the recent rise of deep learning, where it is known as “backpropagation”. It lies at the heart of hugely popular machine learning libraries such as PyTorch and Google’s TensorFlow. The main benefit of using AD in the context of physics and astronomy is that it enables the use of gradient-based optimization methods and advanced sampling methods, such as Hamiltonian Monte Carlo, with complex high-dimensional models. In this talk, Fran will explain how AD works and how you can start writing differentiable code.
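As a flavour of the idea (not taken from the talk itself), here is a minimal sketch of forward-mode AD using dual numbers: each value carries its derivative along with it, and every elementary operation applies the chain rule. The class name `Dual` and the example function `f` are purely illustrative.

```python
import math

class Dual:
    """A dual number: a value together with its derivative, propagated by the chain rule."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u*v)' = u'*v + u*v'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def sin(x):
    # Chain rule for an elementary function: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

def f(x):
    # Any composition of supported operations is differentiated exactly
    return x * x + sin(x)

# Seed the input derivative with 1.0 to obtain df/dx at x = 1.5
x = Dual(1.5, 1.0)
y = f(x)
print(y.value)   # f(1.5)
print(y.deriv)   # f'(1.5) = 2*1.5 + cos(1.5)
```

Reverse-mode AD (the backpropagation used in deep learning libraries) applies the same chain-rule bookkeeping, but accumulates derivatives from outputs back to inputs, which is far more efficient when there are many inputs and few outputs.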

Link to slides