The density-functional toolkit (DFTK) is an open-source Julia code for multidisciplinary research on density-functional theory methods. Over the past months one of our ongoing efforts has been to make the code algorithmically differentiable, that is, to enable the computation of derivatives of arbitrary output quantities with respect to arbitrary input quantities. Since this in particular enables "unusual" derivatives, such as the sensitivity of a band gap with respect to the parameters of the XC functional, it opens, amongst others, the prospect of designing novel DFT models in a data-driven fashion. Following the emerging opportunities of scientific machine learning one could for example imagine integrating a neural network as a component of the exchange-correlation functional into the DFT model itself. In this talk I will sketch the current status of algorithmic differentiability in DFTK and hint at possibilities, such as the aforementioned ones, enabled by an automatically differentiable DFT code.
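To make the core idea concrete, here is a minimal sketch of forward-mode algorithmic differentiation via dual numbers, the mechanism by which a scalar output (say, a band gap) can be differentiated with respect to a model parameter by propagating (value, derivative) pairs through the computation. The `band_gap` function and the parameter `alpha` below are purely hypothetical illustrations, not DFTK's actual model or API (DFTK itself is written in Julia).

```python
# Minimal forward-mode AD with dual numbers: each value carries its
# derivative with respect to a chosen input, and arithmetic propagates both.
import math
from dataclasses import dataclass

@dataclass
class Dual:
    val: float  # primal value
    der: float  # derivative w.r.t. the seeded input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

    def __neg__(self):
        return Dual(-self.val, -self.der)

def exp(x: Dual) -> Dual:
    # Chain rule: d/dt exp(u) = exp(u) * u'
    e = math.exp(x.val)
    return Dual(e, e * x.der)

def band_gap(alpha: Dual) -> Dual:
    # Hypothetical smooth model: gap as a function of an XC parameter alpha.
    return 1.5 + 0.3 * alpha + 0.1 * exp(-(alpha * alpha))

# Seed the input with derivative 1 to obtain d(gap)/d(alpha) at alpha = 0.5.
g = band_gap(Dual(0.5, 1.0))
print(g.val, g.der)  # value and sensitivity in one pass
```

The same seeding trick generalises: differentiating with respect to a different input only requires moving the unit derivative to that input, which is what makes "arbitrary outputs with respect to arbitrary inputs" tractable.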
This talk is part of the Discussion meeting on Machine Learning (http://gdr-rest.polytechnique.fr/Discussion_Meeting_Machine_Learning), organised by the GDR REST.