Clad is a Clang compiler plugin that employs automatic differentiation to differentiate user-defined C++ functions, performing source-to-source transformations so that users do not have to adopt custom types to comply with external libraries.
With the integration of clad::gradient into Clad, a reverse accumulation (reverse-mode) method for automatic differentiation was introduced. The natural next step is second partial derivatives, in particular computing the Hessian matrix. My work builds on the existing framework, which uses the Clang AST to perform source transformations on functions, to implement an efficient Hessian computation that extends Clad's capabilities using the edge pushing algorithm and reverse accumulation of second derivatives. I will also try to extend existing Clad functions to compute the Jacobian matrix.