Automatic Differentiation (AD) is a technique for computing exact derivatives of numerical functions without resorting to symbolic differentiation or finite-difference approximation. AD is used in a wide variety of fields, such as machine learning, optimization, quantitative finance, and physics, and the productivity boost provided by parallel AD has played a large role in recent advances in deep learning.
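As a quick illustration of the idea (this is background, not part of the proposed implementation), forward-mode AD can be sketched in a few lines of plain Haskell by overloading arithmetic on dual numbers, which carry a value together with its derivative:

\begin{verbatim}
-- A minimal sketch of forward-mode AD via dual numbers, for illustration only.
data Dual = Dual { primal :: Double, tangent :: Double }
  deriving Show

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)   -- product rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  abs    (Dual x dx)    = Dual (abs x) (dx * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0

-- Derivative of f at a point: seed the tangent with 1 and read it back out.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = tangent (f (Dual x 1))

-- Example: d/dx (x^2 + 3x) at x = 2 is 7.
example :: Double
example = diff (\x -> x * x + 3 * x) 2
\end{verbatim}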

The goal of this project is to implement parallel AD in Haskell using the \verb|accelerate| library. If successful, the project will provide an asymptotic speedup over current implementations for many functions of practical interest, stress-test a key foundation of Haskell's numerical infrastructure, and supply a greatly improved piece of infrastructure for three of the remaining areas where Haskell's ecosystem is immature. A small example of the kind of \verb|accelerate| program the project would differentiate is sketched below.
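For context, the data-parallel programs the project targets are written against the standard \verb|accelerate| API. The following is the well-known dot-product example run through the reference interpreter; it is included here only as a sketch of the programming model, not as code from the proposal:

\begin{verbatim}
import qualified Data.Array.Accelerate             as A
import qualified Data.Array.Accelerate.Interpreter as I

-- Dot product expressed as a data-parallel accelerate computation.
dotp :: A.Acc (A.Vector Double) -> A.Acc (A.Vector Double)
     -> A.Acc (A.Scalar Double)
dotp xs ys = A.fold (+) 0 (A.zipWith (*) xs ys)

main :: IO ()
main = do
  let xs = A.fromList (A.Z A.:. 5) [1 .. 5]  :: A.Vector Double
      ys = A.fromList (A.Z A.:. 5) [6 .. 10] :: A.Vector Double
  -- Run through the reference interpreter; an LLVM or GPU backend
  -- would be used for real workloads.
  print (I.run (dotp (A.use xs) (A.use ys)))
\end{verbatim}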

Organization

Student

Andrew Knapp

Mentors

  • ekmett
  • tmcdonell
  • Alois Cochard
  • Sacha Sokoloski

Year

2018