Differentiable Tensor Networks
- Mentors: JinGuo Liu
- Organization: The Julia Language
This project aims to improve the tooling for tensor network algorithms in Julia and to demonstrate the language's advantages, among them composability, performance, and a rich ecosystem, by implementing cutting-edge differentiable tensor network algorithms that integrate tools from machine learning, quantum mechanics, and mathematical optimisation. The end result will be a new Julia implementation of the einsum interface and a cutting-edge package for differentiable tensor network algorithms, reproducing the results of a recent paper that represents the state of the art in infinite two-dimensional tensor networks.
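To illustrate what "differentiable" means here, the sketch below differentiates a simple einsum-style contraction using Zygote.jl for reverse-mode automatic differentiation. This is only a minimal illustration of the idea, not the planned interface: the `contract` helper and the trace-based loss are hypothetical stand-ins for the einsum contractions a tensor network package would provide.

```julia
using Zygote
using LinearAlgebra

# Hypothetical stand-in for an einsum contraction "ij,jk->ik",
# written here with a plain matrix product.
contract(A, B) = A * B

A = rand(3, 4)
B = rand(4, 3)

# A scalar "network value": the trace of the contracted pair of tensors.
loss(A, B) = tr(contract(A, B))

# Reverse-mode gradients with respect to both input tensors.
gA, gB = Zygote.gradient(loss, A, B)

# Sanity check: the analytic gradient of tr(A*B) with respect to A is B',
# so gA should match B' up to floating-point error.
@assert gA ≈ B'
```

In a full tensor network algorithm, the same mechanism would propagate gradients through many nested contractions (and, for the state-of-the-art methods the project targets, through fixed-point iterations), so that the whole network can be optimised with gradient-based methods.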