Contributor
Saurabh Suresh Powar

Optimizers in a deep learning framework


Mentors
Milan Curcic, Jeremie Vandenplas, Umashankar Sivakumar, Federico Perini, Henil Panchal, Damian Rouson
Organization
Fortran-lang
Technologies
Linear Algebra, Fortran-lang
Topics
deep learning
This project aims to enhance neural-fortran, a parallel Fortran framework for deep learning, by implementing additional optimization algorithms. The framework currently supports training and inference of dense and convolutional neural networks with stochastic and mini-batch gradient descent, but training effectively on a broader range of problems and datasets calls for optimizers such as RMSprop, Adam, and others commonly available in popular deep learning frameworks like Keras. The work involves implementing, documenting, and testing these optimizers, which will broaden the applicability of neural-fortran to a wider range of deep learning tasks. Successful implementation will give users more flexibility and choice when training their models, ultimately improving the performance and capabilities of neural-fortran.
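As an illustration of what implementing one of these optimizers involves, below is a minimal Fortran sketch of the Adam update rule. The module, subroutine, and argument names (adam_sketch, adam_update, w, g, m, v) are hypothetical and do not reflect the eventual neural-fortran interface.

module adam_sketch
  ! Minimal, illustrative Adam update; not the neural-fortran API.
  implicit none
contains
  subroutine adam_update(w, g, m, v, t, lr, beta1, beta2, eps)
    ! One Adam step: updates the weights w in place given the gradient g.
    ! m and v hold running estimates of the first and second gradient
    ! moments; t is the 1-based iteration counter used for bias correction.
    real, intent(inout) :: w(:), m(:), v(:)
    real, intent(in) :: g(:)
    integer, intent(in) :: t
    real, intent(in) :: lr, beta1, beta2, eps

    m = beta1 * m + (1 - beta1) * g     ! first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * g**2  ! second moment (uncentered variance)
    ! Bias-corrected moments enter the parameter update:
    w = w - lr * (m / (1 - beta1**t)) / (sqrt(v / (1 - beta2**t)) + eps)
  end subroutine adam_update
end module adam_sketch

In use, such a routine would be called once per mini-batch with m and v initialized to zero, typically with the defaults from the original Adam paper (lr = 0.001, beta1 = 0.9, beta2 = 0.999, eps = 1e-8).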