Transformer with Attention Mechanism for Long-context QA
- Mentors
- Bharat, nausheenfatma, Rricha Jalota
- Organization
- DBpedia
In this GSoC project, I employ a Transformer-based language model with an attention mechanism to automatically learn query templates for the NSpM neural question answering model over knowledge bases. My ultimate goal is to train the attention-based NSpM model on DBpedia and evaluate it against the QALD benchmark.
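As a rough illustration of the attention mechanism at the heart of the Transformer, below is a minimal NumPy sketch of scaled dot-product attention. The function name and toy dimensions are illustrative only; this is not code from the NSpM project itself.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ V, weights

# Toy example: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.shape)  # (3, 8) (3, 4)
```

Each output row is a weighted mixture of the value vectors, where the weights reflect how relevant each input position is to the query; stacking such layers is what lets the model relate question tokens to slots in a query template.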