In this GSoC project, I use a Transformer language model with an attention mechanism to automatically discover query templates for a neural question-answering model over a knowledge base (NSpM). My ultimate goal is to train the attention-based NSpM model on DBpedia and evaluate it against the QALD benchmark.

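For illustration, below is a minimal sketch of the kind of model the proposal refers to: a small Transformer encoder-decoder (whose built-in attention layers provide the attention mechanism) that maps question tokens to SPARQL-template tokens. It uses PyTorch's nn.Transformer; the class name QuestionToSparql, the vocabulary sizes, dimensions, and toy tensors are illustrative assumptions, not the actual NSpM/DBpedia pipeline.

```python
# Illustrative sketch only: a tiny Transformer seq2seq for question -> SPARQL template.
# Vocabularies, sizes, and data here are placeholders, not the NSpM training setup.
import torch
import torch.nn as nn


class QuestionToSparql(nn.Module):
    """Minimal Transformer encoder-decoder mapping question tokens to template tokens."""

    def __init__(self, src_vocab, tgt_vocab, d_model=128, nhead=4,
                 num_layers=2, max_len=64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d_model)
        self.tgt_emb = nn.Embedding(tgt_vocab, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)   # learned positional embeddings
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            dim_feedforward=4 * d_model, batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def _embed(self, emb, tokens):
        positions = torch.arange(tokens.size(1), device=tokens.device)
        return emb(tokens) + self.pos_emb(positions)

    def forward(self, src, tgt):
        # Causal mask so each target position attends only to earlier positions.
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(
            tgt.size(1)).to(src.device)
        hidden = self.transformer(self._embed(self.src_emb, src),
                                  self._embed(self.tgt_emb, tgt),
                                  tgt_mask=tgt_mask)
        return self.out(hidden)                         # (batch, tgt_len, tgt_vocab)


if __name__ == "__main__":
    # Toy batch: 2 questions of length 10, 2 SPARQL templates of length 12.
    model = QuestionToSparql(src_vocab=1000, tgt_vocab=500)
    src = torch.randint(0, 1000, (2, 10))
    tgt = torch.randint(0, 500, (2, 12))
    logits = model(src, tgt[:, :-1])                    # teacher forcing
    loss = nn.CrossEntropyLoss()(logits.reshape(-1, 500), tgt[:, 1:].reshape(-1))
    print(logits.shape, loss.item())
```

In the real project the same idea applies end to end: natural-language questions and their SPARQL templates form the parallel corpus, and the attention weights let the decoder focus on the question tokens that determine each slot of the generated template.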
Organization

DBpedia
Student

Stuart Chan

Mentors

  • Bharat
  • nausheenfatma
  • Rricha Jalota

Year

2019