I haven’t tried this myself, but I did some research: SimpleTransformers has documentation on using any Transformers pretrained model for question answering: https://github.com/ThilinaRajapakse/simpletransformers#question-answering.
I just filed a bug because some of the example code isn’t working. For now, on Colab I had no errors with this:
!pip install simpletransformers==0.25.0
from simpletransformers.question_answering import QuestionAnsweringModel
model = QuestionAnsweringModel('bert', 'monsoon-nlp/hindi-bert')
I haven’t tested accuracy or output beyond this point. Please let me know if you get some results, good or bad.
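If you want to go one step further, SimpleTransformers takes SQuAD-style input for prediction. Here’s a minimal sketch based on their README; the Hindi question/context strings are placeholder examples I made up, and the exact return format of predict() varies between versions (newer releases return an (answers, probabilities) tuple):

to_predict = [
    {
        'context': 'ताजमहल आगरा में स्थित है।',  # placeholder context
        'qas': [{'question': 'ताजमहल कहाँ स्थित है?', 'id': '0'}],  # placeholder question
    }
]
answers = model.predict(to_predict)
print(answers)

Keep in mind the model probably still needs fine-tuning on SQuAD-format QA data (via train_model()) before the answers mean much.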
If you end up coding at the Transformers library level, they have a question-answering pipeline example (for English), and there’s this Kaggle notebook: https://www.kaggle.com/pernelkanic/fast-bert-question-answering-1-paper-s. It looks like SimpleTransformers and these examples all build on the AutoModelForQuestionAnswering classes (TFAutoModelForQuestionAnswering on the TensorFlow side) internally.
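For reference, the pipeline route would look roughly like this. This is a sketch assuming the standard Transformers question-answering pipeline API, reusing the same placeholder Hindi strings as above; I haven’t checked whether hindi-bert gives sensible answers this way:

from transformers import pipeline

# Build a QA pipeline around the Hindi model
qa = pipeline(
    'question-answering',
    model='monsoon-nlp/hindi-bert',
    tokenizer='monsoon-nlp/hindi-bert',
)

result = qa(
    question='ताजमहल कहाँ स्थित है?',  # placeholder question
    context='ताजमहल आगरा में स्थित है।',  # placeholder context
)
print(result)  # dict with 'score', 'start', 'end', 'answer'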