Distractor Generation Using T5
In this paper we generate distractor answers for multiple-choice questions given a reference text, a question, and the correct answer. We fine-tune a transformer (T5) on the distractor generation task with an additional term in the loss function that penalizes distractors similar to the correct answer.
Fernando Gonzalez and Kevin Golan
Final project, Natural Language Processing @ ETH Zurich 2020
Paper | Code
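The penalized loss described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the similarity measure (token-overlap Jaccard similarity), the weighting factor `lam`, and the function names are all assumptions, since the abstract does not specify how distractor-answer similarity is computed.

```python
def jaccard_similarity(a: str, b: str) -> float:
    """Token-overlap similarity between two strings.

    Hypothetical choice of similarity measure; the paper does not
    specify which measure it uses in the penalty term.
    """
    sa, sb = set(a.lower().split()), set(b.lower().split())
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)


def penalized_loss(ce_loss: float, distractor: str, answer: str,
                   lam: float = 0.5) -> float:
    """Standard cross-entropy loss plus a penalty that grows as the
    generated distractor becomes more similar to the correct answer."""
    return ce_loss + lam * jaccard_similarity(distractor, answer)


# A distractor identical to the answer incurs the maximum penalty,
# while a dissimilar one is penalized only lightly.
base_ce = 2.0
loss_close = penalized_loss(base_ce, "the mitochondria", "the mitochondria")
loss_far = penalized_loss(base_ce, "the nucleus", "the mitochondria")
```

In training, this penalty would be added to the token-level cross-entropy of the T5 decoder, discouraging the model from producing distractors that merely paraphrase the correct answer.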