YI Yenan, BIAN Yijie. Research on question generation model based on improved attention mechanism[J]. Microelectronics & Computer, 2022, 39(4): 49-57. DOI: 10.19304/J.ISSN1000-7180.2021.1082

Research on question generation model based on improved attention mechanism

  • Question generation is a widely used natural language generation task. Most existing research adopts sequence-to-sequence models based on recurrent neural networks. Owing to the long-term dependency problem, the encoder cannot effectively capture the relational information between words when modeling a sentence. In addition, during decoding, the decoder usually uses only the single-layer or top-layer output of the encoder to compute the global attention weights, and thus cannot make full use of the syntactic and semantic information in the sentence. To address these two defects, a question generation model based on an improved attention mechanism is proposed. The model adds a self-attention mechanism to the encoder to extract the relational information between words, and uses the multi-layer outputs of the encoder to jointly compute the global attention weights when the decoder generates question words. The improved model is evaluated on the SQuAD dataset. The experimental results show that, compared with the baseline model, the improved model achieves better scores under both evaluation methods, and example analysis shows that the natural language questions generated by the improved model are of higher quality.
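The two improvements described in the abstract — self-attention inside the encoder and global attention computed jointly from multi-layer encoder outputs — can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation: the function names, the use of unprojected dot-product scores, and the choice to sum per-layer scores before the softmax are all assumptions for illustration only.

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def encoder_self_attention(X):
    """Single-head scaled dot-product self-attention over word vectors.

    X: (src_len, d) word representations.
    Returns (src_len, d) representations enriched with word-to-word
    relational information. (No learned Q/K/V projections here --
    an illustrative simplification.)
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # (src_len, src_len) pairwise scores
    weights = softmax(scores, axis=-1)   # each row attends over all words
    return weights @ X


def multilayer_global_attention(layer_outputs, decoder_state):
    """Global attention computed jointly from all encoder layers.

    layer_outputs: (num_layers, src_len, d) -- outputs of every encoder layer.
    decoder_state: (d,) -- current decoder hidden state.
    Returns the attention weights over source positions and a context vector.
    One simple way to 'jointly' use the layers (assumed here) is to sum
    each position's per-layer scores before normalizing.
    """
    scores = layer_outputs @ decoder_state   # (num_layers, src_len)
    joint = scores.sum(axis=0)               # (src_len,) combined across layers
    weights = softmax(joint)                 # global attention over positions
    context = weights @ layer_outputs[-1]    # (d,) context from top layer
    return weights, context
```

In contrast, the baseline described in the abstract would compute `scores` from a single layer (e.g. `layer_outputs[-1] @ decoder_state` only), discarding the lower layers' syntactic information.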
