LI Hua, QU Dan, ZHANG Wen-Lin, WANG Bing-Xi, LIANG Yu-Long. Recurrent Neural Network Language Model with Global Word Vector Features[J]. JOURNAL OF SIGNAL PROCESSING, 2016, 32(6): 715-723. DOI: 10.16798/j.issn.1003-0530.2016.06.010

Recurrent Neural Network Language Model with Global Word Vector Features

  • To address the limited ability of neural network language models to learn long-distance information, this paper proposes a recurrent neural network language model that incorporates global word vectors (GloVe). First, global word vectors are trained with the GloVe algorithm. Second, these vectors are fed as feature inputs to a recurrent neural network with a feature layer (sketched below). Compared with a model incorporating local word vectors, the GloVe-based language model captures semantic and syntactic information from global co-occurrence statistics. Perplexity and continuous speech recognition experiments are performed on the Penn Treebank and Wall Street Journal corpora, respectively. The results show a relative perplexity improvement of 20.2% over the conventional recurrent neural network language model and an 18.3% reduction in the word error rate of the speech recognition system.
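For readers unfamiliar with the first step, the GloVe algorithm (Pennington et al., 2014) fits word vectors w_i and context vectors w̃_j to the global co-occurrence matrix X by minimizing a weighted least-squares objective:

```latex
J = \sum_{i,j=1}^{V} f(X_{ij})\left( w_i^{\top}\tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^{2},
\qquad
f(x) =
\begin{cases}
(x/x_{\max})^{\alpha} & x < x_{\max} \\
1 & \text{otherwise}
\end{cases}
```

The second step can be pictured with the minimal NumPy sketch below. It is not the authors' implementation: it follows the Mikolov-style RNNLM with a feature layer that the abstract references, where the current word's GloVe vector feeds both the hidden and output layers. All dimensions, weight initializations, and the toy word-id sequence are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: V = vocabulary, H = hidden units, D = GloVe dimension.
V, H, D = 10, 16, 8

# Pretrained GloVe vectors would normally be loaded from disk;
# random values stand in for them here.
glove = rng.normal(scale=0.1, size=(V, D))

# Weights of an RNNLM with an extra feature layer: the feature vector
# feeds both the hidden layer (F) and the output layer (G).
U  = rng.normal(scale=0.1, size=(H, V))   # input word (one-hot) -> hidden
W  = rng.normal(scale=0.1, size=(H, H))   # previous hidden -> hidden
F  = rng.normal(scale=0.1, size=(H, D))   # feature -> hidden
Vo = rng.normal(scale=0.1, size=(V, H))   # hidden -> output
G  = rng.normal(scale=0.1, size=(V, D))   # feature -> output

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def step(word_id, h_prev):
    """One forward step: predict a distribution over the next word."""
    x = np.zeros(V)
    x[word_id] = 1.0                       # one-hot current word
    f = glove[word_id]                     # GloVe feature vector of current word
    h = 1.0 / (1.0 + np.exp(-(U @ x + W @ h_prev + F @ f)))  # sigmoid hidden
    y = softmax(Vo @ h + G @ f)            # next-word distribution
    return y, h

# Perplexity of a toy word-id sequence under the (untrained) model,
# matching the paper's evaluation metric.
sentence = [3, 1, 4, 1, 5]
h = np.zeros(H)
log_prob = 0.0
for cur, nxt in zip(sentence[:-1], sentence[1:]):
    y, h = step(cur, h)
    log_prob += np.log(y[nxt])
ppl = np.exp(-log_prob / (len(sentence) - 1))
print(f"perplexity: {ppl:.2f}")
```

Training the weights (e.g., by backpropagation through time) and plugging the resulting model into a speech recognizer for rescoring would follow the usual RNNLM recipe; the sketch only shows where the global word vector enters the network.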
