Abstract:
To address the limited ability of neural-network-based language models to learn long-distance information, a recurrent neural network language model incorporating global word vectors (GloVe) is proposed in this paper. First, global word vectors are trained with the GloVe algorithm. Second, these global word vectors are fed as feature inputs to a recurrent neural network with a feature layer. Compared with incorporating local word vectors, the GloVe-based language model captures semantic and syntactic information from global statistical information. Perplexity and continuous speech recognition experiments are performed on the Penn Treebank and Wall Street Journal corpora, respectively. The results show that the relative perplexity improvement over the conventional recurrent neural network language model reaches 202%, and the word error rate of the speech recognition system decreases by 183%.
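The architecture described above can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: it assumes an Elman-style RNN in which, at each time step, a pretrained global word vector (standing in for a GloVe embedding) is fed through an extra feature layer alongside the 1-of-N word input. All dimensions, weights, and the stand-in embedding table are hypothetical.

```python
import numpy as np

# Sketch (assumption, not the paper's code): an Elman-style RNN language model
# with a feature layer that injects a pretrained global word vector (e.g. from
# GloVe) alongside the 1-of-N word input at every time step.

rng = np.random.default_rng(0)
V, H, F = 10, 16, 8                  # vocabulary, hidden, and feature sizes (toy values)

W_xh = rng.normal(0, 0.1, (H, V))    # 1-of-N word input  -> hidden
W_fh = rng.normal(0, 0.1, (H, F))    # feature layer (global word vector) -> hidden
W_hh = rng.normal(0, 0.1, (H, H))    # recurrent hidden   -> hidden
W_hy = rng.normal(0, 0.1, (V, H))    # hidden             -> output logits

glove = rng.normal(0, 1.0, (V, F))   # stand-in for pretrained GloVe vectors

def softmax(z):
    z = z - z.max()                  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def step(word_id, h):
    """One time step: combine word input, GloVe feature, and recurrent state."""
    x = np.zeros(V)
    x[word_id] = 1.0
    h = np.tanh(W_xh @ x + W_fh @ glove[word_id] + W_hh @ h)
    p = softmax(W_hy @ h)            # distribution over the next word
    return p, h

def perplexity(word_ids):
    """Perplexity of a word-id sequence under the (untrained) model."""
    h = np.zeros(H)
    log_prob = 0.0
    for prev, nxt in zip(word_ids[:-1], word_ids[1:]):
        p, h = step(prev, h)
        log_prob += np.log(p[nxt])
    return np.exp(-log_prob / (len(word_ids) - 1))
```

In a trained model, all four weight matrices would be learned by backpropagation through time while the GloVe vectors stay fixed, so the feature layer supplies global co-occurrence statistics that the recurrent state alone would have to learn from local context.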