Neural Machine Translation Attention

less than 1 minute read

Neural Machine Translation by Jointly Learning to Align and Translate.

| Color  | Meaning                      |
|--------|------------------------------|
| Red    | Notes, Views, Understanding  |
| Yellow | Current paper methods, topics |
| Green  | References to other papers   |
| Purple | Doubts, Questions, Issues    |

The paper demonstrates the use of attention with a Seq-to-Seq encoder-decoder architecture, training the pipeline end to end for better machine translation and mainly addressing the problem of long-range dependencies. This was the first paper to use an attention module with a Seq-to-Seq architecture.
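The core idea can be sketched numerically. The decoder scores every encoder annotation h_j against its previous state s_{i-1} with a small feed-forward network, softmaxes the scores into alignment weights, and takes a weighted sum as the context vector. A minimal NumPy sketch, assuming toy dimensions and randomly initialized parameters W_a, U_a, v_a (in the paper these are learned jointly with the rest of the network):

```python
import numpy as np

rng = np.random.default_rng(0)
hidden = 4   # hidden size (toy value, assumed)
T = 5        # number of encoder annotations h_1 .. h_T

# Learned parameters of the alignment model (here: random placeholders)
W_a = rng.normal(size=(hidden, hidden))
U_a = rng.normal(size=(hidden, hidden))
v_a = rng.normal(size=(hidden,))

s_prev = rng.normal(size=(hidden,))   # previous decoder state s_{i-1}
H = rng.normal(size=(T, hidden))      # encoder annotations h_j, one row per source position

# Additive attention score: e_ij = v_a^T tanh(W_a s_{i-1} + U_a h_j)
e = np.tanh(s_prev @ W_a.T + H @ U_a.T) @ v_a

# Alignment weights: alpha_ij = softmax_j(e_ij)
alpha = np.exp(e - e.max())
alpha /= alpha.sum()

# Context vector: c_i = sum_j alpha_ij h_j
c = alpha @ H

print(alpha.sum())  # weights over the source positions sum to 1
```

Because the context vector is recomputed for every target word, the decoder can attend to different source positions at each step instead of squeezing the whole sentence into one fixed vector, which is what breaks down on long sentences.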



Citation

@misc{bahdanau2016neural,
      title={Neural Machine Translation by Jointly Learning to Align and Translate},
      author={Dzmitry Bahdanau and Kyunghyun Cho and Yoshua Bengio},
      year={2016},
      eprint={1409.0473v7},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}