Phrase Representation Learning
- Learning Phrase Representations Using RNN Encoder–Decoder for Statistical Machine Translation
- two recurrent neural networks (an encoder and a decoder) that together learn to map a sequence of arbitrary length to another sequence of arbitrary length, possibly over a different vocabulary
- either score a pair of sequences (in terms of a conditional probability) or generate a target sequence given a source sequence
- jointly trained to maximize the conditional probability of a target sequence given a source sequence
- a reset gate and an update gate that adaptively control how much each hidden unit remembers or forgets while reading/generating a sequence
- RNN Encoder–Decoder to score each phrase pair in the phrase table
- the learned representations capture linguistic regularities in the phrase pairs well
- adding the RNN Encoder–Decoder scores as an extra feature improves the BLEU score of the phrase-based SMT baseline
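The reset/update gating described above can be sketched in a few lines of NumPy. This is a minimal illustration of the gated hidden unit, not the paper's implementation; the weight names and shapes are assumptions for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One step of the gated hidden unit: the reset gate r decides how
    much of the previous state feeds the candidate, and the update gate z
    decides how much old state to keep (remember vs. forget)."""
    Wz, Uz, Wr, Ur, W, U = params
    z = sigmoid(Wz @ x + Uz @ h_prev)            # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)            # reset gate
    h_tilde = np.tanh(W @ x + U @ (r * h_prev))  # candidate state
    return z * h_prev + (1.0 - z) * h_tilde      # interpolate old and new

# Illustrative dimensions and random weights.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = [rng.normal(size=(d_h, d_in)) if i % 2 == 0
          else rng.normal(size=(d_h, d_h)) for i in range(6)]

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):  # read a short input sequence
    h = gru_step(x, h, params)
```

Because each new state is a convex combination of the previous state and a tanh candidate, the hidden values stay bounded in (-1, 1).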
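The "score a pair of sequences" use can also be sketched end to end: encode the source into a fixed summary vector, then accumulate the decoder's log-probabilities of the target tokens. This sketch uses plain tanh RNN cells and untrained random weights purely to show the mechanics; all names and shapes are illustrative assumptions.

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def encode(src_ids, E, W, U):
    """Run an RNN over the source; the final state c summarizes it."""
    h = np.zeros(U.shape[0])
    for i in src_ids:
        h = np.tanh(W @ E[i] + U @ h)
    return h

def score(src_ids, tgt_ids, E, W, U, Wd, Ud, C, V):
    """Sum of log p(y_t | y_<t, c): the conditional log-probability
    the model assigns to the target given the source."""
    c = encode(src_ids, E, W, U)
    h = np.tanh(C @ c)            # initialise the decoder from the summary
    prev = np.zeros(E.shape[1])   # embedding of a start symbol (zeros here)
    logp = 0.0
    for t in tgt_ids:
        h = np.tanh(Wd @ prev + Ud @ h + C @ c)  # c conditions every step
        p = softmax(V @ h)
        logp += np.log(p[t])
        prev = E[t]
    return logp

# Toy vocabulary and random (untrained) weights.
rng = np.random.default_rng(1)
vocab, d_e, d_h = 10, 4, 3
E = rng.normal(size=(vocab, d_e))
W, U = rng.normal(size=(d_h, d_e)), rng.normal(size=(d_h, d_h))
Wd, Ud = rng.normal(size=(d_h, d_e)), rng.normal(size=(d_h, d_h))
C, V = rng.normal(size=(d_h, d_h)), rng.normal(size=(vocab, d_h))

s = score([1, 2, 3], [4, 5], E, W, U, Wd, Ud, C, V)
```

Training maximizes this log-probability over sentence pairs; at decoding time the same quantity serves as the feature used to rescore each phrase pair in the phrase table.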