📚 Theory · Intermediate
Sequence-to-Sequence with Attention
Sequence-to-sequence with attention lets a decoder focus on the most relevant parts of the input at each output step, rather than compressing everything into a single vector.
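The per-step focusing can be sketched as a single decoder step of dot-product attention. This is a minimal illustrative sketch in NumPy, not any particular library's API; the function name `attention_step` and the toy shapes are assumptions for the example.

```python
import numpy as np

def attention_step(encoder_states, decoder_state):
    """One decoder step of dot-product attention (illustrative sketch).

    encoder_states: (T, d) array of encoder hidden states.
    decoder_state:  (d,) current decoder hidden state (the query).
    Returns the context vector (d,) and the attention weights (T,).
    """
    scores = encoder_states @ decoder_state          # (T,) alignment scores
    scores = scores - scores.max()                   # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over input positions
    context = weights @ encoder_states               # weighted sum of encoder states: (d,)
    return context, weights

# Toy example: 4 input positions, hidden size 3.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=3)
ctx, w = attention_step(enc, dec)
```

The weights form a distribution over input positions, so the context vector changes at every output step instead of the decoder relying on one fixed summary of the input.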
#sequence-to-sequence #attention #encoder-decoder