21 October 2019

Alternative Sequence Models

  • Attention - a mechanism added to recurrent and other networks to guide where they focus in the input (see the sketch after this list)
  • Transformers - networks that rely entirely on attention, with no recurrent or convolutional layers
  • Temporal Convolutional Networks - CNNs designed for sequences
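
A minimal sketch of the scaled dot-product attention at the core of Transformers, assuming NumPy; the function name, shapes, and toy inputs here are illustrative, not taken from any particular library:

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
        d_k = Q.shape[-1]
        # Similarity of every query to every key, scaled so the
        # softmax stays well-behaved as d_k grows.
        scores = Q @ K.T / np.sqrt(d_k)
        # Softmax over keys: attention weights sum to 1 per query.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Each output row is a weighted average of the value rows,
        # i.e. the "focus" described in the first bullet above.
        return weights @ V

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
    K = rng.normal(size=(6, 8))   # 6 key positions
    V = rng.normal(size=(6, 8))   # 6 value rows, d_v = 8
    out = scaled_dot_product_attention(Q, K, V)   # shape (4, 8)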

Attention Is All You Need