Mabble Rabble
random ramblings & thunderous tidbits
21 October 2019
Alternative Sequences
Attention
- a mechanism, originally bolted onto recurrent encoder-decoders, that gives a network a soft lookup over its inputs so it can focus on the most relevant parts when producing each output (sketch below)
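A minimal NumPy sketch of the core operation, scaled dot-product attention: each query scores every key, the scores are softmaxed into focus weights, and those weights pool the values. Function name and shapes here are illustrative, not from any particular library.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight the values V by how well each query in Q matches each key in K."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # similarity of every query to every key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax -> where to "focus"
    return weights @ V                               # weighted sum of values

# Toy example: 3 queries attending over a memory of 4 slots, width 8
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)   # (3, 8)
```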
Transformers
- networks built entirely from attention and feed-forward layers, dispensing with recurrence and convolutions; a minimal example follows
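For a concrete feel, here is a minimal sketch assuming PyTorch is available: one encoder block of the kind the paper stacks, i.e. self-attention plus a feed-forward network, each wrapped in residual connections and layer norm. The sizes (d_model=512, 8 heads) match the paper's base model, but the tensor shapes are just an example.

```python
import torch
import torch.nn as nn

# One encoder block: self-attention + feed-forward, with residuals and layer norm
block = nn.TransformerEncoderLayer(d_model=512, nhead=8, dim_feedforward=2048)

x = torch.randn(10, 32, 512)   # (sequence length, batch, embedding dim)
y = block(x)                   # same shape out: attention mixes the 10 positions
print(y.shape)                 # torch.Size([10, 32, 512])
```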
Temporal Convolutional Networks
- CNNs adapted to sequence modeling: stacks of causal, dilated 1-D convolutions whose receptive field grows exponentially with depth (sketch below)
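A minimal sketch of the building block TCNs stack, again assuming PyTorch (the class name CausalConv1d is mine): padding only on the left keeps each output from seeing future timesteps, and doubling the dilation at each layer grows the receptive field exponentially.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    """1-D convolution that only looks at past timesteps (left padding only)."""
    def __init__(self, channels, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                 # x: (batch, channels, time)
        x = F.pad(x, (self.pad, 0))       # pad the past, never the future
        return self.conv(x)

# Stack with doubling dilations -> exponentially growing receptive field
tcn = nn.Sequential(*[CausalConv1d(16, kernel_size=3, dilation=2**i) for i in range(4)])
x = torch.randn(1, 16, 100)
print(tcn(x).shape)                       # torch.Size([1, 16, 100]) -- length preserved, causal
```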
Reference: Vaswani et al., "Attention Is All You Need" (2017), https://arxiv.org/abs/1706.03762