Linear Attention is (maybe) All You Need (to Understand Transformer Optimization)

ICLR 2024

Cited 53 | Views 62
Key words
Transformer, optimization, Adam, clipping, heavy-tailed noise, directional smoothness