
ProTEA: Programmable Transformer Encoder Acceleration on FPGA

SC-W '24: Proceedings of the SC '24 Workshops of the International Conference on High Performance Computing, Networking, Storage, and Analysis (2025)

Key words
FPGA, Transformer, Attention, Neural Networks, Encoder, High-Level Synthesis, Natural Language Processing, Hardware Accelerators