
Design of Analog-AI Hardware Accelerators for Transformer-based Language Models (Invited)

2023 International Electron Devices Meeting (IEDM), 2023

Cited: 1 | Views: 21
Keywords
In-memory computing, non-volatile memory, large language models, analog multiply-accumulate for DNN inference, analog AI, deep learning accelerator, system modeling
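For context on the keyword "analog multiply-accumulate for DNN inference": the idea is to compute matrix-vector products directly inside a non-volatile memory array, with weights stored as cell conductances, inputs applied as voltages, and outputs read as summed currents. The sketch below is a minimal, non-authoritative illustration of that concept, not the paper's actual hardware or noise model; the function names, quantization levels, and noise magnitude are assumptions chosen for clarity.

```python
import numpy as np

def quantize_to_conductance(weights, g_max=1.0, levels=256):
    """Map signed weights onto a bounded, discretized conductance range.

    Returns the quantized conductances and the scale factor needed to map
    bit-line currents back to the original weight units.
    (Illustrative only: g_max and levels are assumed values.)
    """
    w_max = float(np.max(np.abs(weights))) or 1.0
    g = weights / w_max * g_max                 # scale weights into [-g_max, g_max]
    step = 2.0 * g_max / (levels - 1)           # uniform conductance quantization
    g = np.round(g / step) * step
    return g, w_max / g_max

def analog_mac(weights, x, read_noise_sigma=0.01, rng=None):
    """One analog multiply-accumulate: y ~= W @ x computed as current summation,
    with conductance quantization and additive read noise as toy non-idealities."""
    rng = np.random.default_rng() if rng is None else rng
    g, scale = quantize_to_conductance(weights)
    currents = g @ x                            # Kirchhoff-law summation along bit lines
    currents += rng.normal(0.0, read_noise_sigma, size=currents.shape)
    return currents * scale                     # convert currents back to weight units

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((4, 8))             # small example layer
    x = rng.standard_normal(8)                  # input activation vector
    print("ideal digital MAC :", W @ x)
    print("noisy analog MAC  :", analog_mac(W, x, rng=rng))
```

Running the example shows the analog result tracking the ideal digital product up to quantization and read-noise error, which is the trade-off that analog-AI accelerator design and system modeling aim to manage.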