KL-DNAS: Knowledge Distillation-Based Latency Aware-Differentiable Architecture Search for Video Motion Magnification
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2024)
Keywords
Knowledge distillation (KD), latency aware, motion magnification, motion manipulation, neural architecture search (NAS)