Basic Information
Bio
My current research centers on large language models (LLMs) and is dedicated to more efficient and performant learning with limited supervision. In particular, I explore novel learning paradigms in which humans and machines interact more effectively. I also investigate learning-objective design to improve models trained with noisy or limited supervision, and I am interested in strategically selecting data to maximize annotation efficiency.
Research Interests
Papers (18 in total)
Rushi Qiang, Yuchen Zhuang, Yinghao Li, Dingu Sagar V K, Rongzhi Zhang, Changhao Li, Ian Shu-Hei Wong, Sherry Yang, Percy Liang, Chao Zhang, Bo Dai
arXiv (2025)
Cited: 0
Yuchen Zhuang, Jingfeng Yang, Haoming Jiang, Xin Liu, Kewei Cheng, Sanket Lokegaonkar, Yifan Gao, Qing Ping, Tianyi Liu, Binxuan Huang, Zheng Li, Zhengyang Wang, Pei Chen, Ruijie Wang, Rongzhi Zhang, Nasser Zalmout, Priyanka Nigam, Bing Yin, Chao Zhang
North American Chapter of the Association for Computational Linguistics, pp. 6041-6068 (2025)
Cited: 0
Annual Meeting of the Association for Computational Linguistics, pp. 15623-15636 (2024)
Annual Meeting of the Association for Computational Linguistics, pp. 15992-16030 (2024)
ICLR 2024 (2024)
COLM 2024 (2024)
NeurIPS 2024 (2024)
Cited: 4
ICLR 2023 (2023)
Cited: 0
Author Statistics
#Papers: 18
#Citations: 284
H-Index: 7
G-Index: 13
Sociability: 4
Diversity: 1
Activity: 13
Co-Author
Co-Institution
D-Core
- Collaborator
- Student
- Advisor
Data Disclaimer
The data on this page come from open Internet sources, cooperating publishers, and automated AI-based analysis. We make no commitments or guarantees regarding the validity, accuracy, correctness, reliability, completeness, or timeliness of the data. If you have any questions, please contact us by email: report@aminer.cn