About Me
I am a final-year Ph.D. candidate at UC San Diego. Previously, I received my bachelor's degree in Electronic Engineering from Shanghai Jiao Tong University. My research focuses on large language model efficiency, including distributed training acceleration and algorithm–system co-design for scalable and secure LLM applications. I am honored to be recognized as a 2025 Machine Learning and Systems Rising Star.
Experience
Work
PyTorch Compiler @ Meta
Research Intern
Compiler optimization passes for SimpleFSDP and AutoParallel
Education
Shanghai Jiao Tong University
B.E. in Electronic Engineering
Research
My research focuses on building scalable, efficient, and trustworthy systems. I work across the full stack, from algorithm–system co-design to co-optimization, to enable secure and safe AI. For a full publication list, please check out my Google Scholar.
Scalable Computing
AdaGL: Adaptive Learning for Agile Distributed Training of Gigantic GNNs
Provenance of AI-Generated Content
REMARK-LLM: A Robust and Efficient Watermarking Framework for Generative Large Language Models
Robust Zero Knowledge Verifiable Watermarking of Code LLMs with ML/Crypto Co-Design
Edge AI IP Protection
EmMark: Robust Watermarks for IP Protection of Embedded Quantized Large Language Models
AttestLLM: Efficient Attestation Framework for Billion-scale On-device LLMs
Security of Chip Design
Awards
- 2025 Machine Learning and Systems Rising Star
- 2024 Qualcomm Innovation Fellowship Finalist
- 2023 DAC Young Fellow
- 2021 ECE Department Fellowship at UC San Diego
- 2018-2020 Academic Excellence Scholarship at SJTU
Service
- Conference Reviewer: ICCV, CVPR, ICASSP, ICML, EMNLP, ACL, IJCNN
- Journal Reviewer: IEEE TDSC, IEEE TCAD, IEEE TNNLS, IEEE TIFS
- AE Committee: CCS, NDSS
Misc
I keep a (non-exhaustive) reading list of research papers, talks, and books that I enjoy.
Fun facts about me: