Welcome to Yue Dai’s Homepage

I am a final-year Ph.D. student in the Department of Computer Science at the University of Pittsburgh, advised by Dr. Youtao Zhang and Dr. Xulong Tang. I received my M.S. in Telecommunications from the University of Maryland and my B.E. in Electrical Engineering from Beihang University, China.

My research interests center on machine learning systems and accelerators, with a focus on efficient systems and robust algorithms for deep graph learning. Specifically, I work on accelerating Graph Neural Network (GNN)-based deep graph learning models across diverse hardware platforms and on studying the robustness of these applications against adversarial attacks.

🌟 I am actively seeking tenure-track faculty positions and research scientist positions in industry. Please feel free to reach out regarding any potential opportunities.

CV / Email / Google Scholar / LinkedIn

News

[10/2024] One paper is under major revision at ASPLOS2025.
[10/2024] Two papers are submitted to ICLR2025.
[08/2023] One paper is accepted by ICCD2023. Thanks to all the collaborators!
[01/2023] One paper is accepted by ICLR2023 (spotlight). Thanks to all the collaborators!
[10/2022] One paper is accepted by HPCA2023. Thanks to all the collaborators!
[08/2022] One paper is accepted by ArgMining2022 (best paper). Thanks to all the collaborators!
[06/2022] One paper is accepted by CCF Transactions on High Performance Computing. Thanks to all the collaborators!

Publications

* co‑first author

  • Yue Dai, Youtao Zhang, Xulong Tang. 2023. CEGMA: Coordinated elastic graph matching acceleration for graph matching networks. 2023 IEEE International Symposium on High‑Performance Computer Architecture (HPCA’23).

  • Yue Dai, Xulong Tang, Youtao Zhang. 2023. FlexGM: An Adaptive Runtime System to Accelerate Graph Matching Networks on GPUs. 2023 IEEE 41st International Conference on Computer Design (ICCD’23).

  • Sheng Li*, Geng Yuan*, Yue Dai*, Youtao Zhang, Yanzhi Wang, Xulong Tang. 2023. Smartfrz: An efficient training framework using attention‑based layer freezing. The 11th International Conference on Learning Representations (ICLR’23).

  • Yue Dai, Xulong Tang, Youtao Zhang. 2022. An efficient segmented quantization for graph neural networks. CCF Transactions on High Performance Computing (THPC’22), 4(4), 461‑473.

  • Zhexiong Liu*, Meiqi Guo*, Yue Dai*, Diane Litman. 2022. ImageArg: A multi‑modal tweet dataset for image persuasiveness mining. Proceedings of the 9th Workshop on Argument Mining at the International Conference on Computational Linguistics (COLING’22).

  • Sheng Li, Geng Yuan, Yawen Wu, Yue Dai, Chao Wu, Alex K Jones, Jingtong Hu, Yanzhi Wang, Xulong Tang. 2024. EdgeOL: Efficient in‑situ Online Learning on Edge Devices. arXiv preprint arXiv:2401.16694.

  • Justin Brody, Samuel Barham, Yue Dai, Christopher Maxey, Donald Perlis, David Sekora, Jared Shamwell. 2016. Reasoning with grounded self‑symbols for human‑robot interaction. 2016 AAAI Fall Symposium Series.

  • Xuejun Liu, Haiying Luan, Wenbai Chen, Yue Dai, Jiandong Liu, Bo Lan. 2014. Electrical nonlinearity pre‑compensation for CO‑OFDM system. Optik, 125(2), 616‑619.