
Chengming Zhang, Ph.D.

Assistant Professor of Computer Science

Research Areas: Computing and Software; Artificial Intelligence

 

· Ph.D. in Computer Science, Indiana University

Dr. Zhang received his Ph.D. in Computer Engineering from Indiana University in May 2024. He works on building efficient and scalable deep learning systems. His research directions include efficient machine learning systems, efficiency-oriented algorithms, and large-scale DL/AI applications.

· Chengming Zhang, Tong Geng, Anqi Guo, Jiannan Tian, Martin Herbordt, Ang Li, and Dingwen Tao. "H-GCN: A Graph Convolutional Network Accelerator on Versal ACAP Architecture." In 2022 32nd International Conference on Field-Programmable Logic and Applications (FPL), pp. 200-208. IEEE, 2022.

· Chengming Zhang, Geng Yuan, Wei Niu, Jiannan Tian, Sian Jin, Donglin Zhuang, Zhe Jiang et al. "ClickTrain: Efficient and Accurate End-to-End Deep Learning Training via Fine-Grained Architecture-Preserving Pruning." In Proceedings of the 35th ACM International Conference on Supercomputing, pp. 266-278. 2021.

· Sian Jin, Chengming Zhang, Xintong Jiang, Yunhe Feng, Hui Guan, Guanpeng Li, Shuaiwen Leon Song, and Dingwen Tao. "COMET: A Novel Memory-Efficient Deep Learning Training Framework by Using Error-Bounded Lossy Compression." Proc. VLDB Endow. 15, 4 (December 2021), 886-899. https://doi.org/10.14778/3503585.3503597

· Guanhua Wang, Chengming Zhang, Zheyu Shen, Ang Li, and Olatunji Ruwase. "Domino: Eliminating Communication in LLM Training via Generic Tensor Slicing and Overlapping." arXiv preprint arXiv:2409.15241 (2024).

· Sam Ade Jacobs, Masahiro Tanaka, Chengming Zhang, Minjia Zhang, Shuaiwen Leon Song, Samyam Rajbhandari, and Yuxiong He. "DeepSpeed Ulysses: System Optimizations for Enabling Training of Extreme Long Sequence Transformer Models." arXiv preprint arXiv:2309.14509 (2023).