Hello, I'm Peizhong Ju, an Assistant Professor in the Department of Computer Science at the University of Kentucky. I earned my Ph.D. from Purdue University in 2021 and my B.S. from Peking University in 2016.
My research interests include machine learning, smart grids, optimization, and wireless communication. My recent work focuses on theory that explains the performance of machine learning models; before that, I worked on power systems and wireless communication. Broadly speaking, the goal of my research is to use rigorous mathematical analysis (including probability theory, optimization, game theory, and random matrix theory) to understand the fundamental limits of complex systems under uncertainty and/or disturbance.
You can contact me at peizhong.ju@uky.edu.
My CV (PDF) is available here (last updated: March 2, 2025).
Check out my Google Scholar profile for a list of my publications.
@article{li2024find, title={How to Find the Exact Pareto Front for Multi-Objective MDPs?}, author={Li, Yining and Ju, Peizhong and Shroff, Ness B}, journal={arXiv preprint arXiv:2410.15557}, year={2024} }
@article{liang2024theory, title={Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers}, author={Liang, Yuchen and Ju, Peizhong and Liang, Yingbin and Shroff, Ness}, journal={arXiv preprint arXiv:2410.13746}, year={2024} }
@article{liang2024broadening, title={Broadening target distributions for accelerated diffusion models via a novel analysis approach}, author={Liang, Yuchen and Ju, Peizhong and Liang, Yingbin and Shroff, Ness}, journal={arXiv preprint arXiv:2402.13901}, year={2024} }
@article{xu2024psmgd, title={PSMGD: Periodic Stochastic Multi-Gradient Descent for Fast Multi-Objective Optimization}, author={Xu, Mingjing and Ju, Peizhong and Liu, Jia and Yang, Haibo}, journal={arXiv preprint arXiv:2412.10961}, year={2024} }
@inproceedings{ju2024can, title={Can We Theoretically Quantify the Impacts of Local Updates on the Generalization Performance of Federated Learning?}, author={Ju, Peizhong and Yang, Haibo and Liu, Jia and Liang, Yingbin and Shroff, Ness}, booktitle={Proceedings of the Twenty-fifth International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing (MobiHoc'24)}, pages={141--150}, year={2024} }
@inproceedings{li2024efficient, title={Efficient Multi-dimensional Compression for Network-edge Classification}, author={Li, Chengzhang and Ju, Peizhong and Eryilmaz, Atilla and Shroff, Ness}, booktitle={Proceedings of the Twenty-fifth International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing (MobiHoc'24)}, pages={91--100}, year={2024} }
@inproceedings{ju2024achieving, title={Achieving Fairness in Multi-Agent {MDP} Using Reinforcement Learning}, author={Peizhong Ju and Arnob Ghosh and Ness Shroff}, booktitle={The Twelfth International Conference on Learning Representations}, year={2024}, url={https://openreview.net/forum?id=yoVq2BGQdP} }
@inproceedings{liachieving, title={Achieving Sample and Computational Efficient Reinforcement Learning by Action Space Reduction via Grouping}, author={Li, Yining and Ju, Peizhong and Shroff, Ness}, booktitle={The Twelfth International Conference on Learning Representations}, year={2024}, url={https://openreview.net/forum?id=MOmqfJovQ6} }
@inproceedings{lin2023theory, title={Theory on forgetting and generalization of continual learning}, author={Lin, Sen and Ju, Peizhong and Liang, Yingbin and Shroff, Ness}, booktitle={International Conference on Machine Learning}, pages={21078--21100}, year={2023}, organization={PMLR} }
@inproceedings{jutheoretical, title={Theoretical Characterization of the Generalization Performance of Overfitted Meta-Learning}, author={Ju, Peizhong and Liang, Yingbin and Shroff, Ness}, booktitle={The Eleventh International Conference on Learning Representations}, year={2023}, url={https://openreview.net/forum?id=Jifob4dSh99} }
@article{ju2022generalization, title={On the generalization power of the overfitted three-layer neural tangent kernel model}, author={Ju, Peizhong and Lin, Xiaojun and Shroff, Ness}, journal={Advances in Neural Information Processing Systems}, volume={35}, pages={26135--26146}, year={2022} }
@inproceedings{ju2022distribution, title={Distribution-level markets under high renewable energy penetration}, author={Ju, Peizhong and Lin, Xiaojun and Huang, Jianwei}, booktitle={Proceedings of the Thirteenth ACM International Conference on Future Energy Systems}, pages={127--156}, year={2022} }
@inproceedings{ju2021generalization, title={On the generalization power of overfitted two-layer neural tangent kernel models}, author={Ju, Peizhong and Lin, Xiaojun and Shroff, Ness}, booktitle={International Conference on Machine Learning}, pages={5137--5147}, year={2021}, organization={PMLR} }
@article{ju2020overfitting, title={Overfitting can be harmless for basis pursuit, but only to a degree}, author={Ju, Peizhong and Lin, Xiaojun and Liu, Jia}, journal={Advances in Neural Information Processing Systems}, volume={33}, pages={7956--7967}, year={2020} }
@inproceedings{ju2018adversarial, title={Adversarial attacks to distributed voltage control in power distribution networks with DERs}, author={Ju, Peizhong and Lin, Xiaojun}, booktitle={Proceedings of the Ninth International Conference on Future Energy Systems}, pages={291--302}, year={2018} }
Feel free to reach out to me by email: peizhong.ju@uky.edu
You can also find me on Google Scholar:
https://scholar.google.com/citations?user=VDzpfOYAAAAJ&hl=en
My ORCID iD:
https://orcid.org/0000-0002-4569-3539