Mingyi Hong

Assistant Professor
Electrical and Computer Engineering
University of Minnesota
6-109 Keller Hall
University of Minnesota, Minneapolis, MN 55455
Google Scholar Citations
Biographical Sketch, [Curriculum Vitae]
Email: mhong at umn.edu

Research Interests

My research focuses on contemporary issues in machine learning, optimization, information processing, and wireless networking.

See here for our publications, and here for our current projects.

Teaching

Special Note

We are looking for students who have a strong mathematical background and are interested in theoretical aspects of large-scale optimization, big data analytics, and/or applications in machine learning, signal processing, and networking. Please drop us a line if you are interested in joining our group. You will be placed in either IMSE or ECpE depending on your interest/background. Most importantly, convince us that you have the potential!

RA and postdoctoral positions available

We have research assistant and postdoctoral fellow positions available. If you are interested, please contact Dr. Hong via email.

Group News

  • Feb. 2019 working paper: our work (with Songtao, Ioanis and Yongxin) entitled “Hybrid Block Successive Approximation for One-Sided Non-Convex Min-Max Problems: Algorithms and Applications” has been posted; available at [arXiv];

  • Feb. 2019 three papers accepted by ICASSP 2019;

  • Jan. 2019 paper accepted: our work (with Davood) entitled “Perturbed Proximal Primal Dual Algorithm for Nonconvex Nonsmooth Optimization” has been accepted by MP Series B;

  • Dec. 2018 paper accepted: our work (with Xiangyi, Sijia and Ruoyu) entitled “On the Convergence of A Class of Adam-Type Algorithms for Non-Convex Optimization” has been accepted by ICLR 2019; available at [openreview.net];

This paper studies a class of adaptive gradient-based momentum algorithms that update the search directions and learning rates simultaneously using past gradients. We develop an analysis framework and a set of mild sufficient conditions that guarantee the convergence of the Adam-type methods, with a convergence rate of order O(log(T)/sqrt(T)) for non-convex stochastic optimization. We show that the conditions are essential by identifying concrete examples in which violating the conditions makes an algorithm diverge. Besides providing one of the first comprehensive analyses of Adam-type methods in the non-convex setting, our results can also help practitioners easily monitor the progress of algorithms and determine their convergence behavior.
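For readers unfamiliar with this family of methods, below is a minimal, illustrative sketch (not the paper's exact algorithm class or conditions) of a generic Adam-type update in Python/NumPy, in which both the momentum direction and the per-coordinate learning rate are built from past gradients; the function names, hyperparameter values, and the toy least-squares problem are hypothetical placeholders.

```python
# Illustrative sketch only: a generic Adam-type update combining a momentum
# estimate (from past gradients) with a coordinate-wise adaptive step size
# (from past squared gradients). Names and the toy problem are assumptions.
import numpy as np

def adam_type_step(x, m, v, grad, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam-type update: search direction and learning rate both use past gradients."""
    m = beta1 * m + (1 - beta1) * grad           # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias corrections
    v_hat = v / (1 - beta2 ** t)
    x = x - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-coordinate effective step size
    return x, m, v

# Toy stochastic least-squares problem, used only to exercise the update rule.
rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
x = rng.normal(size=5)
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 2001):
    a = rng.normal(size=5)                       # random data sample
    grad = 2 * (a @ x - a @ w_true) * a          # stochastic gradient of (a.x - a.w_true)^2
    x, m, v = adam_type_step(x, m, v, grad, t)
print("distance to w_true:", np.linalg.norm(x - w_true))
```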

  • Dec. 2018 paper accepted: our work (with Xiangyi, Sijia and Ping-Yu) entitled “signSGD via Zeroth-Order Oracle” has been accepted by ICLR 2019; available at [openreview.net];

See here for more of our past news and activities.
