Machine Learning

Synopsis

Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for a reassessment of existing assumptions. Established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods are being revisited in novel contexts, and attention is also turning to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods.

(Image source: closeupengineering; text adapted from Sra, Suvrit, Sebastian Nowozin, and Stephen J. Wright, Optimization for Machine Learning, MIT Press, 2012.)
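
The synopsis above mentions first-order and proximal methods. As a concrete, textbook-style illustration (a minimal sketch, not code from any of the publications below), the following NumPy snippet applies the proximal gradient method (ISTA) to the l1-regularized least-squares (lasso) problem; the function names, step-size choice, and parameter values are illustrative assumptions.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, num_iters=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA).
    x = np.zeros(A.shape[1])
    # Step size 1/L, where L = sigma_max(A)^2 is the gradient Lipschitz constant.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                          # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)   # proximal step on the l1 part
    return x

# Toy usage on a synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = proximal_gradient_lasso(A, b, lam=0.1)

The constant step size above is the simplest valid choice; line search or acceleration (FISTA) are common refinements.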

Publications

  1. Meisam Razaviyayn, Mingyi Hong, Zhi-Quan Luo, and Jong-Shi Pang, “Parallel Successive Convex Approximation for Nonsmooth Nonconvex Optimization”, Proc. NIPS 2014 (acceptance rate 24.67%); [C code]; [arXiv]

  2. Ruoyu Sun* and Mingyi Hong*, “Improved Iteration Complexity Bounds of Cyclic Block Coordinate Descent for Convex Problems”, Proc. NIPS 2015 (* equal contribution, acceptance rate 21.92%); available [here]. (A generic cyclic BCD sketch appears after this list.)

  3. Davood Hajinezhad, Mingyi Hong, Tuo Zhao, and Zhaoran Wang, “NESTT: A Nonconvex Primal-Dual Splitting Method for Distributed and Stochastic Optimization”; available [arXiv]

  4. Xingguo Li, Jarvis Haupt, Raman Arora, Han Liu, Mingyi Hong, and Tuo Zhao, “A First Order Free Lunch for SQRT-Lasso”; available [arXiv]

  5. Qingjiang Shi, Haoran Sun, Songtao Lu, Mingyi Hong, and Meisam Razaviyayn, “Inexact Block Coordinate Descent Methods for Symmetric Nonnegative Matrix Factorization”

  6. Brendan Ames and Mingyi Hong, “Alternating Direction Method of Multipliers for Sparse Zero-Variance Discriminant Analysis and Principal Component Analysis”, to appear in Computational Optimization and Applications; [R code]; [MATLAB code]; [arXiv]. (A generic ADMM sketch appears after this list.)

  7. Xiao Fu, Kejun Huang, Mingyi Hong, Nicholas D. Sidiropoulos, and Anthony Man-Cho So, “Scalable and Optimal Generalized Canonical Correlation Analysis via Alternating Optimization”
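
Entries 2 and 5 study (block) coordinate descent methods. As a generic illustration only, not the algorithms analyzed in those papers, here is a minimal cyclic coordinate-minimization sketch for a strongly convex quadratic, the simplest setting where each exact coordinate update has a closed form; the toy problem and all names are illustrative.

import numpy as np

def cyclic_bcd_quadratic(Q, b, num_epochs=500):
    # Minimize f(x) = 0.5 x^T Q x - b^T x (Q symmetric positive definite)
    # by exact cyclic coordinate minimization (Gauss-Seidel sweeps).
    # Single coordinates are the simplest block choice; larger blocks work similarly.
    n = Q.shape[0]
    x = np.zeros(n)
    for _ in range(num_epochs):
        for i in range(n):  # sweep through coordinates in a fixed cyclic order
            residual = b[i] - Q[i] @ x + Q[i, i] * x[i]  # b_i minus off-diagonal terms
            x[i] = residual / Q[i, i]                    # exact minimizer over coordinate i
    return x

# Toy usage: the minimizer solves Qx = b.
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 20))
Q = M.T @ M + 20 * np.eye(20)   # well-conditioned SPD matrix
b = rng.standard_normal(20)
x = cyclic_bcd_quadratic(Q, b)
assert np.allclose(Q @ x, b, atol=1e-6)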
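
Similarly, entry 6 builds on the alternating direction method of multipliers (ADMM). The sketch below shows the standard textbook ADMM iteration (scaled dual form) applied to the lasso problem, not the sparse discriminant analysis formulation of that paper; the penalty parameter rho and the fixed iteration count are illustrative choices.

import numpy as np

def admm_lasso(A, b, lam, rho=1.0, num_iters=200):
    # Minimize 0.5*||Ax - b||^2 + lam*||z||_1 subject to x = z, via ADMM.
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)  # u is the scaled dual variable
    # Cache the Cholesky factor reused by every x-update.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(num_iters):
        # x-update: ridge-type linear solve via the cached factorization.
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: soft-thresholding, the proximal operator of the l1 term.
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        # Dual update: gradient ascent on the consensus constraint x = z.
        u = u + x - z
    return z

# Toy usage.
rng = np.random.default_rng(2)
A = rng.standard_normal((40, 80))
b = rng.standard_normal(40)
z_hat = admm_lasso(A, b, lam=0.5)

Caching the Cholesky factorization keeps the per-iteration cost to two triangular solves, which is the usual reason ADMM is attractive for problems with a fixed quadratic term.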