The course is intended as both a mathematical introduction to Convex Optimization, which underlies much of modern Machine Learning, and a practical tour of contemporary methods in large-scale optimization, including stochastic optimization and learning. We will rigorously define and study the complexity of optimization problems, providing algorithms together with upper bounds on their performance, as well as lower bound analyses. We will be particularly interested in big-data / large-scale regimes, and accordingly focus on first-order and stochastic methods.
The course will complement CS236330 "Introduction to Optimization": it will focus more on rigorous analysis of optimization problems, and thus on convex optimization only, and it will emphasize first-order, large-scale methods, encompassing also online and stochastic optimization and learning. Taking CS236330 before this course will give students a broader view and more insight into convex optimization, but it is not required---we will define and develop convex optimization from the ground up.
Pre-Requisites: Linear Algebra, Calculus 2, Algorithms 1, Probability.