Algorithms for Continuous Optimization - 0365.4414
Lecturer: Marc Teboulle
Time and Place
Spring Semester, 2018/2019 - Place: ... Room, Sunday, 16:00-19:00
The course will provide an up-to-date introduction to modern optimization algorithms. Advances in computer technology have promoted the field of nonlinear optimization, which has today become an essential tool for solving complex scientific and engineering problems intelligently.
Graduate students (M.Sc., Ph.D.) from Computer Science, Electrical Engineering, Mathematics, and Statistics are strongly encouraged to register.
This course is a must for anyone who wants:
- to understand the mathematical foundations underlying the design and analysis of modern algorithms;
- to learn the tools for convergence and complexity analysis of algorithms;
- to solve large/huge-scale optimization problems arising in modern science and engineering.
- Formal Prerequisite: 0365-4409 -- Convex Analysis and Optimization.
A student who has had no previous exposure to continuous optimization might still be admitted to the course, provided he/she meets the following two conditions:
- has a solid mathematical background, in particular in Analysis and Linear Algebra/Matrix Theory;
- has requested and obtained formal approval from the Lecturer to attend the course.
The final grade will be based on a set of assignments/project.
Some Useful References
- There is no textbook for the course.
- The material will be based on some of the references below and on recent advances in the field from research papers.
- Some handouts and all the Lecture Notes will be distributed.
However, students are strongly encouraged to consult the following references for further reading and study. More references will also be given during the lectures.
Smooth Unconstrained Optimization:
Classical algorithms and methods of analysis. Descent methods. Line search techniques. Newton-type methods, Conjugate Gradients. Rate-of-convergence analysis.
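To give a flavor of this topic, here is a minimal sketch of a descent method with a backtracking (Armijo) line search, applied to a convex quadratic; the function names, step parameters, and test problem are illustrative, not material from the course notes.

```python
import numpy as np

def gradient_descent_backtracking(f, grad, x0, alpha=0.3, beta=0.5,
                                  tol=1e-8, max_iter=1000):
    """Minimize f by steepest-descent steps with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is small
            break
        t = 1.0
        # Shrink the step until the Armijo sufficient-decrease condition holds
        while f(x - t * g) > f(x) - alpha * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Illustrative problem: minimize f(x) = 1/2 x^T Q x - b^T x (Q positive definite)
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b
x_star = gradient_descent_backtracking(f, grad, np.zeros(2))
```

For this strongly convex quadratic the iterates converge linearly to the unique minimizer, the solution of Q x = b.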
Constrained Optimization:
Penalty-Barrier methods, Augmented Lagrangian methods.
First-Order Methods for Huge-Scale Convex Problems and Complexity Analysis: Gradient/Subgradient methods,
Fast Proximal-Gradient Schemes, Complexity Analysis, Smoothing methods.
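As a small illustration of this topic, the following is a sketch of the basic proximal-gradient scheme (ISTA) for l1-regularized least squares; the data and parameter choices are illustrative assumptions, and the fast (accelerated) variants covered in the course add a momentum step on top of this iteration.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal map of tau*||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, num_iters=2000):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        g = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - g / L, lam / L)   # prox step on the nonsmooth part
    return x
```

With A = I the method reduces to a single soft-thresholding of b, which makes the scheme easy to sanity-check.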
Lagrangian methods for convex optimization: Decomposition schemes for large scale problems;
Nonquadratic proximal schemes.
Self-Concordance Theory and Complexity Analysis: Self-concordant functions. Polynomial Interior Point Algorithms.
Newton's Method Revisited.
Semidefinite and Conic Programming: Theory, polynomial algorithms, and
applications to combinatorial optimization problems and engineering.
Modern Applications in Science and Engineering: Image and Signal Processing, Machine Learning,
Sensor Network Localization problems, etc.
Throughout the course, we will discuss several prototype optimization models in these applied areas and the relevant algorithms studied in the course that can be applied to solve them efficiently.