FINANCIAL SUPPORT ("MILGA") FOR FULL-TIME PH.D. STUDENTS IS CURRENTLY AVAILABLE.

Interested candidates, please contact me by email.

Algorithms for Continuous Optimization - 0365.4414

Lecturer: Prof. Marc Teboulle ( teboulle@post.tau.ac.il )
Office: Schreiber Bldg. 227, Phone: 8896
School of Mathematical Sciences
Tel-Aviv University


Time and Place

Spring Semester, 2017/2018 - Room ?, Sunday, 16:00-19:00

Intended Audience and Prerequisites

The course will provide an up-to-date introduction to modern optimization algorithms. Advances in computer technology have propelled the field of nonlinear optimization, which has become an essential tool for intelligently solving complex scientific and engineering problems.

This course is a must for anyone who needs to solve huge-scale optimization problems in modern science and engineering.

All graduate students from Computer Sciences, Electrical Engineering, Mathematics, and Statistics are strongly encouraged to register.

Previous exposure to a mathematical optimization course is helpful, but not a formal prerequisite.

A student who has had no previous exposure to continuous optimization may still be admitted to the course, provided she/he has a solid mathematical background (in particular in Analysis and Linear Algebra/Matrix Theory), or after discussion with and consent of the Lecturer.

Course requirements

The final grade will be based on a set of assignments/project.

Some Useful References

There is no textbook for the course. The material will be based on some of the references below and recent advances in the field from research papers.
Some handouts, and all the Lecture Notes will be distributed.
However, students are strongly encouraged to consult the following references (in particular [1]-[3] and [5]) for further reading and study.
More references will be given during the lectures.

Approximate syllabus

Smooth Unconstrained Optimization: Classical algorithms and methods of analysis. Descent methods. Line-search techniques. Newton-type methods, Conjugate Gradients. Rate-of-convergence analysis.
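
As a small taste of this part of the course, here is a minimal sketch of gradient descent with an Armijo backtracking line search, applied to a strongly convex quadratic. All function and parameter names are illustrative, not the course's notation.

```python
import numpy as np

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Gradient descent with Armijo backtracking line search (illustrative sketch)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t, beta, sigma = 1.0, 0.5, 1e-4
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f(x - t * g) > f(x) - sigma * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Minimize f(x) = 0.5 x^T A x - b^T x; the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = gradient_descent(f, grad, np.zeros(2))
```

For this well-conditioned quadratic the iterates converge linearly to the solution of A x = b; the rate-of-convergence analysis covered in the course makes this precise.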

Constrained Optimization: Penalty-Barrier methods, Augmented Lagrangian methods.
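
To illustrate the penalty idea, the following sketch solves a small equality-constrained problem with a quadratic penalty whose weight is gradually increased; the inner minimization is a plain gradient loop. All names and parameter values are illustrative assumptions.

```python
import numpy as np

# Quadratic-penalty sketch for:  min ||x||^2  s.t.  x1 + x2 = 1
# (true solution: x* = (0.5, 0.5)).
def penalty_method(x0, mu0=1.0, growth=10.0, outer=6, inner=200):
    x, mu = x0.astype(float), mu0
    for _ in range(outer):
        # Inner loop: minimize P(x) = ||x||^2 + (mu/2)(x1 + x2 - 1)^2
        # by gradient descent with step ~ 1/L for this quadratic.
        step = 0.5 / (2.0 + 2.0 * mu)
        for _ in range(inner):
            h = x[0] + x[1] - 1.0          # constraint violation
            g = 2.0 * x + mu * h * np.ones(2)
            x -= step * g
        mu *= growth                        # tighten the penalty
    return x

x_pen = penalty_method(np.zeros(2))
```

As the course will show, the pure quadratic penalty only enforces the constraint in the limit mu → ∞ (with growing ill-conditioning), which is precisely the drawback that Augmented Lagrangian methods remove.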

First-Order Methods for Huge-Scale Convex Problems and Complexity Analysis: Gradient/Subgradient methods, Fast Proximal-Gradient schemes, Complexity Analysis, Smoothing methods.
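
A prototype of this class of methods is the proximal-gradient scheme (ISTA) for the l1-regularized least-squares problem, sketched below; the prox of the l1 norm is soft-thresholding. Names and the test instance are illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    """Prox of tau*||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, n_iter=500):
    """Proximal-gradient (ISTA) sketch for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - g / L, lam / L)  # forward step, then prox
    return x

# With A = I the solution is just soft-thresholding of b.
x_l1 = ista(np.eye(3), np.array([3.0, 0.5, -2.0]), lam=1.0)
```

ISTA achieves an O(1/k) complexity bound on function values; the fast (accelerated) proximal-gradient schemes studied in the course improve this to O(1/k^2) by adding a momentum step.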

Lagrangian methods for convex optimization: Decomposition schemes for large scale problems; Nonquadratic proximal schemes.
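
The decomposition idea can be previewed with dual ascent on a tiny separable problem: the Lagrangian splits across blocks, each block is minimized independently, and the multiplier is updated by a gradient-ascent step on the dual. The instance and step size below are illustrative assumptions.

```python
# Dual-ascent decomposition sketch for:
#   min (x1 - 1)^2 + (x2 + 1)^2   s.t.   x1 = x2    (solution: x1 = x2 = 0).
# L(x, u) = (x1 - 1)^2 + (x2 + 1)^2 + u*(x1 - x2) separates in x1 and x2.
u = 0.0
for _ in range(60):
    x1 = 1.0 - u / 2.0    # argmin over x1 of its separable Lagrangian term
    x2 = -1.0 + u / 2.0   # argmin over x2, computed independently (in parallel)
    u += 0.5 * (x1 - x2)  # gradient-ascent step on the dual function
```

The two block minimizations could run on separate processors, which is the point of decomposition for large-scale problems; the multiplier u coordinates them toward primal feasibility.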

Self-Concordance Theory and Complexity Analysis: Self-concordant functions. Polynomial Interior Point Algorithms. Newton's Method Revisited.
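
Newton's method, revisited in this part of the course, can be sketched as damped Newton with a backtracking step governed by the (squared) Newton decrement g^T H^{-1} g. The test function and constants below are illustrative, not the self-concordance constants of the theory.

```python
import numpy as np

# Damped Newton sketch on f(x, y) = exp(x) + exp(-x) + y^2, minimized at (0, 0).
def f(z):
    x, y = z
    return np.exp(x) + np.exp(-x) + y ** 2

def grad(z):
    x, y = z
    return np.array([np.exp(x) - np.exp(-x), 2.0 * y])

def hess(z):
    x, y = z
    return np.diag([np.exp(x) + np.exp(-x), 2.0])

def newton(z0, tol=1e-10, max_iter=50):
    z = z0.astype(float)
    for _ in range(max_iter):
        g, H = grad(z), hess(z)
        d = np.linalg.solve(H, g)     # Newton direction
        lam2 = g @ d                  # squared Newton decrement
        if lam2 / 2.0 < tol:          # standard Newton-decrement stopping rule
            break
        t = 1.0
        while f(z - t * d) > f(z) - 0.25 * t * lam2:  # backtracking
            t *= 0.5
        z -= t * d
    return z

z_star = newton(np.array([2.0, 1.0]))
```

Self-concordance theory makes this picture quantitative: it bounds the damped phase and guarantees quadratic convergence once the Newton decrement is small, which is the engine behind polynomial-time interior-point algorithms.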

Semidefinite and Conic Programming: Theory, polynomial algorithms, and applications to combinatorial optimization problems and engineering.

Modern Applications in Science and Engineering: Image and Signal Processing, Machine Learning, Sensor Network Localization problems, etc.
Throughout the course, we will discuss several prototype optimization models in these applied areas and the relevant algorithms studied in the course that can be applied toward their efficient solutions.