Abstract at the OPTIMAL Conference, December 19, 2023
Speaker: Hadi Abbaszadehpeivasti (Tilburg University)
Title: Performance analysis of optimization methods for machine learning

Abstract:

First-order optimization methods play a crucial role in efficiently tuning parameters in machine learning problems. Investigating the performance of these algorithms is important because it offers insight into their behavior on different problem classes, helping researchers and practitioners select the most suitable algorithm for a specific problem. Additionally, the study of the performance of first-order methods helps in developing new algorithms and in optimizing the step lengths of existing ones. In this presentation, we give a comprehensive review of convergence rates for a diverse range of optimization algorithms, as developed in six research papers. These papers cover a wide range of methods, from the difference-of-convex algorithm (DCA) to gradient descent and the alternating direction method of multipliers (ADMM). Each paper examines the performance and convergence rate of these optimization methods in various scenarios. Overall, this body of work advances our understanding of the behavior of optimization algorithms, providing insights that can be applied in practice. The papers in question are joint work with Moslem Zamani and Etienne de Klerk.
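As a minimal illustration of the kind of method whose step length such performance analyses aim to tune (this sketch is not taken from the papers; the quadratic objective, the Lipschitz constant `L`, and the classical `1/L` step are illustrative assumptions), consider gradient descent with a fixed step length on a smooth convex function:

```python
import numpy as np

def gradient_descent(grad, x0, step, iters):
    """Run fixed-step gradient descent: x <- x - step * grad(x)."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Illustrative smooth convex quadratic f(x) = 0.5 * x^T A x,
# whose gradient is A x; L = 10 is the largest eigenvalue of A.
A = np.array([[2.0, 0.0], [0.0, 10.0]])
grad = lambda x: A @ x

# The classical step length 1/L guarantees convergence for L-smooth
# convex functions; performance estimation analyses of the kind
# discussed in the talk make such worst-case guarantees tight.
x_final = gradient_descent(grad, [1.0, 1.0], step=1.0 / 10.0, iters=200)
```

Here the iterates converge to the minimizer at the origin; the worst-case rate of this decrease, as a function of the step length and the problem class, is exactly the type of quantity that performance analysis seeks to characterize.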