Arkadiy Dushatskiy completed a Bachelor's degree in Applied Mathematics and Informatics at the Higher School of Economics, Moscow, Russia (2012-2016). He continued his studies in Applied Mathematics and Informatics, with a specialization in High-Performance Computing, at Moscow State University (2016-2018). His MSc thesis was on the performance analysis of Convolutional Neural Networks running on GPUs.
In September 2018 he started as a PhD student at CWI, where his main research topic is the combination of Evolutionary Algorithms and Deep Learning applied to Medical Image Analysis.
Marek Elias is now affiliated with Bocconi University.
Ade Fajemisin is a post-doctoral researcher at the University of Amsterdam and works on the ENW-Groot project OPTIMAL (Optimization for and with Machine Learning). His main research focus is data-driven optimization: using data and machine learning both to determine components of optimization problems and to facilitate the solution of complex ones. He is also interested in applying these techniques to real-world problems.
Alexander Taveira Blomenhofer is a post-doctoral researcher at CWI, Amsterdam, and works on the ENW-Groot project OPTIMAL (Optimization for and with Machine Learning). His research focuses on developing new optimization methods for high-dimensional estimation problems and on understanding their (real) algebraic geometry. His main research concerns Gaussian mixtures and powers-of-forms decompositions, to which he applies techniques from algebraic geometry and semidefinite optimization.
Alex Wang is now affiliated with Purdue University.
Moslem Zamani is a post-doctoral researcher at UCLouvain and works on the ENW-Groot project OPTIMAL (Optimization for and with Machine Learning). His research concerns understanding the worst-case behaviour of stochastic gradient methods. To this end, he employs two new approaches: performance estimation using semidefinite optimisation, and robust-control analysis. This research may lead to the development of novel stochastic gradient methods for training neural networks.