Radu Ioan Boț
| Radu Ioan Boţ | |
|---|---|
| Occupations | Mathematician and academic |
| Academic background | |
| Education | Diploma in Mathematics; Master of Science in Mathematics; Doctor of Natural Sciences; Habilitation |
| Alma mater | Babeş-Bolyai University Cluj-Napoca; Chemnitz University of Technology |
| Academic work | |
| Institutions | University of Vienna |
Radu Ioan Boţ is a Romanian mathematician and academic. He is a Professor at the University of Vienna, where he serves as Dean of the Faculty of Mathematics and Head of the Institute of Mathematics.[1]
Boţ is known for his work on convex analysis, convex optimization, nonsmooth optimization, and monotone operators. His work has been published in academic journals such as SIAM Journal on Optimization, Mathematical Programming, Foundations of Computational Mathematics, Journal of Differential Equations, and Journal of the European Mathematical Society.[2]
Education
Boţ obtained his Diploma and Master of Science in Mathematics from Babeş-Bolyai University, Cluj-Napoca, in 1998 and 1999, respectively, followed by a Ph.D. from Chemnitz University of Technology in 2003. In 2008, he completed his Habilitation (Dr. rer. nat. habil.), and in 2009, he was awarded the title of Privatdozent (PD) from the same institution.[1]
Career
Boţ began his academic career at the Faculty of Mathematics, Chemnitz University of Technology, where he served from 2003 to 2010 and from 2011 to 2013. He held an appointment as Professor of Applied Mathematics at Heinrich Heine University Düsseldorf from 2010 to 2011. Between 2014 and 2017, he was an Associate Professor in the Faculty of Mathematics at the University of Vienna, where he has been Professor of Applied Mathematics with Emphasis on Optimization since 2017.[1]
In addition to his academic roles, Boţ has held several administrative appointments. He was Vice Dean for Research of the Faculty of Mathematics and Deputy Head of the Institute of Mathematics from 2016 to 2020, and has also served as Speaker of the Vienna School of Mathematics at the University of Vienna. From 2020 to 2025, he was Speaker of the Vienna Graduate School on Computational Optimization, which was funded by the Austrian Science Fund. Since 2020, he has been Dean of the Faculty of Mathematics and Head of the Institute of Mathematics at the University of Vienna. Starting in January 2026, he will serve as Editor-in-Chief of the SIAM Journal on Optimization.[1]
Research
In his early research, Boţ developed and compared various dual optimization problems using the Fenchel–Rockafellar approach for convex programs with inequality constraints, studied the relations among them, and established strong duality and optimality conditions.[3] Subsequently, he extended Farkas' Lemma to systems with finitely and infinitely many convex constraints, using two duality frameworks, the extended Fenchel and Fenchel–Lagrange duals, and proved strong duality results that unified and generalized classical convex optimization theory.[4] Together with Wanka, he introduced a weaker, conjugate epigraph-based regularity condition that ensures Fenchel duality in infinite-dimensional optimization and applied it to the strong conical hull intersection property of convex sets.[5] In 2009, he co-authored the book Duality in Vector Optimization, which focused on duality theory in vector optimization and filled a gap in the literature by providing a comprehensive, research-oriented treatment intended for researchers and graduate students in mathematics and optimization.[6] The following year, he authored the book Conjugate Duality in Convex Optimization, which explored advanced convex optimization, focusing on conjugate duality, regularity conditions, biconjugate calculus, and Fenchel duality, while developing new duality frameworks and applying them to monotone operators.[7] He also proposed primal-dual splitting algorithms for solving inclusions involving mixtures of composite and parallel-sum type monotone operators, which rely on a Douglas–Rachford type method, and investigated their convergence properties.[8]
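The Fenchel–Rockafellar duality scheme underlying much of this early work can be summarized in a standard textbook form (a generic formulation, not taken from any single cited paper): for proper, convex, lower semicontinuous functions f and g and a continuous linear operator A,

```latex
\inf_{x \in X}\,\bigl\{\, f(x) + g(Ax) \,\bigr\}
\;\ge\;
\sup_{y^{*} \in Y^{*}}\,\bigl\{\, -f^{*}(-A^{*}y^{*}) - g^{*}(y^{*}) \,\bigr\},
```

with equality (strong duality) holding under regularity conditions, such as the weaker epigraph-based conditions mentioned above.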
In collaboration with Csetnek and Hendrich, Boţ developed and analyzed an inertial Douglas–Rachford splitting algorithm for monotone inclusions, proved its convergence, extended it to structured operators, and demonstrated applications to primal–dual convex optimization with numerical experiments in clustering and location theory.[9] He also introduced a proximal minimization algorithm for structured nonconvex and nonsmooth optimization problems, establishing subsequential convergence of the iterates and, under the Kurdyka–Łojasiewicz framework, full sequential convergence.[10] In collaboration with Nguyen, he introduced an inertial continuous-time model with an asymptotically vanishing damping term for minimizing continuously differentiable convex functions under linear equality constraints, proving fast convergence of the primal–dual gap, the feasibility measure, and the objective value, together with weak convergence of the trajectory to a primal–dual optimal solution.[11] He also addressed minimax optimization in machine learning, proving novel global convergence guarantees for stochastic alternating proximal-gradient methods in challenging nonconvex–concave settings relevant to training GANs and adversarial models.[12] More recently, he introduced a Fast Optimistic Gradient Descent Ascent (OGDA) method, in both continuous and discrete time, establishing convergence of the generated trajectories and iterates while achieving the best-known convergence rates among schemes for solving monotone equations.[13]
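The optimistic gradient idea behind this line of work can be illustrated with the classical (non-accelerated) OGDA iteration on a toy bilinear saddle-point problem; the following is a minimal illustrative sketch, not the accelerated scheme analyzed in the cited paper, and the step size and problem are chosen purely for demonstration:

```python
import numpy as np

# Classical OGDA on the bilinear saddle problem min_x max_y x*y,
# whose associated monotone operator is F(x, y) = (y, -x).
# Plain simultaneous gradient descent ascent cycles on this problem;
# the optimistic "extrapolation" term restores convergence to the
# saddle point (0, 0).

def F(z):
    x, y = z
    return np.array([y, -x])  # monotone operator of the saddle problem

gamma = 0.3                     # step size (small relative to Lipschitz constant of F)
z_prev = np.array([1.0, 1.0])   # starting point
z = z_prev - gamma * F(z_prev)  # one plain gradient step to initialize

for _ in range(500):
    # OGDA update: z_{k+1} = z_k - 2*gamma*F(z_k) + gamma*F(z_{k-1})
    z_next = z - 2 * gamma * F(z) + gamma * F(z_prev)
    z_prev, z = z, z_next

print(np.linalg.norm(z))  # distance to the saddle point (0, 0)
```

Running the sketch shows the iterates contracting toward the saddle point, whereas removing the `gamma * F(z_prev)` correction term yields the non-convergent, cycling behavior of plain gradient descent ascent.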
Bibliography
Books
- Duality in Vector Optimization (2009) ISBN 9783642028854
- Conjugate Duality in Convex Optimization (2010) ISBN 9783642048999
Selected articles
- Boţ, R. I., Csetnek, E. R., & Hendrich, C. (2015). Inertial Douglas–Rachford splitting for monotone inclusion problems. Applied Mathematics and Computation, 256, 472–487.
- Boţ, R. I., Csetnek, E. R., & László, S. C. (2020). A primal-dual dynamical approach to structured convex minimization problems. Journal of Differential Equations, 269(12), 10717–10757.
- Boţ, R. I., Dong, G., Elbau, P., & Scherzer, O. (2022). Convergence rates of first and higher order dynamics for solving ill-posed problems. Foundations of Computational Mathematics, 22(5), 1567–1629.
- Attouch, H., Boţ, R. I., & Csetnek, E. R. (2023). Fast optimization via inertial dynamics with closed-loop damping. Journal of the European Mathematical Society, 25(5), 1985–2056.
- Boţ, R. I., & Nguyen, D.-K. (2023). Fast Krasnosel'skii-Mann algorithm with a convergence rate of the fixed point iteration of o(1/k). SIAM Journal on Numerical Analysis, 61(6), 2813–2843.
- Boţ, R. I., Csetnek, E. R., & Nguyen, D.-K. (2025). Fast Optimistic Gradient Descent Ascent (OGDA) method in continuous and discrete time. Foundations of Computational Mathematics, 25(1), 162–222.
References
- ^ "Radu Ioan Boţ - Curriculum Vitae".
- ^ "Radu Ioan Bot - Google Scholar".
- ^ Wanka, Gert; Boţ, Radu-Ioan (2002). "On the Relations Between Different Dual Problems in Convex Mathematical Programming". Operations Research Proceedings 2001. Springer: 255–262. doi:10.1007/978-3-642-50282-8_32.
- ^ Bot, Radu Ioan; Wanka, Gert (January 2005). "Farkas-Type Results With Conjugate Functions". SIAM Journal on Optimization. 15 (2): 540–554. doi:10.1137/030602332. ISSN 1052-6234.
- ^ Boţ, Radu Ioan; Wanka, Gert (15 June 2006). "A weaker regularity condition for subdifferential calculus and Fenchel duality in infinite dimensional spaces". Nonlinear Analysis: Theory, Methods & Applications. 64 (12): 2787–2804. doi:10.1016/j.na.2005.09.017. ISSN 0362-546X.
- ^ "Duality in Vector Optimization".
- ^ "Conjugate Duality in Convex Optimization".
- ^ Boţ, Radu Ioan; Hendrich, Christopher (January 2013). "A Douglas–Rachford Type Primal-Dual Method for Solving Inclusions with Mixtures of Composite and Parallel-Sum Type Monotone Operators". SIAM Journal on Optimization. 23 (4): 2541–2565. doi:10.1137/120901106. ISSN 1052-6234.
- ^ Boţ, Radu Ioan; Csetnek, Ernö Robert; Hendrich, Christopher (1 April 2015). "Inertial Douglas–Rachford splitting for monotone inclusion problems". Applied Mathematics and Computation. 256: 472–487. doi:10.1016/j.amc.2015.01.017. ISSN 0096-3003.
- ^ Boţ, Radu Ioan; Csetnek, Ernö Robert; Nguyen, Dang-Khoa (January 2019). "A Proximal Minimization Algorithm for Structured Nonconvex and Nonsmooth Problems". SIAM Journal on Optimization. 29 (2): 1300–1328. arXiv:1805.11056. doi:10.1137/18M1190689. ISSN 1052-6234.
- ^ Boţ, Radu Ioan; Nguyen, Dang-Khoa (5 December 2021). "Improved convergence rates and trajectory convergence for primal-dual dynamical systems with vanishing damping". Journal of Differential Equations. 303: 369–406. arXiv:2106.12294. doi:10.1016/j.jde.2021.09.021. ISSN 0022-0396.
- ^ Boţ, Radu Ioan; Böhm, Axel (30 September 2023). "Alternating Proximal-Gradient Steps for (Stochastic) Nonconvex-Concave Minimax Problems". SIAM Journal on Optimization. 33 (3): 1884–1913. doi:10.1137/21M1465470. ISSN 1052-6234.
- ^ Boţ, Radu Ioan; Csetnek, Ernö Robert; Nguyen, Dang-Khoa (1 February 2025). "Fast Optimistic Gradient Descent Ascent (OGDA) Method in Continuous and Discrete Time". Foundations of Computational Mathematics. 25 (1): 163–222. doi:10.1007/s10208-023-09636-5. ISSN 1615-3383.