CV
Here is my CV. A PDF version is available for download.
Basics
| Name | Egor Petrov |
| Label | Machine Learning Researcher |
| Email | petrov.egor.d@phystech.edu |
| Url | https://moderntalker.github.io |
| Summary | Machine Learning Research Resident at Yandex Research. Research on efficient deep learning, scaling laws, optimization, and large-scale training systems. |
Work
-
2025.07 - Present ML Researcher
Yandex Research
ML Research Residency, Moscow
- Spearheading research on asynchronous pipeline parallelism for efficient large-scale model training.
- Designing and analyzing novel algorithms for highly distributed systems to enhance scalability and performance.
-
2024.08 - 2025.01 ML Engineer
Yandex
Personalization Quality R&D Group, Moscow
- Engineered a production pipeline integrating Vowpal Wabbit-based online features into a large-scale CatBoost model, achieving a 0.1% uplift in core ranking metrics (AUC, nDCG).
- Implemented a parallelized, MapReduce-style data converter that accelerated data processing by 20x and reduced model training time from hours to minutes.
-
2024.02 - Present ML Researcher
Laboratory of Mathematical Methods for Optimization, MIPT
Moscow
- Pioneered memory-efficient zeroth-order optimization methods for fine-tuning LLMs, achieving a 50% reduction in memory footprint.
- Developed and theoretically analyzed novel stochastic and zeroth-order algorithms for decentralized optimization, validated on a variety of domains.
Education
-
2022.09 - 2026.06 Moscow
Publications
-
2026 Closing the Curvature Gap: Full Transformer Hessians and Their Implications for Scaling Laws
Submitted to ICLR
Egor Petrov, Nikita Kiselev, Vladislav Meshkov, Andrey Grabovoy.
-
2026 Sign-SGD is the Golden Gate between Multi-Node to Single-Node Learning...
Submitted to ICLR
D. Medyakov, S. Stanko, G. Molodtsov, P. Zmushko, G. Evseev, E. Petrov, et al.
-
2025 When Extragradient Meets PAGE: Bridging Two Giants to Boost Variational Inequalities
Conference on Uncertainty in Artificial Intelligence (UAI)
G. Molodtsov, V. Parfenov, E. Petrov, E. Grigoriy, D. Medyakov, A. Beznosikov.
-
2025 Leveraging Coordinate Momentum in SignSGD and Muon: Memory-Optimized Zero-Order LLM Fine-Tuning
ICML Workshop on Tiny Titans
E. Petrov, G. Evseev, A. Antonov, A. Veprikov, P. Plyusnin, N. Bushkov, et al.
-
2024 Zero Order Algorithm for Decentralized Optimization Problems
AI Journey 2024. Published in Doklady Mathematics
A. Veprikov, E. Petrov, G. Evseev, A. Beznosikov.
-
2024 Shuffling Heuristic in Variational Inequalities: Establishing New Convergence Guarantees
ICOMP 2024. Submitted to JOTA (Q1 journal)
D. Medyakov, G. Molodtsov, E. Grigoriy, E. Petrov, A. Beznosikov.
Awarded 1st place at Neuroinformatics 2024.
-
Sampling of Semi-Orthogonal Matrices for the Muon Algorithm
Submitted to Doklady Mathematics (Q2 journal)
E. D. Petrov, G. V. Evseev, A. V. Antonov, A. S. Veprikov, N. A. Bushkov, S. V. Moiseev, A. N. Beznosikov.
Interests
| Efficient Deep Learning |
| Optimization |
| Theory of Learning |
| Scaling Laws |
Projects
- - Present
Lead Developer, ZO-Library
Engineered a comprehensive open-source library for zeroth-order (ZO) optimization, implementing a wide range of state-of-the-art algorithms. Designed with a user-friendly, torch.optim-style API for seamless integration into existing ML pipelines.
- GitHub & Publication Forthcoming
- 2024.01 - Present
Teaching Assistant & Mentor
Mentored undergraduate students in Algorithms, Data Structures (MIPT), and Introduction to AI (Central University), leading project-based learning and providing technical guidance.
- 2024 – Present