Egor Petrov

Machine Learning Researcher at Yandex Research


ML Research Residency, Yandex Research

Moscow Institute of Physics and Technology (MIPT)

My research began under the supervision of Alexander Beznosikov, with whom I developed algorithms for variational inequalities and parameter-free optimization before shifting to memory-efficient and zeroth-order (ZO) methods. These contributions led to ZO-Library, a PyTorch-style open-source framework for ZO optimization in LLM fine-tuning.

My bachelor thesis, supervised by Andrey Grabovoy, derives the first complete analytical Hessian of the LayerNorm and feedforward sublayers, thereby finishing the second-order characterization of the full Transformer block.

Currently at Yandex Research, under Artem Babenko and in collaboration with Samuel Horváth, I work on asynchronous pipeline parallelism and distributed multi-cluster learning.

selected publications