# SPO outperforms PPO in all environments as the network depth increases

![MuJoCo](https://github.com/MyRepositories-hub/Simple-Policy-Optimization/blob/main/draw_return_mujoco.png)