PPO2
Original paper: https://arxiv.org/abs/1707.06347
Baselines blog post: https://blog.openai.com/openai-baselines-ppo/
```bash
python -m baselines.ppo2.run_atari
```
runs the algorithm for 40M frames = 10M timesteps on an Atari game. See help (-h) for more options.
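For reference, here is a minimal Python sketch of what the Atari entry point does under the hood, modeled on run_atari.py. The import paths, the CnnPolicy class, the make_atari_env helper, and the exact ppo2.learn signature are assumptions based on one Baselines snapshot and may differ in other versions, so check the installed source.

```python
# Sketch only: programmatic PPO2 training on Atari, assuming the module
# layout and ppo2.learn signature of this Baselines snapshot.
import tensorflow as tf

from baselines.common import set_global_seeds
from baselines.common.cmd_util import make_atari_env          # assumed helper
from baselines.common.vec_env.vec_frame_stack import VecFrameStack
from baselines.ppo2 import ppo2
from baselines.ppo2.policies import CnnPolicy                 # assumed location


def train(env_id='BreakoutNoFrameskip-v4', num_timesteps=int(10e6), seed=0):
    # Single TF1-style session, as the reference script sets up.
    config = tf.ConfigProto(allow_soft_placement=True)
    config.gpu_options.allow_growth = True
    tf.Session(config=config).__enter__()

    set_global_seeds(seed)

    # 8 parallel emulators with 4 stacked frames.
    env = VecFrameStack(make_atari_env(env_id, num_env=8, seed=seed), 4)

    # Atari defaults: short rollouts, annealed learning rate and clip range.
    ppo2.learn(policy=CnnPolicy, env=env, nsteps=128, nminibatches=4,
               lam=0.95, gamma=0.99, noptepochs=4, log_interval=1,
               ent_coef=0.01,
               lr=lambda f: f * 2.5e-4,
               cliprange=lambda f: f * 0.1,
               total_timesteps=int(num_timesteps * 1.1))


if __name__ == '__main__':
    train()
```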
```bash
python -m baselines.ppo2.run_mujoco
```
runs the algorithm for 1M frames on a Mujoco environment.
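Similarly, a hedged sketch of the MuJoCo path, modeled on run_mujoco.py: a single Gym environment wrapped in a vectorized env with observation/return normalization, trained with an MlpPolicy. The environment id, import paths, and ppo2.learn signature below are assumptions that may differ across Baselines versions.

```python
# Sketch only: programmatic PPO2 training on a MuJoCo task, assuming the
# module layout and ppo2.learn signature of this Baselines snapshot.
import gym
import tensorflow as tf

from baselines.common import set_global_seeds
from baselines.common.vec_env.dummy_vec_env import DummyVecEnv
from baselines.common.vec_env.vec_normalize import VecNormalize
from baselines.ppo2 import ppo2
from baselines.ppo2.policies import MlpPolicy                 # assumed location


def train(env_id='Hopper-v2', num_timesteps=int(1e6), seed=0):
    # Single-threaded TF1-style session, as in the reference script.
    config = tf.ConfigProto(intra_op_parallelism_threads=1,
                            inter_op_parallelism_threads=1)
    tf.Session(config=config).__enter__()

    set_global_seeds(seed)

    # PPO2 expects a vectorized env; normalize observations and returns.
    env = VecNormalize(DummyVecEnv([lambda: gym.make(env_id)]))

    # Continuous-control defaults used by the script.
    ppo2.learn(policy=MlpPolicy, env=env, nsteps=2048, nminibatches=32,
               lam=0.95, gamma=0.99, noptepochs=10, log_interval=1,
               ent_coef=0.0, lr=3e-4, cliprange=0.2,
               total_timesteps=num_timesteps)


if __name__ == '__main__':
    train()
```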