ppo-LunarLander-v2 / config.json

Commit History

Upload my first LunarLander-v2 model trained with PPO
ce12b16

sankar82 committed on
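
For reference, a minimal sketch of fetching and inspecting this `config.json` from the Hub, assuming the repository id is `sankar82/ppo-LunarLander-v2` (inferred from the username and repo name above, not confirmed by the page):

```python
import json

from huggingface_hub import hf_hub_download

# Assumed repo id; adjust if the model lives under a different namespace.
config_path = hf_hub_download(
    repo_id="sankar82/ppo-LunarLander-v2",
    filename="config.json",
)

# The config typically records the PPO hyperparameters used for training.
with open(config_path) as f:
    config = json.load(f)

print(json.dumps(config, indent=2))
```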