Initial commit Torch_PPO_Cleanrl_Atari_Envpool #243
Conversation
I have made a small PR to your branch here. I was able to run the example with:

Note: when doing …

Result of current run:
I added the instrumentation for it in my PR. I also updated the arguments so that the number of environments scales with the number of CPUs available per GPU.
[screenshot: instrumentation concept]
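The scaling rule mentioned above (size the pool of parallel environments by the CPU cores available per GPU) can be sketched as follows. The function name `envs_per_gpu` and the cap value are hypothetical illustrations, not milabench's actual code:

```python
import os

def envs_per_gpu(num_gpus: int, cap: int = 128) -> int:
    """Hypothetical heuristic: scale parallel envs with CPU cores per GPU."""
    cpus = os.cpu_count() or 1                   # total logical CPUs on the machine
    per_gpu = max(1, cpus // max(1, num_gpus))   # split cores evenly across GPUs
    return min(per_gpu, cap)                     # cap to keep rollout batches bounded

# e.g. on an 8-core machine with 2 GPUs this yields 4 envs per GPU
```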
Hello!

Currently main.py is just a copy-paste of the original cleanrl script, as discussed with Xavier. Should I modify it to use the observer utilities?

Also, I have been able to add the requirements and install them with:
milabench install --config dev.yaml --base .
But after that, when I run:
milabench run --config dev.yaml --base .
I get errors saying some dependencies cannot be imported, as if they hadn't been installed. I think I am not able to load a shell with the venv I created with the first command. Please let me know how to proceed!
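One likely cause, sketched below: the install step creates a virtualenv under the base directory, but the shell running the second command never activates it, so imports resolve against the system interpreter. The venv path here is an assumption, not milabench's documented layout; check the `milabench install` output for the real location:

```shell
# Paths are assumptions: reproduce the situation manually.
# 1. Create a venv in the base directory, mirroring what `milabench install`
#    presumably does under --base:
python3 -m venv ./venv
# 2. Activate it so the installed dependencies become importable:
. ./venv/bin/activate
# 3. Confirm the interpreter now lives inside the venv:
python -c "import sys; print(sys.prefix)"
# 4. With the venv active, retry `milabench run --config dev.yaml --base .`
#    (the command from the comment above).
```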