I’m following this ML-Agents tutorial (https://www.youtube.com/watch?v=D0jTowlMROc), but I can’t get a training run of the example from the video to work. After typing “mlagents-learn --run-id=…” everything seems to be working fine until I hit Play in the Unity editor. The learning process doesn’t start, my agent isn’t moving, and I get the following error in the Unity editor console:
“Unexpected exception when trying to initialize communication: System.IO.IOException: Error loading native library “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\Library\PackageCache\com.unity.ml-agents@…\Plugins\ProtoBuffer\runtimes/win/native\grpc_csharp_ext.x64.dll”
at Grpc.Core.Internal.UnmanagedLibrary..ctor (System.String[] libraryPathAlternatives) [0x00063] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.Internal.NativeExtension.Load () [0x000d7] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.Internal.NativeExtension..ctor () [0x00006] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.Internal.NativeExtension.Get () [0x00022] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.Internal.NativeMethods.Get () [0x00000] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.GrpcEnvironment.GrpcNativeInit () [0x00000] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.GrpcEnvironment..ctor () [0x0001e] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.GrpcEnvironment.AddRef () [0x00028] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.Channel..ctor (System.String target, Grpc.Core.ChannelCredentials credentials, System.Collections.Generic.IEnumerable`1[T] options) [0x00051] in <2f154ad39ec14cfea604815989d96352>:0
at Grpc.Core.Channel..ctor (System.String target, Grpc.Core.ChannelCredentials credentials) [0x00000] in <2f154ad39ec14cfea604815989d96352>:0
at Unity.MLAgents.RpcCommunicator.Initialize (System.Int32 port, Unity.MLAgents.CommunicatorObjects.UnityOutputProto unityOutput, Unity.MLAgents.CommunicatorObjects.UnityInputProto& unityInput) [0x00007] in .\Library\PackageCache\com.unity.ml-agents@…\Runtime\Communicator\RpcCommunicator.cs:224
at Unity.MLAgents.RpcCommunicator.Initialize (Unity.MLAgents.CommunicatorInitParameters initParameters, Unity.MLAgents.UnityRLInitParameters& initParametersOut) [0x0003b] in .\Library\PackageCache\com.unity.ml-agents@…\Runtime\Communicator\RpcCommunicator.cs:112
UnityEngine.Debug:Log (object)
Unity.MLAgents.RpcCommunicator:Initialize (Unity.MLAgents.CommunicatorInitParameters,Unity.MLAgents.UnityRLInitParameters&) (at ./Library/PackageCache/com.unity.ml-agents@…/Runtime/Communicator/RpcCommunicator.cs:141)
Unity.MLAgents.Academy:InitializeEnvironment () (at ./Library/PackageCache/com.unity.ml-agents@…/Runtime/Academy.cs:445)
Unity.MLAgents.Academy:LazyInitialize () (at ./Library/PackageCache/com.unity.ml-agents@…/Runtime/Academy.cs:279)
Unity.MLAgents.Academy:.ctor () (at ./Library/PackageCache/com.unity.ml-agents@…/Runtime/Academy.cs:248)
Unity.MLAgents.Academy/<>c:<.cctor>b__83_0 () (at ./Library/PackageCache/com.unity.ml-agents@…/Runtime/Academy.cs:117)
System.Lazy`1<Unity.MLAgents.Academy>:get_Value ()
Unity.MLAgents.Academy:get_Instance () (at ./Library/PackageCache/com.unity.ml-agents@…/Runtime/Academy.cs:132)
Unity.MLAgents.DecisionRequester:Awake () (at ./Library/PackageCache/com.unity.ml-agents@…/Runtime/DecisionRequester.cs:57)
“
After waiting a few seconds, I get the following error in cmd:
“Version information:
ml-agents: 0.30.0,
ml-agents-envs: 0.30.0,
Communicator API: 1.5.0,
PyTorch: 2.3.1+cpu
C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\torch\__init__.py:749: UserWarning: torch.set_default_tensor_type() is deprecated as of PyTorch 2.1, please use torch.set_default_dtype() and torch.set_default_device() as alternatives. (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\builder\windows\pytorch\torch\csrc\tensor\python_tensor.cpp:433.)
_C._set_default_tensor_type(t)
[INFO] Listening on port 5004. Start training by pressing the Play button in the Unity Editor.
Traceback (most recent call last):
File “C:\Users\ismai\AppData\Local\Programs\Python\Python39\lib\runpy.py”, line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File “C:\Users\ismai\AppData\Local\Programs\Python\Python39\lib\runpy.py”, line 87, in _run_code
exec(code, run_globals)
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\Scripts\mlagents-learn.exe\__main__.py”, line 7, in <module>
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents\trainers\learn.py”, line 264, in main
run_cli(parse_command_line())
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents\trainers\learn.py”, line 260, in run_cli
run_training(run_seed, options, num_areas)
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents\trainers\learn.py”, line 136, in run_training
tc.start_learning(env_manager)
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents_envs\timers.py”, line 305, in wrapped
return func(*args, **kwargs)
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents\trainers\trainer_controller.py”, line 172, in start_learning
self._reset_env(env_manager)
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents_envs\timers.py”, line 305, in wrapped
return func(*args, **kwargs)
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents\trainers\trainer_controller.py”, line 105, in _reset_env
env_manager.reset(config=new_config)
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents\trainers\env_manager.py”, line 68, in reset
self.first_step_infos = self._reset_env(config)
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents\trainers\subprocess_env_manager.py”, line 446, in _reset_env
ew.previous_step = EnvironmentStep(ew.recv().payload, ew.worker_id, {}, {})
File “C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework\venv\lib\site-packages\mlagents\trainers\subprocess_env_manager.py”, line 101, in recv
raise env_exception
mlagents_envs.exception.UnityTimeOutException: The Unity environment took too long to respond. Make sure that :
The environment does not need user interaction to launch
The Agents’ Behavior Parameters > Behavior Type is set to “Default”
The environment and the Python interface have compatible versions.
If you’re running on a headless server without graphics support, turn off display by either passing --no-graphics option or build your Unity executable as server build.”
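One thing I considered, to rule out mlagents-learn itself, is the low-level Python API, since it performs the same editor handshake. This is just a sketch I plan to run from the same venv (with file_name=None it waits for the editor, so I would press Play after starting it); I haven’t gotten further than this yet:

from mlagents_envs.environment import UnityEnvironment

# file_name=None connects to the Unity editor on the default port (5004)
# instead of launching a built executable.
env = UnityEnvironment(file_name=None)
env.reset()
print("Connected, behaviors:", list(env.behavior_specs.keys()))
env.close()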
From what I’ve seen on other forums and websites, the issue seems to be in the gRPC library. Unfortunately, there isn’t much documentation of this issue online, even on the Unity forums, so I haven’t found anything concrete to try.
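The only check I can think of doing myself is to see whether the native DLL from the error message even exists on disk. Here is a small sketch of that check; the project path is the one from my machine, and the package folder is matched with a wildcard because I’m not sure of the exact ml-agents package version:

import glob
import os

# Unity project root (adjust if needed); the version part of the package cache
# folder is matched with a wildcard since I don't know the exact version.
project_root = r"C:\Users\ismai\Desktop\Développement\UnityDevProjects\pleasework"
dll_pattern = os.path.join(
    project_root, "Library", "PackageCache", "com.unity.ml-agents@*",
    "Plugins", "ProtoBuffer", "runtimes", "win", "native", "grpc_csharp_ext.x64.dll",
)
matches = glob.glob(dll_pattern)
print("DLL found:" if matches else "DLL missing from package cache:", matches)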