I have two PCs, and on both I upgraded Ubuntu to 24.04. On one PC (OLD PC) the upgrade went smoothly; on the other (NEW PC) it didn't go well, so I had to reinstall a bunch of stuff, including CUDA, the CUDA Toolkit, and my training envs.
Currently, on the NEW PC, I always have to run

export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libstdc++.so.6

and then run train.py, otherwise I get this error (I actually use a shell script for this, because putting LD_PRELOAD into .bashrc or setting it directly in train.py didn't work):
ImportError: /home/anya/anaconda3/envs/triplane/lib/python3.9/site-packages/torch/lib/../../../.././libstdc++.so.6: version `GLIBCXX_3.4.32' not found (required by /home/anya/.cache/torch_extensions/py39_cu124/bias_act_plugin/b46266ff65f9fa53c32108953a1c6f16-nvidia-geforce-rtx-4090/bias_act_plugin.so)
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/home/anya/anaconda3/envs/triplane/lib/python3.9/multiprocessing/popen_fork.py", line 27, in poll
pid, sts = os.waitpid(self.pid, flag)
File "/home/anya/anaconda3/envs/triplane/lib/python3.9/site-packages/torch/utils/data/_utils/signal_handling.py", line 67, in handler
_error_if_any_worker_fails()
RuntimeError: DataLoader worker (pid 1203935) is killed by signal: Terminated.
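The shell script workaround is essentially a two-line wrapper. A sketch of it (the script name run_train.sh is my placeholder; the path is the same one from the export above):

```shell
# Write the wrapper script (run_train.sh is a hypothetical name).
cat > run_train.sh <<'EOF'
#!/usr/bin/env bash
# Preload the system libstdc++ (the copy that provides GLIBCXX_3.4.32)
# so the JIT-compiled CUDA extension links against it instead of the
# older copy bundled in the conda env, then launch training.
export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libstdc++.so.6
exec python train.py "$@"
EOF
chmod +x run_train.sh
```

Running ./run_train.sh instead of python train.py makes the error go away, but I'd prefer not to need the wrapper at all.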
Example of putting LD_PRELOAD into .bashrc: echo $LD_PRELOAD returns the correct value, but training still doesn't work.
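Concretely, the line I appended to ~/.bashrc is the same export that works when done by hand:

```shell
# Line appended to ~/.bashrc; after opening a new terminal,
# `echo "$LD_PRELOAD"` prints this path, yet training still fails.
export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libstdc++.so.6
echo "$LD_PRELOAD"
```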
I think that downgrading some library might actually let me avoid defining LD_PRELOAD, because I don't use LD_PRELOAD on my OLD PC at all. But I am not sure which library exactly I should try to downgrade.
I tried to install CUDA Toolkit 11.7 to match my OLD PC, but somehow I always ended up with a higher version…
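To narrow down which copy is the culprit, I compare the GLIBCXX symbol versions exported by the two libstdc++ copies named in the error message (this is just a diagnostic sketch; the paths are taken from the ImportError above):

```shell
# Print the highest GLIBCXX symbol version in each libstdc++ copy.
# On Ubuntu 24.04 (GCC 13) the system copy should report GLIBCXX_3.4.32
# or newer; the conda env's copy is presumably older, which is what
# triggers the "version GLIBCXX_3.4.32 not found" ImportError.
for lib in /usr/lib/x86_64-linux-gnu/libstdc++.so.6 \
           "$HOME/anaconda3/envs/triplane/lib/libstdc++.so.6"; do
  [ -e "$lib" ] || { echo "missing: $lib"; continue; }
  printf '%s -> %s\n' "$lib" \
    "$(strings -a "$lib" | grep '^GLIBCXX_' | sort -V | tail -n 1)"
done
```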
Some additional information about both PCs:

- OLD PC (which works without LD_PRELOAD)
  - system: Ubuntu 24.04
  - GPU: 2x GTX 1080 Ti
  - nvidia-smi output: CUDA Version: 12.2
  - nvcc -V output: 11.7
- NEW PC (this one needs the export of LD_PRELOAD)
  - system: Ubuntu 24.04
  - GPU: 1x RTX 4090
  - nvidia-smi output: CUDA Version: 12.2
  - nvcc -V output: 12.0
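As I understand it, the two numbers measure different things, which is why they differ on both machines: nvidia-smi reports the highest CUDA version the installed driver supports, while nvcc -V reports the toolkit actually installed. These are the commands I used to collect the versions above:

```shell
# Driver's supported CUDA version vs. installed toolkit version.
# (The fallbacks just keep this runnable on a machine without a GPU.)
nvidia-smi | grep -o 'CUDA Version: [0-9.]*' || echo "nvidia-smi not available"
nvcc --version | grep 'release'              || echo "nvcc not available"
```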