I’m instantiating a CodeBERT model using AutoModel.from_pretrained.
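Here is a simplified sketch of the relevant code (the classifier head and NUM_CLASSES value are placeholders standing in for the rest of my class; the from_pretrained call is the line that fails):

import torch
from torch import nn
from transformers import AutoModel

DEVICE = torch.device("cuda" if torch.cuda.is_available() else "cpu")
NUM_CLASSES = 2  # placeholder value

class CodeBERTConcatenatedClass(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        # this is the line where the error is raised
        self.codebert = AutoModel.from_pretrained('microsoft/codebert-base', cache_dir="./cache2")
        # placeholder head; my real class does more than this
        self.classifier = nn.Linear(self.codebert.config.hidden_size, num_classes)

model = CodeBERTConcatenatedClass(num_classes=NUM_CLASSES).to(DEVICE)

When I run it, I get this error: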
File "/public.hpc/codeBertConcat/./codeBertConcatEvaluation.py", line 278, in <module>
model = CodeBERTConcatenatedClass(num_classes=NUM_CLASSES).to(DEVICE)
File "/public.hpc/codeBertConcat/./codeBertConcatEvaluation.py", line 137, in __init__
self.codebert = AutoModel.from_pretrained('microsoft/codebert-base', cache_dir="./cache2")
File "/public.hpc/codeBertConcat/venv/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 493, in from_pretrained
return model_class.from_pretrained(
File "/public.hpc/codeBertConcat/venv/lib/python3.9/site-packages/transformers/modeling_utils.py", line 2903, in from_pretrained
) = cls._load_pretrained_model(
File "/public.hpc/codeBertConcat/venv/lib/python3.9/site-packages/transformers/modeling_utils.py", line 3061, in _load_pretrained_model
id_tensor = id_tensor_storage(tensor) if tensor.device != torch.device("meta") else id(tensor)
RuntimeError: Expected one of cpu, cuda, xpu, mkldnn, opengl, opencl, ideep, hip, msnpu, xla, vulkan device type at start of device string: meta
Does anyone have a clue what is going on?
I’m using Transformers version 4.31.0 and PyTorch 1.8.1+cu111.