When I use `from_pretrained` to load a model, I get an error saying I don't have write permission to the `./cache` directory, specifically when this line is invoked: https://github.com/huggingface/transformers/blob/main/src/transformers/dynamic_module_utils.py#L54.
I can set `HF_MODULES_CACHE` via `os.environ["HF_MODULES_CACHE"] = xxxx` to work around this, but I'm not sure that's the most principled way of doing it, since other code paths might depend on the default. Here is a related question: How to change huggingface transformers default cache directory
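For reference, a minimal sketch of the workaround I'm describing: redirect the cache locations to a writable directory via environment variables before `transformers` is imported (the paths are resolved at import time). The `/tmp/hf_cache` path here is just an example; substitute any directory you have write access to.

```python
import os

# Example writable location; replace with a directory you own.
cache_root = "/tmp/hf_cache"

# HF_HOME is the umbrella cache directory; HF_MODULES_CACHE is the
# specific directory used by dynamic_module_utils.py for remote code.
# Both must be set BEFORE importing transformers.
os.environ["HF_HOME"] = cache_root
os.environ["HF_MODULES_CACHE"] = os.path.join(cache_root, "modules")

# Import transformers only after the variables are set, e.g.:
# from transformers import AutoModel
# model = AutoModel.from_pretrained("bert-base-uncased")
```

Whether setting `HF_MODULES_CACHE` alone covers every code path (or whether `HF_HOME` is the safer single switch) is exactly what I'm unsure about.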