I have been building an app on Streamlit Cloud that runs code from a GitHub repo. I am now using a Hugging Face model in the code. I have an API key, and I have been granted access to the Meta Llama 3B-Instruct and 8B-Instruct models. When I run the app on Streamlit Cloud, it shows this error:
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct.
401 Client Error. (Request ID: Root=1-67548f30-095cd98c1ddbd00c71f600a7;c0521d6a-9a16-4df8-a8b0-bffe17427f8e)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct/resolve/main/config.json
Access to model meta-llama/Llama-3.2-3B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
2024-12-07 18:08:50.428 Examining the path of torch.classes raised: Tried to instantiate class ‘path.path’, but it does not exist! Ensure that it is registered via torch::class
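For reference, the token itself is read from Streamlit Cloud secrets, roughly like this (a minimal sketch; the secret name HF_API_KEY is just what I use, and this is the only place the token is set before it is passed to LIDA below):

import streamlit as st

# Read the Hugging Face access token from Streamlit Cloud secrets
# (HF_API_KEY is the name I chose for the secret)
hf_api_key = st.secrets["HF_API_KEY"]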
Below is my code; I'm using the LIDA library to build a data analysis app:
from lida import Manager, TextGenerationConfig, llm

# Load the gated Llama model through LIDA's Hugging Face provider
text_gen = llm(
    provider="hf",
    model="meta-llama/Llama-3.2-3B-Instruct",
    api_key=hf_api_key,
    device_map="auto",
)
lida = Manager(text_gen=text_gen)

use_cache = True  # reuse cached generations between reruns
textgen_config = TextGenerationConfig(
    n=1,
    temperature=0,
    model="meta-llama/Llama-3.2-3B-Instruct",
    use_cache=use_cache,
)

st.write("## Summary")
I need help resolving this issue: the API key is active, and I have been granted permission to use the above Meta models on Hugging Face.
I also have a second question: how much storage do I have on Streamlit Cloud to run my model? Even once the above problem is solved, I don't know which model I can download. Will the model have to be downloaded again every time someone accesses the website?
I'm hoping to learn how to overcome this issue, and also which model would best fit the space available on Streamlit Cloud and GitHub.