I wanted to deploy my ML model on the web, so I hosted it on Hugging Face and want to use the Space's API with my own UI.
A screenshot of the console error is attached for reference.
The problem is that my web framework never manages to contact Hugging Face. Please help.
1. root_folder/src/app/llms/mta/page.tsx
const [text, setText] = useState("");
const [temperature, setTemperature] = useState(0.7);
const [maxLength, setMaxLength] = useState(50);
const [model, setModel] = useState("gpt2");
const [loading, setLoading] = useState(false);
const [generatedText, setGeneratedText] = useState("");

const handleSubmit = async (e: { preventDefault: () => void; }) => {
  e.preventDefault();
  setLoading(true);
  try {
    const response = await fetch('../../api/get-text', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        text,
        temperature,
        maxLength,
        model,
      }),
    });
    if (!response.ok) {
      throw new Error(`HTTP error! Status: ${response.status}`);
    }
    const result = await response.json();
    setGeneratedText(result);
    toast.success("Text generated successfully!");
  } catch (error) {
    console.error(error);
    toast.error("An error occurred while generating text.");
  }
  setLoading(false);
};
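One thing I am not sure about: fetch resolves a relative URL like '../../api/get-text' against the current page URL, not against the file layout. A sketch of the same call with an absolute path, in case that matters (not a confirmed fix):

const response = await fetch('/api/get-text', { // absolute path, resolved against the site origin
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ text, temperature, maxLength, model }),
});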
2. root_folder/src/app/api/get-text.js
import { Client } from "@gradio/client";
require('dotenv').config();

export default async function handler(req, res) {
  if (req.method !== 'POST') {
    res.status(405).json({ message: 'Method not allowed' });
    return;
  }

  const { text, temperature, maxLength, model } = req.body;

  try {
    const client = await Client.connect("hexronus/portfolio", {
      hf_token: `hf_${process.env.MTAText}`
    });
    const result = await client.predict("/predict", {
      text: text || 'Hi',
      temperature: temperature || 0.7,
      maxLength: maxLength || '100',
      model: model || 'gpt2',
    });
    res.status(200).json(result.data);
  } catch (error) {
    console.error(error);
    res.status(500).json({ message: 'An error occurred while generating text.' });
  }
}
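Another thing I am unsure about: the project uses the src/app/ directory (App Router), and as far as I know a pages-style default export handler(req, res) placed under src/app/api/ is not picked up as a route there. A sketch of what I think the App Router equivalent would look like (the path src/app/api/get-text/route.ts is my assumption, not what I currently have):

// Assumed location: src/app/api/get-text/route.ts (App Router route handler)
import { NextResponse } from "next/server";
import { Client } from "@gradio/client";

export async function POST(req: Request) {
  const { text, temperature, maxLength, model } = await req.json();

  try {
    const client = await Client.connect("hexronus/portfolio", {
      hf_token: `hf_${process.env.MTAText}`,
    });

    const result = await client.predict("/predict", {
      text: text || "Hi",
      temperature: temperature || 0.7,
      maxLength: maxLength || 100, // number, to match the Space's API doc
      model: model || "gpt2",
    });

    // result.data is an array of the endpoint's outputs; data[0] should be the generated string
    return NextResponse.json(result.data);
  } catch (error) {
    console.error(error);
    return NextResponse.json(
      { message: "An error occurred while generating text." },
      { status: 500 }
    );
  }
}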
Hugging Face API doc
API Name: /predict
import { Client } from "@gradio/client";

const client = await Client.connect("hexronus/portfolio");
const result = await client.predict("/predict", {
  text: "Hello!!",
  temperature: 0.1,
  maxLength: 10,
  model: "gpt2",
});

console.log(result.data);
Parameters
- text (string, Required): the input value provided in the “Input Text” Textbox component.
- temperature (number, Default: 0.7): the input value provided in the “Temperature” Slider component.
- maxLength (number, Default: 50): the input value provided in the “Max Length” Slider component.
- model (string, Required): the input value provided in the “Model” Dropdown component.

Returns
- string: the output value that appears in the “Generated Text” Textbox component.
First I tried all sorts of data-format matching, thinking that might be the problem. Then I checked whether my codebase on HF was faulty, but it works fine: I ran the Space's Gradio API from a .py script in Colab and it behaved as expected. I just don't understand how to use it as an API from Next.js. Also, when the parameter-collecting part lives in page.tsx and is rendered on the client, it causes problems: taking parameters from the user requires the "use client" directive, which I assumed would not be supported because this is an API call.
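On that last point, my current understanding (which may be wrong) is that a "use client" component can still call the server route with fetch: the directive only controls where the component renders, and the HF token never leaves the route handler. A minimal sketch of that split:

"use client"; // form state and the fetch run in the browser

import { useState } from "react";

export default function MtaPage() {
  const [text, setText] = useState("");
  const [generatedText, setGeneratedText] = useState("");

  async function generate() {
    // The HF token is only read inside /api/get-text on the server;
    // this component just sends the user's parameters over HTTP.
    const res = await fetch("/api/get-text", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ text, temperature: 0.7, maxLength: 50, model: "gpt2" }),
    });
    setGeneratedText(await res.json());
  }

  return (
    <div>
      <input value={text} onChange={(e) => setText(e.target.value)} />
      <button onClick={generate}>Generate</button>
      <p>{generatedText}</p>
    </div>
  );
}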