The Ollama API (a service for running LLM models locally) rejects POST requests whose Origin header is not localhost. As you know, all the code for these extensions runs from chrome-extension://id.
Is there an elegant way to make this POST request to the API so that its Origin is localhost (127.0.0.1)?
*Both the extension and the Ollama server run on the same machine.
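I'm aware the restriction can be relaxed on the server side by starting Ollama with the OLLAMA_ORIGINS environment variable set (e.g. OLLAMA_ORIGINS=chrome-extension://* before ollama serve), but ideally I'd like a client-side solution that doesn't require every user to reconfigure their server.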
It seems these colleagues manage to do it, but I can’t figure out how:
https://chromewebstore.google.com/detail/ollama-ui/cmgdpmlhgjhoadnonobjeekmfcehffco
https://chromewebstore.google.com/detail/page-assist-a-web-ui-for/jfgfiigpkhlkbnfnbobbkinehhfdhndo
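My best guess (an assumption on my part; I haven't read their source) is that they use Manifest V3's chrome.declarativeNetRequest API to rewrite the Origin header before the request ever reaches Ollama. A minimal sketch of that idea, where the rule id, the default port 11434, and the spoofed origin value are all my own placeholders:

// background.js (MV3 service worker).
// Requires the "declarativeNetRequest" permission and
// host_permissions for "http://localhost:11434/*" in manifest.json.
chrome.runtime.onInstalled.addListener(() => {
  chrome.declarativeNetRequest.updateDynamicRules({
    removeRuleIds: [1], // drop any stale copy of this rule first
    addRules: [{
      id: 1,
      priority: 1,
      action: {
        type: 'modifyHeaders',
        requestHeaders: [{
          header: 'Origin',
          operation: 'set',
          value: 'http://localhost:11434' // what Ollama expects to see
        }]
      },
      condition: {
        urlFilter: 'http://localhost:11434/*',
        resourceTypes: ['xmlhttprequest'] // covers fetch() from extension pages
      }
    }]
  });
});

If that is what they do, the fetch code itself wouldn't need to change at all, which would explain why I can't spot anything special in it.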
The regular POST request:
// Function to send a POST request to the Ollama API.
// ollama_host is defined elsewhere in my code, e.g. 'http://localhost:11434'.
async function postRequest(data) {
  const url = `${ollama_host}/api/generate`;
  try {
    const response = await fetch(url, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(data)
    });
    if (!response.ok) {
      const errorData = await response.json(); // Or response.text() if not JSON
      document.getElementById('chatlog').innerHTML += `API returned an error: ${errorData.message}`;
    }
    return response; // Caller reads the body, assuming the API returns JSON
  } catch (error) {
    document.getElementById('chatlog').innerHTML += 'Failed to post request: ' + error;
    throw error; // Rethrow or handle as needed
  }
}
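For completeness, this is how I call it (the model name is just an example; with stream: false, /api/generate returns a single JSON object whose response field holds the generated text):

// Inside an async function; 'llama2' is a placeholder model name.
const response = await postRequest({
  model: 'llama2',
  prompt: 'Why is the sky blue?',
  stream: false // one JSON object instead of a stream of chunks
});
const result = await response.json();
console.log(result.response); // the generated text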