I want to use the Batch API. But instead of writing the following code, which creates a client, I need to use a plain Python POST request (via the `requests` library).
<code>batch_input_file = client.files.create(
    file=open("batchinput.jsonl", "rb"),
    purpose="batch",
)
batch_input_file_id = batch_input_file.id

client.batches.create(
    input_file_id=batch_input_file_id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
    metadata={"description": "nightly eval job"},
)
</code>
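My understanding of how those two SDK calls map onto raw HTTP is sketched below. This is only an assumption based on the public OpenAI REST endpoints (`POST /v1/files` as a multipart form upload, `POST /v1/batches` as a JSON body, both with an `Authorization: Bearer` header); the function names and the `OPENAI_API_KEY` environment variable are my own:

```python
import os
import requests

API_KEY = os.environ.get("OPENAI_API_KEY", "")  # assumed: a valid key is set here
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def batch_payload(input_file_id):
    # JSON body for POST /v1/batches; mirrors the client.batches.create call above.
    return {
        "input_file_id": input_file_id,
        "endpoint": "/v1/chat/completions",
        "completion_window": "24h",
        "metadata": {"description": "nightly eval job"},
    }

def upload_batch_file(path="batchinput.jsonl"):
    # POST /v1/files is a multipart upload; 'purpose' travels in the form data.
    with open(path, "rb") as f:
        resp = requests.post("https://api.openai.com/v1/files",
                             headers=HEADERS,
                             data={"purpose": "batch"},
                             files={"file": f})
    resp.raise_for_status()
    return resp.json()["id"]

def create_batch(input_file_id):
    # POST /v1/batches takes a JSON body, so pass json=, not data=.
    resp = requests.post("https://api.openai.com/v1/batches",
                         headers=HEADERS,
                         json=batch_payload(input_file_id))
    resp.raise_for_status()
    return resp.json()
```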
The following is what I did, but I am getting this error: `{'statusCode': 404, 'message': 'Resource not found'}`
<code>import json
import requests

data1 = {'purpose': 'batch'}

requests_list = []
id = 1
for request in bulk_requests:
    requests_list.append({"custom_id": str(id), "method": "POST",
                          "url": "/v1/chat/completions",
                          "body": {"model": model, "messages": request}})
    id += 1

with open('batchinput.jsonl', 'w') as file:
    for request in requests_list:
        file.write(json.dumps(request) + '\n')

myfile = {'file': open('batchinput.jsonl', 'rb')}
batch_input_file_response = requests.post("https://api.openai.com/v1/files",
                                          data=data1, files=myfile)
batch_input_file = batch_input_file_response.json()
print(batch_input_file)

data2 = {"input_file_id": batch_input_file.get('id'), "completion_window": "24h"}
batch_object_output = requests.post("https://api.openai.com/v1/batches", data=data2)
print(batch_object_output.json())
</code>
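Independent of the API calls, the JSONL-building part of my code can be checked locally. A small sketch with hypothetical helper names (`bulk_requests` is assumed to be a list of `messages` lists, as in the code above):

```python
import json

def build_jsonl_lines(bulk_requests, model):
    # One JSON object per line; each request needs a unique custom_id.
    lines = []
    for i, messages in enumerate(bulk_requests, start=1):
        lines.append(json.dumps({
            "custom_id": str(i),
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {"model": model, "messages": messages},
        }))
    return lines

def write_jsonl(path, lines):
    # Join with real newlines ('\n') so each request sits on its own line.
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```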