I'm building a chatbot with ChatGPT and Flask.
The problem is that the way I'm calling the ChatGPT API doesn't carry the conversation across requests, so I'm looking for a way to continue a conversation between calls.
At the same time, I want to avoid any approach that holds so much in memory that it could crash the system.
This is the code I'm currently using to handle a single conversation turn with ChatGPT through the Flask server:
import openai
from flask import Flask, request, jsonify

app = Flask(__name__)
OPENAI_API_KEY = "sk-..."  # placeholder; my real key is loaded elsewhere

@app.route('/ask', methods=['POST'])
def handle_query():
    data = request.json
    prompt = data.get('prompt', '')
    # Each request builds a fresh messages list, so the model never sees earlier turns.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are an assistant."},
            {"role": "user", "content": prompt},
        ],
        api_key=OPENAI_API_KEY,
    )
    gpt_response = response.choices[0]['message']['content']
    return jsonify({"response": gpt_response})