I am trying the following code for entity extraction using an Azure OpenAI endpoint. The idea behind using an LLM is that I can leverage its language understanding for more advanced use cases over time.
import os

from langchain.chains import create_extraction_chain
from langchain_openai import AzureChatOpenAI

azure_api_key = "key comes here"
azure_endpoint = "endpoint comes here"
MODEL_NAME = "gpt-35-turbo"
api_version = "2023-09-15-preview"

os.environ["AZURE_OPENAI_API_KEY"] = azure_api_key
os.environ["AZURE_OPENAI_ENDPOINT"] = azure_endpoint

llm = AzureChatOpenAI(
    azure_deployment=MODEL_NAME,
    openai_api_version=api_version,
)

schema = {
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

chain = create_extraction_chain(schema, llm)

inp = """Return the information in JSON format: Names, age. For example, '{"name": "John", "age": 7, }'. Now, Peter is 5 years old, Agatha is twice his age and Charles is 1 year younger than Agatha."""
result = chain.invoke(inp)
I get an error:
JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 3 (char 4)
During handling of the above exception, another exception occurred:
OutputParserException Traceback (most recent call last)
OutputParserException: Could not parse function call data: Expecting property name enclosed in double quotes: line 2 column 3 (char 4)
I have tried wrapping the call in a try/except, but it looks like the issue arises while the schema is being passed; a sketch of what I tried is below.
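For reference, this is roughly the try/except I attempted. I am assuming OutputParserException (imported from langchain_core.exceptions, matching the class name in the traceback) is the right exception to catch; it only suppresses the traceback and does not give me a usable result:

from langchain_core.exceptions import OutputParserException

try:
    # Same call as above; the exception is raised inside the chain
    # when it tries to parse the model's function-call JSON.
    result = chain.invoke(inp)
    print(result)
except OutputParserException as exc:
    print(f"Extraction failed: {exc}")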