I am using LlamaIndex's Text2SQL capabilities to build a query engine over the Open Data available for Dubai, with Azure AI Search as my vector store.
The query engine works only when I create a new vector store and object index. If I instead retrieve the already stored index from Azure AI Search, I get a KeyError at query time.
Below is the article that I am following:
https://docs.llamaindex.ai/en/stable/examples/index_structs/struct_indices/SQLIndexDemo/
Below is the code I used to retrieve the already saved index from Azure AI Search.
## Imports (module paths as in the linked tutorial; adjust to your installed version)
import pickle

from azure.search.documents import SearchClient
from llama_index.core import StorageContext, VectorStoreIndex
from llama_index.core.indices.struct_store.sql_query import SQLTableRetrieverQueryEngine
from llama_index.core.objects import ObjectIndex
from llama_index.vector_stores.azureaisearch import AzureAISearchVectorStore, IndexManagement

## Reading the previously stored object_node_mapping
with open('obj_node_mapping.pkl', 'rb') as file:
    obj_node_mapping = pickle.load(file)

## Defining the search client & vector store
search_client = SearchClient(
    endpoint=search_service_endpoint,
    index_name=search_service_index_name,
    credential=credential,
)
vector_store = AzureAISearchVectorStore(
    search_or_index_client=search_client,
    index_management=IndexManagement.VALIDATE_INDEX,
    id_field_key="id",
    chunk_field_key="chunk",
    embedding_field_key="embedding",
    embedding_dimensionality=1536,
    metadata_string_field_key="metadata",
    doc_id_field_key="doc_id",
)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

## Retrieving the index (empty node list, so only the existing vector store is used)
index = VectorStoreIndex(
    [],
    storage_context=storage_context,
)

## Creating the object index & retriever
object_index = ObjectIndex(
    index=index,
    object_node_mapping=obj_node_mapping,
)
obj_retriever = object_index.as_retriever(similarity_top_k=3)

## Initializing the query engine (sql_database is defined earlier, as in the tutorial)
query_engine = SQLTableRetrieverQueryEngine(
    sql_database, obj_retriever
)

## Querying
response = query_engine.query("What is the contribution of manufacturing sector towards GDP over the years.")
print(response.response)
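For reference, the mapping above was pickled at index-creation time. A minimal, self-contained sketch of that save/load round-trip (a plain dict stands in for the real node-mapping object here, assuming that object is picklable, which it was in my case):

```python
import os
import pickle
import tempfile

# Plain dict standing in for the real object_node_mapping instance
mapping = {"node-1": "city_gdp", "node-2": "manufacturing_output"}

path = os.path.join(tempfile.gettempdir(), "obj_node_mapping_demo.pkl")

# Save at index-creation time...
with open(path, "wb") as f:
    pickle.dump(mapping, f)

# ...and load at retrieval time; the round-trip preserves the object.
with open(path, "rb") as f:
    restored = pickle.load(f)

print(restored == mapping)  # True
```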
When I ask a question at the index-retrieval stage, it fails and throws a KeyError, but the same question works at the index-creation stage.
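My guess at the failure mode (an assumption on my part, not confirmed from the traceback): the retriever maps each retrieved node back to its table schema through a plain dict lookup, so if the nodes coming back from the persisted Azure AI Search index carry keys that are not present in the unpickled mapping, that lookup raises a KeyError. A toy illustration with plain dicts and made-up node IDs:

```python
# The pickled mapping was built from the nodes created in the original session
# (hypothetical IDs for illustration)...
obj_node_mapping = {"node-abc": "gdp_table_schema"}

# ...but the node retrieved from the persisted index may carry a different key.
retrieved_node_id = "node-xyz"

caught = None
try:
    schema = obj_node_mapping[retrieved_node_id]
except KeyError as err:
    caught = err

print(f"Lookup failed with KeyError: {caught}")  # Lookup failed with KeyError: 'node-xyz'
```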