import great_expectations as gx
from great_expectations.core.batch import RuntimeBatchRequest

# db_params holds my PostgreSQL connection details (placeholder values here)
db_params = {
    "user": "...",
    "password": "...",
    "host": "...",
    "port": "5432",
    "database": "...",
}

context = gx.get_context()
datasource_config = {
    "name": "my_postgres_datasource",
    "class_name": "Datasource",
    "execution_engine": {
        "class_name": "SqlAlchemyExecutionEngine",
        # Connection string assembled from db_params
        "connection_string": f"postgresql+psycopg2://{db_params['user']}:{db_params['password']}@{db_params['host']}:{db_params['port']}/{db_params['database']}",
    },
    "data_connectors": {
        "default_runtime_data_connector_name": {
            "class_name": "RuntimeDataConnector",
            "batch_identifiers": ["default_identifier_name"],
        },
        "default_inferred_data_connector_name": {
            "class_name": "InferredAssetSqlDataConnector",
            "include_schema_name": True,
        },
    },
}
context.add_datasource(**datasource_config)
datasource = context.get_datasource("my_postgres_datasource")
Above is my GX context, in which I have initialized my datasource.
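As a quick sanity check that the datasource is wired up, I list the assets each data connector can see. I am assuming Datasource.get_available_data_asset_names() behaves as in the GX 0.15.x v3 API, returning a dict keyed by connector name:

# Sanity check (assumption: get_available_data_asset_names() as in the v3 API).
# Prints the data assets visible to each configured data connector.
for connector_name, asset_names in datasource.get_available_data_asset_names().items():
    print(connector_name, "->", list(asset_names))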
# query1 and query2 are my two SQL queries (real queries omitted here)
query1 = "SELECT ..."  # placeholder
query2 = "SELECT ..."  # placeholder

# Query 1 RuntimeBatchRequest
batch_request_1 = RuntimeBatchRequest(
    datasource_name="my_postgres_datasource",
    data_connector_name="default_runtime_data_connector_name",
    data_asset_name="asset_1",
    runtime_parameters={"query": query1},
    batch_identifiers={"default_identifier_name": "default_identifier_1"},
)

# Query 2 RuntimeBatchRequest
batch_request_2 = RuntimeBatchRequest(
    datasource_name="my_postgres_datasource",
    data_connector_name="default_runtime_data_connector_name",
    data_asset_name="asset_2",
    runtime_parameters={"query": query2},
    batch_identifiers={"default_identifier_name": "default_identifier_2"},
)
Above are my two RuntimeBatchRequests to handle two separate queries.
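To rule out the requests themselves, I print what each one carries; as far as I know, RuntimeBatchRequest exposes data_asset_name, batch_identifiers, and runtime_parameters as attributes in this version:

# Check that the two batch requests really carry different queries.
for br in (batch_request_1, batch_request_2):
    print(br.data_asset_name, br.batch_identifiers, br.runtime_parameters["query"])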
validator_query2 = context.get_validator(batch_request=batch_request_2, data_asset_name="asset_2")
validator_query1 = context.get_validator(batch_request=batch_request_1, data_asset_name="asset_1")
Above are my validators, but the issue is that the last call, validator_query1, overrides validator_query2, giving me two validators with the same query. I don't understand why: a single context should be able to handle multiple batches, yet here it fails to. Can someone please help me with this?
I wanted one validator per query, but instead I am getting two validators with the same query, where the last one overrides the first.
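For reference, this is roughly how I observed the override; I am assuming the Validator exposes active_batch_id and active_batch (names as in the 0.15.x v3 API):

# Both validators end up reporting the same query, which is the problem.
for name, validator in [("validator_query1", validator_query1), ("validator_query2", validator_query2)]:
    print(name, validator.active_batch_id)
    print(name, validator.active_batch.batch_request.runtime_parameters["query"])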