I am trying to read and process a BigQuery table using Polars. At the end I want to write each row as JSON to a file such that all fields with null values (no matter how deeply nested the JSON is) are excluded.
I am new to Polars and have not been able to achieve this result.
My code looks like this:
    import json

    import polars as pl
    import pyarrow as pa


    def clean_json(data):
        """Serialize data to compact JSON after stripping null fields."""
        return json.dumps(remove_null(data), separators=(',', ':'))


    def remove_null(data):
        """Recursively remove fields with null values."""
        if isinstance(data, dict):
            return {k: remove_null(v) for k, v in data.items() if v is not None}
        elif isinstance(data, (list, pl.Series)):
            return [remove_null(item) for item in data if item is not None]
        return data
    def process_row(arrows):
        table = pa.Table.from_batches(arrows)
        df = pl.from_arrow(table)
        df = df.with_columns(
            pl.col("event_params")
            .map_elements(clean_json, return_dtype=pl.Utf8)
            .str.json_decode()
        )
        df.write_ndjson("file.json")
    query = f"select * from {self.table_name}"
    query_job = self.bq_client.query(query)
    arrow_iterable = query_job.result().to_arrow_iterable(
        bqstorage_client=self.bq_storage_client
    )
    arrows = []
    for arrow in arrow_iterable:
        arrows.append(arrow)
    .....
    if num_of_rows >= 50000:
        process_row(arrows)
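For what it's worth, the stripping helper does work as intended on plain Python data. A minimal self-contained check (the sample record is shaped like one entry of the output shown below):

```python
import json

def remove_null(data):
    # Recursively drop dict entries and list items whose value is None.
    if isinstance(data, dict):
        return {k: remove_null(v) for k, v in data.items() if v is not None}
    elif isinstance(data, list):
        return [remove_null(item) for item in data if item is not None]
    return data

record = {"key": "ignore_referrer",
          "value": {"string_value": "true", "int_value": None}}
print(json.dumps(remove_null(record), separators=(',', ':')))
# {"key":"ignore_referrer","value":{"string_value":"true"}}
```

So the nulls really are removed at this stage; they only reappear later in the pipeline.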
And the final result looks like this:
{"event_date":"20240324","event_timestamp":1711213604759724,"event_name":"event_name","event_params":[{"key":"ignore_referrer","value":{"string_value":"true","int_value":null}}.......
As you can see, the event_params still contain int_value fields even though the field value is null.
And I think this is expected: as soon as I call .str.json_decode(), the column is converted to a Struct, and when a field is absent in one row but present in another, the unified Struct dtype includes that field anyway, so the missing entries come back as nulls.
So, I would like to know how I can process the rows in Polars such that the final JSON lines contain no fields with null values, no matter how deeply nested the JSON is.