I have a large dataset of over 10 million JSON records. What is the most efficient way to insert them into a ClickHouse table with a single String column? I am currently using clickhouse_driver, but it takes roughly one minute to load 100,000 records.
Current steps:
- read data from the source DB (each row is a JSON string)
- load the rows into a dataframe
- use insert_dataframe or execute to insert into the ClickHouse table (the target has just one String column holding the JSON rows); a sketch of this is below
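For reference, a minimal sketch of what I'm doing now; the table name json_rows, the column doc, and the sample rows are placeholders, and the source-DB read is stubbed out:

```python
import pandas as pd
from clickhouse_driver import Client

# use_numpy is required by clickhouse_driver for insert_dataframe
client = Client(host='localhost', settings={'use_numpy': True})

# Target table: a single String column, one JSON document per row
client.execute(
    'CREATE TABLE IF NOT EXISTS json_rows (doc String) '
    'ENGINE = MergeTree ORDER BY tuple()'
)

# Stand-in for the rows read from the source DB (each row a JSON string)
rows = ['{"id": 1, "name": "a"}', '{"id": 2, "name": "b"}']
df = pd.DataFrame({'doc': rows})

# Variant 1: insert_dataframe -- this is the step that currently takes
# about one minute per 100,000 records
client.insert_dataframe('INSERT INTO json_rows VALUES', df)

# Variant 2: execute with a list of dicts
client.execute(
    'INSERT INTO json_rows (doc) VALUES',
    df.to_dict('records'),
)
```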