As the title says, I am trying to save a large matrix to an SQLite database in Julia. The matrix has 500 rows and 1000 columns.
When I try to save the matrix, the call hangs and never finishes. However, I realized that I can easily save a matrix of dimensions 50000×10, which has the same number of cells (see the sketch below).
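For example, reshaping the wide matrix into a tall, narrow one of the same size saves fine. This is just a rough sketch; the table name mat_narrow is a placeholder I made up, and the original dimensions would have to be stored somewhere in order to reshape the data back:

using SQLite
using Tables

M = zeros(500, 1000)
narrow = reshape(M, 50000, 10)  # same cells, far fewer SQL columns

db = SQLite.DB()  # in-memory database
SQLite.load!(Tables.table(narrow), db, "mat_narrow")

# To restore the original, read the table back and reshape(result, 500, 1000).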
- Is there any way to save a matrix with many columns to an SQLite database more efficiently? I could just save the relevant CartesianIndices (see the long-format sketch after this list), but I would rather save the matrix itself.
- Why does the number of columns affect run time so much more than the number of rows? Does this have something to do with the way SQLite works?
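Here is a rough sketch of the CartesianIndices workaround I mean: store the matrix in "long" form as a three-column table (i, j, value), so the SQL table never needs many columns. The table/column names and the two helper functions are placeholders I made up, not anything SQLite.jl provides:

using SQLite
using DataFrames

# Save an m×n matrix as a 3-column "long" table: (i, j, value).
function save_long!(db::SQLite.DB, name::AbstractString, M::AbstractMatrix)
    idx = vec(CartesianIndices(M))
    df = DataFrame(i = [I[1] for I in idx],
                   j = [I[2] for I in idx],
                   value = vec(M))  # vec(M) matches the column-major index order
    SQLite.load!(df, db, name)
end

# Read the table back into a matrix of the original shape.
function load_long(db::SQLite.DB, name::AbstractString, m::Integer, n::Integer)
    df = DataFrame(SQLite.DBInterface.execute(db, "SELECT i, j, value FROM $name"))
    M = zeros(m, n)
    for (i, j, v) in zip(df.i, df.j, df.value)
        M[i, j] = v
    end
    return M
end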
Here is some sample code that illustrates how much slower having extra columns is:
# test SQLite save speed for different matrix shapes
using SQLite
using DataFrames
using Tables

db1 = SQLite.DB()  # in-memory database

@time SQLite.load!(Tables.table(zeros(5000, 10)), db1)   # 5000 rows × 10 columns
@time SQLite.load!(Tables.table(zeros(500, 100)), db1)   # 500 rows × 100 columns
OUTPUT:
1.237796 seconds (2.10 M allocations: 128.612 MiB, 3.90% gc time, 97.92% compilation time)
6.039620 seconds (1.29 M allocations: 80.599 MiB, 0.25% gc time, 99.86% compilation time)
We can see that the 500×100 case is about 5x slower than the 5000×10 case, even though both matrices have the same number of cells. (Note that, per the output, both calls are dominated by compilation time.)
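For what it's worth, since the output shows both calls spending almost all their time compiling, here is a sketch that warms each shape up first, so the subsequent @time calls measure runtime only:

using SQLite
using Tables

# Warm-up calls: let SQLite.load! compile once for each table schema
SQLite.load!(Tables.table(zeros(5000, 10)), SQLite.DB())
SQLite.load!(Tables.table(zeros(500, 100)), SQLite.DB())

# These timings now exclude compilation
db = SQLite.DB()
@time SQLite.load!(Tables.table(zeros(5000, 10)), db)
@time SQLite.load!(Tables.table(zeros(500, 100)), db)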
Thank you for your help.