I have been experimenting with creating an in-memory SQLite database in R. I have written a program which suggests that generating an in-memory database, and populating it with a table containing 100 million real numbers, has ZERO effect on the amount of memory consumed by R. This leads me to suspect that the so-called “in-memory” database is bogus and is really stored on disk. I attach the program that leads me to this conclusion.
On my machine, after I’ve created the in-memory database and populated it with a large table, and after I’ve deleted both the matrix (mat) and the data frame (dfr) used to build the table, the amount of memory consumed is identical to the memory consumed at the beginning of the program, i.e., the in-memory database is consuming ZERO memory.
Am I right, or am I missing something?
Operating system: Windows 11
R version 4.3.1 Patched (2023-10-10 r85312 ucrt) — “Beagle Scouts”
Here is the complete program:
library(DBI)
library(RSQLite)
sqlite_in_memory0 <- function() {
  # Report the total memory used by R objects, in Mb, as seen by gc().
  mem_report <- function() {
    mem_used <- gc()
    sum(mem_used[, "(Mb)"])
  }
  message('Memory: Before creating dfr: ', mem_report())
  # Generate a data frame of 1e8 random normals (10 columns).
  mat <- matrix(rnorm(1e8), ncol = 10)
  dfr <- as.data.frame(mat)
  rm(mat)
  gc()
  colnames(dfr) <- letters[1:10]
  # Check the size of dfr.
  print(paste0('Size of dfr: ', format(object.size(dfr), units = 'Gb')))
  message('Memory: After creating dfr: ', mem_report())
  # Connect to an in-memory database.
  conx <- dbConnect(RSQLite::SQLite(), ":memory:")
  # Disconnect from the in-memory database on exit.
  on.exit(dbDisconnect(conx))
  # Insert the data.
  dbWriteTable(conx, 'dfr', dfr)
  message('Memory: After creating in-memory table: ', mem_report())
  # Drop dfr. What is the effect on memory?
  rm(dfr)
  message('Memory: After rm(dfr): ', mem_report())
  # Retrieve the data from the in-memory database.
  result <- dbReadTable(conx, 'dfr')
  message('dim(result):')
  print(dim(result))
  message('Memory: After dbReadTable: ', mem_report())
}
sqlite_in_memory0()
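As a possible cross-check (a minimal sketch, assuming the ps package is installed; rss_mb is just a helper I made up for this test), one could measure the memory of the whole R process at the OS level, since gc() only reports allocations made through R's own memory manager:

library(ps)

# Resident set size (RSS) of the whole R process, in Mb.
# Unlike gc(), this counts all process memory, including allocations
# made outside R's managed heap (e.g. by C libraries such as SQLite).
rss_mb <- function() {
  round(ps_memory_info(ps_handle())[['rss']] / 1024^2)
}

message('Process RSS (Mb): ', rss_mb())

If the process RSS were to jump after dbWriteTable() while the gc() totals stay flat, that would suggest the table really is held in memory, just invisibly to R's garbage collector.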