My office primarily uses R for data analysis, although I use Python. A handful of people use Excel. Our office does a single data pull each day for all of our analyses, so if we get different answers, we know it's not because of us hitting the databases at different times. The major problem with this is that we save that data pull as an .RData file, which can only be opened in R.
I'm working on converting that data file into something more universal, such as JSON. That way Python can open it easily, and I believe Excel can open it through Power Query or something similar (I don't use Excel much for analysis). This would make the data accessible to everyone in our office without juggling multiple programs every time we do something.
The problem I'm running into is that I can't find a good way to access each individual item saved in the .RData file and write it out as JSON. There appear to be 56 data frames and variables saved in the .RData file, and I want to convert each of the data frames to its own JSON file.
I'm open to other formats if someone has a good suggestion, but for context, our IT department is completely incompetent, so we can't get software upgrades, and we can't install Python or R packages ourselves. Our installed R libraries can't handle HDF5 or parquet, and our installed Python packages can't handle parquet either. I also don't want to completely screw over the Excel people, although they've by necessity learned some R, so it doesn't need to be usable in Excel.
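For the Excel folks specifically, one zero-dependency fallback I'm considering is plain CSV via base R's write.csv, which needs no extra packages and opens directly in Excel. A rough sketch ("my_data.RData" stands in for our real pull file):

```r
# Dump every data frame in an .RData file to its own CSV, base R only.
# Load into a fresh environment so nothing in the workspace gets clobbered.
e <- new.env()
obj_names <- load("my_data.RData", envir = e)  # load() returns the object names

for (name in obj_names) {
  obj <- get(name, envir = e)
  if (is.data.frame(obj)) {
    write.csv(obj, file = paste0(name, ".csv"), row.names = FALSE)
  }
}
```

The downside is 56 separate files and no type information, but it would keep Excel in the loop without any new software.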
I've tried

test_import <- load(my_data)
print(test_import)

but so far I've just been able to get a list of the names of the items in the file, since load() returns a character vector of object names rather than the objects themselves (the objects get loaded into the calling environment).
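To actually get at the objects, the trick seems to be using the names that load() returns together with get() on the environment the file was loaded into, then writing each data frame out with jsonlite. This is a sketch assuming jsonlite is among our installed packages (if it isn't, this won't work as-is) and "my_data.RData" is a placeholder for the real file:

```r
library(jsonlite)  # assumption: jsonlite is already installed

# Load into a fresh environment so the workspace isn't clobbered.
e <- new.env()
obj_names <- load("my_data.RData", envir = e)  # character vector of object names

for (name in obj_names) {
  obj <- get(name, envir = e)
  if (is.data.frame(obj)) {
    # dataframe = "rows" writes a list-of-records layout, which
    # Python's json.load() turns into a list of dicts.
    write_json(obj, path = paste0(name, ".json"), dataframe = "rows")
  }
}
```

The non-data-frame variables could either go into one combined metadata JSON the same way, or just stay behind in the .RData file for the R users.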