I'm using Moodle 4.1 and need to run a PHP CLI script that fills a CSV file with nearly a million rows of data.
<?php
define('CLI_SCRIPT', true);
require "config.php";

use core_reportbuilder\table\custom_report_table_view;

$table = custom_report_table_view::create(264);
$table->setup();
$table->download = 'excel';
$columns = (new table_dataformat_export_format($table, $table->download))->format_data($table->headers);
[$sql, $params] = $table->getPaginatedDataSQL();

$filePath = '/var/excelreport/tmp_264_' . date("d-m-Y_H:i:s") . '.csv';
$file = fopen($filePath, 'w');
if ($file === false) {
    // fopen() does not throw, it returns false on failure.
    echo "Failed to create '$filePath'.\n";
    exit(1);
}
// Write a UTF-8 BOM so spreadsheet applications detect the encoding.
fwrite($file, chr(0xEF) . chr(0xBB) . chr(0xBF));
echo "File '$filePath' successfully created.\n";

$limit = 10000;
$rowCount = $table->getRowsCount();
fputcsv($file, $columns, ';');

for ($offset = 0; $offset < $rowCount; $offset += $limit) {
    // $DB is the global database object created by config.php.
    $chunk = $DB->get_records_sql($sql . " LIMIT $limit OFFSET $offset", $params);
    foreach ($chunk as $row) {
        $row = array_values($table->format_row($row));
        fputcsv($file, $row, ';');
    }
    echo "\noffset $offset memory: " . (memory_get_usage() / 1024) . "KB\n";
    unset($chunk);
    gc_collect_cycles();
}

$table->close_recordset();
fclose($file);
In short, I fetch 10,000-row chunks from Moodle's database and append them to the file with fputcsv on each iteration. After each iteration I echo the memory usage, and it keeps increasing as the script runs.
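One thing I'm aware of: memory_get_usage() without arguments reports what PHP's internal allocator has handed out, not what the process actually holds from the OS. A minimal logging sketch for the loop body (my own addition, not part of the script above), printing both figures plus the peak, could help tell a real leak from the allocator simply keeping freed blocks around:

    // Sketch of a more detailed per-chunk memory report (would replace the echo above).
    $internal = round(memory_get_usage() / 1024);       // memory PHP's structures currently use
    $real     = round(memory_get_usage(true) / 1024);   // memory reserved from the system allocator
    $peak     = round(memory_get_peak_usage(true) / 1024);
    echo "offset $offset: internal {$internal}KB, real {$real}KB, peak {$peak}KB\n";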
Did I miss something that should be freed, or does PHP just grow its memory footprint in long-running tasks? As a solution, should I execute each chunk as a separate process with exec?
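If spawning a process per chunk is the way to go, the rough idea I have in mind is sketched below. Here export_chunk.php is a hypothetical worker script (it does not exist in Moodle) that would take a report id, offset, limit and target file, write one chunk and exit, so its memory is returned to the OS when it finishes:

    // Hypothetical driver: run one child process per chunk so each chunk's memory
    // is released when the child exits. $rowCount and $filePath as in the script above.
    $limit = 10000;
    for ($offset = 0; $offset < $rowCount; $offset += $limit) {
        $cmd = 'php export_chunk.php'
            . ' --report=264'
            . ' --offset=' . (int) $offset
            . ' --limit=' . (int) $limit
            . ' --file=' . escapeshellarg($filePath);
        exec($cmd, $output, $exitCode);   // blocks until the child process finishes
        if ($exitCode !== 0) {
            echo "Chunk at offset $offset failed with exit code $exitCode\n";
        }
    }

Is that extra process-startup cost per chunk worth it, or is there a simpler fix?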