I am writing a script that generates a large number of random 10-character codes, e.g. 5 million. I insert the generated codes into a database in batches of 10,000 so that I don't run into memory problems.
Now I want to write all 5 million codes into a CSV file for download, but the memory limit gets exceeded when I try it:
// batch size of 10000
$size = 10000;
$group = [];

// generate 5 million codes
for ($i = 1; $i <= 5000000; $i++) {
    $randomString = generateUniqueString($existingStrings);
    $group[] = $randomString;
    if (count($group) >= $size) {
        $db->beginTransaction();
        $db->insertCodes($group);
        $db->commitTransaction();
        $group = [];
    }
}

// insert the remaining codes from the last group
if (!empty($group)) {
    $db->beginTransaction();
    $db->insertCodes($group);
    $db->commitTransaction();
}
$handle = fopen($file, 'w');

// the problem probably occurs around here, when the entire result set
// gets buffered into $result at once
$result = $db->getAllCodes();

// write the codes to the CSV file
while ($row = mysqli_fetch_assoc($result)) {
    fputcsv($handle, [$row['Code']]);
}

fclose($handle);
$db->close();
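(For reference, generateUniqueString returns a random 10-character code that is not already in $existingStrings. A hypothetical, simplified sketch of such a helper, not my exact implementation:)

function generateUniqueString(array &$existingStrings): string
{
    $chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
    do {
        $code = '';
        for ($j = 0; $j < 10; $j++) {
            $code .= $chars[random_int(0, strlen($chars) - 1)];
        }
    } while (isset($existingStrings[$code])); // retry on collision
    $existingStrings[$code] = true; // remember the code as used
    return $code;
}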
Do you have any idea how I can also write the codes to the CSV file in batches, without exceeding the memory limit? I have tried a few methods, none of which worked, and I am stuck at this point of my code.
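From what I understand, mysqli buffers the whole result set in PHP memory by default (MYSQLI_STORE_RESULT), so one direction could be an unbuffered query that streams the rows one at a time. Here is a rough, untested sketch; it assumes the codes live in a table named codes with a Code column, and that my $db wrapper exposes its underlying mysqli connection as $db->conn:

$handle = fopen($file, 'w');

// MYSQLI_USE_RESULT gives an unbuffered result set: rows are fetched
// from the server as needed instead of being copied into PHP memory
// all at once
$result = mysqli_query($db->conn, 'SELECT Code FROM codes', MYSQLI_USE_RESULT);

while ($row = mysqli_fetch_assoc($result)) {
    fputcsv($handle, [$row['Code']]);
}

mysqli_free_result($result);
fclose($handle);

Would that work, or is paging the SELECT in chunks of 10,000 (mirroring the insert batches, e.g. with LIMIT and an increasing OFFSET) the better approach here?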