Hello, I have the following code:
private void CompareFiles(string localPath, string remotePath, ConcurrentBag<FileInfo> filesToUpdate, string partPath)
{
    var localFiles = new Dictionary<string, FileInfo>(StringComparer.OrdinalIgnoreCase);

    using (var localStream = new StreamReader(localPath, Encoding.UTF8))
    using (var remoteStream = new StreamReader(remotePath, Encoding.UTF8))
    {
        // Index the local file list by path.
        string? localLine;
        while ((localLine = localStream.ReadLine()) != null)
        {
            var localFileInfo = ParseInfoLine(localLine.AsSpan());
            if (localFileInfo.HasValue)
            {
                localFiles[localFileInfo.Value.Path] = localFileInfo.Value;
            }
        }

        // Collect every remote entry that is missing locally
        // or whose checksum differs from the local one.
        string? remoteLine;
        while ((remoteLine = remoteStream.ReadLine()) != null)
        {
            var remoteFileInfo = ParseInfoLine(remoteLine.AsSpan());
            if (remoteFileInfo.HasValue)
            {
                if (!localFiles.TryGetValue(remoteFileInfo.Value.Path, out var localFileInfo) ||
                    localFileInfo.Checksum != remoteFileInfo.Value.Checksum)
                {
                    filesToUpdate.Add(AddPartToFileInfo(remoteFileInfo.Value, in partPath));
                }
            }
        }
    }
    //GC.Collect();
    //GC.WaitForPendingFinalizers();
}
I’m wondering whether calling GC.Collect() after the comparison finishes is a good option. I’m comparing many files, each with about a million lines, and I know the parsed data is no longer needed afterwards, because I create objects of a different type from it. Without the GC.Collect() call, RAM usage keeps growing during the operation. What do you think about that? What could I do instead?
While debugging, the GC is quite lazy and simply waits while the app’s memory usage grows and grows. This is of course caused by the line reading and parsing.
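For reference, the explicit collection I’m considering would look roughly like this (a sketch; the `GcHelper` name is just for illustration, but `GC.Collect` and `GCSettings.LargeObjectHeapCompactionMode` are standard `System`/`System.Runtime` APIs):

```csharp
using System;
using System.Runtime;

static class GcHelper
{
    // Forces a full, blocking, compacting gen-2 collection.
    // This is the heavy-handed variant I would call once,
    // right after the whole batch of CompareFiles calls has finished.
    public static void ForceFullCollection()
    {
        // Ask the runtime to also compact the large object heap on the
        // next gen-2 GC; large temporary buffers can otherwise fragment it.
        GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;

        GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced, blocking: true, compacting: true);
        GC.WaitForPendingFinalizers();
        // Collect again so that objects freed by finalizers are reclaimed too.
        GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced, blocking: true, compacting: true);
    }
}
```

Is this kind of one-shot forced collection reasonable here, or is it just papering over the allocation pattern of the read/parse loop?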