When I executed the SSIS package, I received the following error:
The buffer manager has allocated 104,857,600 bytes, even though memory pressure has been detected and repeated attempts to swap buffers have failed.
Do you have any suggestions, please?
I was thinking of increasing the RAM on the production server, but I cannot restart it right now. Server patching is scheduled for the 18th of July, so after the patch I will increase the RAM and restart the server. In the meantime, I would like to know how to resolve this.
This error is related to how SSIS manages memory for data flows. The default maximum buffer size is 10,485,760 bytes (10 MB). Your message indicates that this value was increased to the maximum of 100 MB, or that AutoAdjustBufferSize was set to True.
Using AutoAdjustBufferSize is the way to go, because then you do not need to fuss with DefaultBufferMaxRows and DefaultBufferSize to optimize memory utilization, and the package will adapt to however much memory is available on whatever machine it runs on. Each environment will usually have a different amount of memory available.
Some other points to consider:
- A data flow task will use a maximum of 5 buffers at a time, which caps memory at about 500 MB per data flow (5 × 100 MB). This memory needs to be allocated across threads.
- Within a data flow, the EngineThreads property suggests to SSIS how much parallelism to use across sources, transformations, and destinations. Because this does not give direct control over how things get executed, it is much more effective to manage parallelism through data flow design. That is, instead of having 10 sources and destinations in one data flow task, it is better to separate each into its own data flow. Less complicated data flows also use memory more efficiently. Consider doing work in the database rather than in SSIS; i.e., stage data and update a target table in the database rather than using sorts, lookups, and OLE DB destinations.
- The number of data flows that can run at once is managed by a control flow property called MaxConcurrentExecutables. By default this is set to -1, which means the number of processors plus 2. This is another way to limit parallel execution so that you do not have too many data flows firing at once. However, it is more strategic to separate executable tasks into containers and tie the containers together with precedence constraints.
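To make the arithmetic above concrete, here is a small sketch. The 100 MB figure comes from the error message, the 5-buffer cap and the processors-plus-2 default are the rules described above, and the 8-processor count is a hypothetical server, not yours.

```shell
# Worst-case memory for one data flow: up to 5 in-flight buffers,
# each at the 100 MB maximum buffer size reported in the error.
buffer_bytes=104857600   # 100 MB per buffer
max_buffers=5            # per-data-flow buffer cap
echo "$((buffer_bytes * max_buffers)) bytes per data flow"   # prints 524288000 bytes per data flow

# MaxConcurrentExecutables = -1 resolves to processors + 2, so a
# hypothetical 8-processor server can fire 10 executables at once.
processors=8
echo "$((processors + 2)) concurrent executables"            # prints 10 concurrent executables
```

Multiplying the two figures shows why several busy data flows can exhaust memory long before the server itself looks full.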
Now that you know how memory is allocated, you can check your server and see whether it indeed needs more memory to support both the OS and SSIS. That would be a first step, in case available memory is terribly low. The second step would be to review the design considerations above and optimize memory use in your packages. This should be done to ensure performance over time and to establish a design strategy for new packages.