We receive daily SQL Server database backups from a third-party vendor and restore them. Once the database is online, we use bcp to export every table's data to .csv files for other applications and vendors to consume. When the database is small (around 20 GB) this works fine, but when the data volume is larger (over 50 GB) everything slows down. These are nightly jobs, and sometimes even after 10 hours the process appears hung with only partial data written to storage. Are there any options to optimize the performance of this PowerShell script?
PowerShell:
$serverName = "myserver"
$databaseName = "mydb"
$outputPath = "\mystoragedata"
Import-Module SqlServer
foreach ($table in (Get-SqlDatabase -ServerInstance $serverName | Where-Object {$_.Name -eq $databaseName} | Select-Object -ExpandProperty Tables)) {
    $tableName = $table.Name
    $filePath  = Join-Path $outputPath "$tableName.csv"

    # Look up the table's column names (not currently used by the export itself)
    $columnsQuery = "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = '$tableName'"
    $columns = Invoke-Sqlcmd -ServerInstance $serverName -Database $databaseName -Query $columnsQuery -TrustServerCertificate | Select-Object -ExpandProperty COLUMN_NAME
    # Build the bcp arguments; embedded quotes around the query and file path must be escaped
    $bcpArgs = "`"SELECT * FROM [$databaseName].dbo.[$tableName]`" queryout `"$filePath`" -c -t, -T -S $serverName"

    # Run bcp and wait for it to finish before moving on to the next table
    Start-Process -FilePath "bcp" -ArgumentList $bcpArgs -Wait -NoNewWindow

    Write-Host "Exported $tableName to $filePath"
}
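
For reference, here is a minimal sketch of the same per-table export with bcp invoked directly as a native command instead of through Start-Process. It reuses the $serverName, $databaseName, $tableName, and $filePath variables from the loop above and assumes bcp.exe is on the PATH and that a trusted (-T) connection works; the export itself is unchanged, but quoting is simpler and the exit code is available.

# Minimal sketch: call bcp directly for one table (same variables as in the loop above)
$bcpArgs = @(
    "SELECT * FROM [$databaseName].dbo.[$tableName]"  # query to export
    'queryout'                                        # write the result set to a file
    $filePath                                         # destination .csv
    '-c'                                              # character (text) mode
    '-t,'                                             # comma field terminator
    '-T'                                              # trusted connection
    '-S', $serverName                                 # target server instance
)
& bcp @bcpArgs
if ($LASTEXITCODE -ne 0) {
    Write-Warning "bcp exited with code $LASTEXITCODE for table $tableName"
}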