I have a Nest.js (Node.js) application that uploads files to AWS S3. During the upload, I’m using a callback function to track and send live progress updates to the client. Here’s the code for the file upload:
import { Upload } from '@aws-sdk/lib-storage';

export async function uploadFileAWS(key: string, file, progressCallback: (progress: number) => void) {
  const upload = new Upload({
    client: s3Client,
    params: {
      Bucket: process.env.AWS_S3_BUCKET_NAME,
      Key: key,
      Body: file.buffer,
      ContentType: file.mimetype,
    },
    leavePartsOnError: false,
  });
  upload.on('httpUploadProgress', (progress) => {
    // progress.loaded and progress.total are optional, so guard the division
    if (progress.loaded === undefined || !progress.total) return;
    const percentage = Math.round((progress.loaded / progress.total) * 100);
    progressCallback(percentage);
  });
  await upload.done();
}
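As a side note on the progress math: in `@aws-sdk/lib-storage`, `progress.loaded` and `progress.total` are both optional, so the division is worth guarding. Here is a standalone sketch of the guarded calculation I use (`toPercent` is my own helper name, not part of the SDK):

```typescript
// Standalone sketch of the guarded percentage calculation (toPercent is a
// hypothetical helper name, not part of @aws-sdk/lib-storage).
function toPercent(loaded?: number, total?: number): number | null {
  // Both fields are optional on the SDK's progress object; total can also be 0.
  if (loaded === undefined || total === undefined || total === 0) {
    return null;
  }
  return Math.round((loaded / total) * 100);
}

// Example: half of a 10 MiB upload reported as 50%.
console.log(toPercent(5 * 1024 * 1024, 10 * 1024 * 1024));
```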
Locally, everything works fine. When testing with Postman and my frontend client, they receive the response in a live stream format, and the progress of the file upload can be tracked in real-time.
However, after deploying the NestJS application to an AWS EC2 machine, the progress updates stop working as expected. The file is uploaded successfully, but the client only receives all the responses at the end, instead of progressively during the upload.
In the controller handling the request, I’m setting the headers for Server-Sent Events (SSE) like this:
async uploadFile(@Body('type') type: FileType, @UploadedFile() file, @Res() res) {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Connection', 'keep-alive');
  res.setHeader('Cache-Control', 'no-cache');
  try {
    const key = await this.fileUploadService.uploadFile(type, file, (progress: number) => {
      // SSE messages must end with a blank line, i.e. "\n\n"
      res.write(
        `data: {"msg":"File Uploading in progress","data":{"progress":${progress}},"success":"true"}\n\n`
      );
      res.flush();
    });
    const url = `https://${process.env.AWS_S3_BUCKET_NAME}.s3.amazonaws.com/${key}`;
    res.write(
      `data: {"msg":"File Uploaded Successfully","data":{"key":"${key}","url":"${url}"},"success":"true"}\n\n`
    );
  } catch (error) {
    console.error(error);
    res.write(`data: {"msg":"Error uploading file","success":"false"}\n\n`);
  } finally {
    res.end();
  }
}
I’m not sure why this is happening after deployment. Could there be something specific to AWS EC2 or the network configuration that is causing the responses to be buffered and sent all at once instead of in real-time?
Any guidance or suggestions to resolve this issue would be greatly appreciated.
Initially, I suspected that the issue was caused by the NGINX proxy I was using. However, even after temporarily removing the NGINX proxy and hitting the Node server directly, the issue persisted.
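For completeness, these are the SSE-related directives that usually matter when NGINX sits in front (the location path and upstream port below are placeholders for my setup); removing the proxy entirely should have the same effect as setting them, which is why I ruled NGINX out:

```nginx
location /upload {
    proxy_pass http://127.0.0.1:3000;   # upstream NestJS app (port assumed)
    proxy_http_version 1.1;
    proxy_set_header Connection '';
    proxy_buffering off;                # do not buffer the upstream response
    proxy_cache off;
    proxy_read_timeout 3600s;           # keep long-lived SSE connections open
}
```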