I’m trying to upload large files to S3 like this:
https://www.baeldung.com/aws-s3-multipart-upload
// Prepare the parts to be uploaded
List<CompletedPart> completedParts = new ArrayList<>();
int partNumber = 1;
ByteBuffer buffer = ByteBuffer.allocate(5 * 1024 * 1024); // part size (5 MB here; S3's minimum for every part except the last)

// Read the file and upload each part
try (RandomAccessFile file = new RandomAccessFile(filePath, "r")) {
    long fileSize = file.length();
    long position = 0;
    while (position < fileSize) {
        file.seek(position);
        buffer.clear(); // reset the buffer before reusing it for the next part
        int bytesRead = file.getChannel().read(buffer);
        if (bytesRead == -1) {
            break;
        }
        buffer.flip();

        UploadPartRequest uploadPartRequest = UploadPartRequest.builder()
                .bucket(existingBucketName)
                .key(keyName)
                .uploadId(uploadId)
                .partNumber(partNumber)
                .contentLength((long) bytesRead)
                .build();
        UploadPartResponse response = s3.uploadPart(uploadPartRequest, RequestBody.fromByteBuffer(buffer));

        // Keep each part's ETag for the final CompleteMultipartUpload call
        completedParts.add(CompletedPart.builder()
                .partNumber(partNumber)
                .eTag(response.eTag())
                .build());

        position += bytesRead;
        partNumber++;
    }
}
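Once every part is uploaded, I finish the upload roughly like this (a sketch, assuming the same `s3`, `existingBucketName`, `keyName`, and `uploadId`, and that `completedParts` holds a `CompletedPart` with part number and ETag for each uploaded part):

```java
CompleteMultipartUploadRequest completeRequest = CompleteMultipartUploadRequest.builder()
        .bucket(existingBucketName)
        .key(keyName)
        .uploadId(uploadId)
        .multipartUpload(CompletedMultipartUpload.builder()
                .parts(completedParts)
                .build())
        .build();
s3.completeMultipartUpload(completeRequest);
```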
Is it possible to also have a Spring REST API receive those chunks from a client? E.g. the client would send chunk 1 (5 MB), then chunk 2 (5 MB), but the server would start uploading chunk 1 to S3 before chunk 2 is received.
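To make the server side concrete, this is roughly what I have in mind (a sketch, not working code: the endpoint path, the injected `S3Client`, and the `my-bucket`/`my-key` values are placeholders I made up). Each chunk is forwarded to S3 as soon as it arrives, so part 1 can be in flight before part 2 is received:

```java
@RestController
public class ChunkUploadController {

    private final S3Client s3; // injected; configured elsewhere

    public ChunkUploadController(S3Client s3) {
        this.s3 = s3;
    }

    // The client POSTs each 5 MB chunk as soon as it has read it;
    // the server streams that chunk to S3 immediately.
    @PostMapping("/upload/{uploadId}/parts/{partNumber}")
    public String uploadPart(@PathVariable String uploadId,
                             @PathVariable int partNumber,
                             @RequestParam("file") MultipartFile chunk) throws IOException {
        UploadPartRequest request = UploadPartRequest.builder()
                .bucket("my-bucket")   // placeholder bucket/key
                .key("my-key")
                .uploadId(uploadId)
                .partNumber(partNumber)
                .contentLength(chunk.getSize())
                .build();
        UploadPartResponse response = s3.uploadPart(request,
                RequestBody.fromInputStream(chunk.getInputStream(), chunk.getSize()));
        return response.eTag(); // the caller collects ETags for the final complete call
    }
}
```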
How would that client be configured? It says here that RestTemplate can be used for multipart file requests:
Multipart file requests break a large file into smaller chunks and use
boundary markers to indicate the start and end of the block.
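As I understand it, the boundary markers in that quote belong to the multipart/form-data encoding of a single request; the slicing of the file into fixed-size chunks (one request per chunk) is something the client has to do itself with plain I/O. A minimal, framework-free sketch of that slicing, with tiny sizes for illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

public class Chunker {
    // Cut a stream into fixed-size parts; each part can be sent (to S3 or an
    // HTTP endpoint) as soon as it is read, before the next part exists.
    static List<byte[]> chunk(InputStream in, int partSize) throws IOException {
        List<byte[]> parts = new ArrayList<>();
        byte[] buf = new byte[partSize];
        int read;
        while ((read = in.readNBytes(buf, 0, partSize)) > 0) {
            byte[] part = new byte[read];
            System.arraycopy(buf, 0, part, 0, read);
            parts.add(part); // in the real flow: upload here instead of collecting
        }
        return parts;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[12]; // 12 bytes with part size 5 → parts of 5, 5, 2
        List<byte[]> parts = chunk(new ByteArrayInputStream(data), 5);
        System.out.println(parts.size());        // 3
        System.out.println(parts.get(2).length); // 2
    }
}
```

With a real file, the 5 MB part size from the upload code above would replace the toy size here.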
I assume I can use a multipart Spring API like this if the client is configured correctly?
https://spring.io/guides/gs/uploading-files
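If that approach is right, I imagine the client's per-chunk request looks something like this (a sketch with a placeholder URL; `part` is one 5 MB slice of the file and `partNumber` its 1-based index):

```java
// One POST per chunk; the server can start its S3 upload for part 1
// while the client is still reading and sending part 2.
RestTemplate restTemplate = new RestTemplate();

MultiValueMap<String, Object> body = new LinkedMultiValueMap<>();
body.add("file", new ByteArrayResource(part) {
    @Override
    public String getFilename() {
        return "part-" + partNumber; // a filename is required for MultipartFile binding
    }
});

HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.MULTIPART_FORM_DATA);

String eTag = restTemplate.postForObject(
        "http://localhost:8080/upload/" + uploadId + "/parts/" + partNumber,
        new HttpEntity<>(body, headers),
        String.class);
```

Is that the right way to wire the client up, or is there a better-suited abstraction for sending sequential chunks?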