I am trying to process a large (around 50 GB) file called master.zip, which contains user data.
Inside master.zip there are many files named child1.json.gz, child2.json.gz, etc.
Inside each child file there is a single JSON file (e.g. child1.json inside child1.json.gz).
Each child.json contains multiple minified JSON objects separated by newlines.
Structure of master.zip:

master.zip
├── child1.json.gz (contains child1.json)
├── child2.json.gz (contains child2.json)
└── ...

Structure of child.json inside each child.json.gz file:
Example:

{ ...JSON key-value data... }
{ ...JSON key-value data... }
{ ...JSON key-value data... }
{ ...JSON key-value data... }
Now, my final end goal is to process each JSON line one by one, extract one key from it called UserId, and store those UserIds in DynamoDB.
The main concern is that I don't want to load the whole 50 GB file into memory, because it will exhaust the memory of my application running in an ECS Java container.
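(For reference, the extraction and storage step itself is the easy part; a minimal sketch of what getUserId and the DynamoDB write could look like, using Jackson and the AWS SDK v2. The table name "Users", the attribute name "UserId", and the JSON field name "userId" are assumptions, not part of the original code:)

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;

import java.util.Map;

class UserIdSink {

    private static final ObjectMapper MAPPER = new ObjectMapper();
    private final DynamoDbClient dynamoDb = DynamoDbClient.create();

    // Parse one minified JSON line and return its userId field (field name is an assumption).
    String getUserId(String jsonLine) throws JsonProcessingException {
        return MAPPER.readTree(jsonLine).get("userId").asText();
    }

    // Store a single UserId in DynamoDB (table and attribute names are assumptions).
    void storeUserId(String userId) {
        dynamoDb.putItem(PutItemRequest.builder()
                .tableName("Users")
                .item(Map.of("UserId", AttributeValue.builder().s(userId).build()))
                .build());
    }
}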
I tried to process the file chunk-wise using DataBuffer, but after processing all (4) JSON lines from child1.json.gz's child.json and 2 lines from child2.json.gz,
I am getting:

java.util.zip.ZipException: unexpected EOF
Response line -----------------> { userId: 1, name : "devil1" }
Response line -----------------> { userId: 2, name : "devil2" }
Response line -----------------> { userId: 3, name : "devil3" }
Response line -----------------> { userId: 4, name : "devil4" }
Response line -----------------> { userId: 5, name : "devil5" }
Processed 4 lines from GZIP.
Response line -----------------> { userId: 6, name : "angel1" }
Response line -----------------> { userId: 7, name : "angel2" }
java.util.zip.ZipException: unexpected EOF
at java.util.zip.ZipInputStream.read(ZipInputStream.java:214)
at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:238)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:158)
at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:117)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
at java.io.BufferedReader.readLine(BufferedReader.java:389)
My service-layer method that calls the processLargeZipFile method of the LargeZipProcessor class:
public final Mono<ServerResponse> handleRequest(final ServerRequest originalRequest) {
    return largeZipProcessor.processLargeZipFile()
            .map(response -> {
                System.out.println("Response line -----------------> " + response);
                return response;
            })
            .then(ServerResponse.ok().bodyValue("Dump Completed"));
}
APPROACH – 1
My code for processing the master.zip file using DataBuffer:
package com.dynamodb.poc.service;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.stereotype.Service;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Flux;
import reactor.core.publisher.FluxSink;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

@Service
public class LargeZipProcessor {

    private final WebClient webClient;

    @Autowired
    public LargeZipProcessor(WebClient.Builder webClientBuilder) {
        this.webClient = webClientBuilder
                .baseUrl("s3-url")
                .build();
    }

    public Flux<String> processLargeZipFile() {
        return webClient.get()
                .retrieve()
                .bodyToFlux(DataBuffer.class)
                .flatMap(this::processZipBuffer);
    }

    private Flux<String> processZipBuffer(DataBuffer buffer) {
        return Flux.create(sink -> {
            // Each incoming DataBuffer chunk is wrapped in its own ZipInputStream here
            try (ZipInputStream zipInputStream = new ZipInputStream(buffer.asInputStream(true))) {
                ZipEntry entry;
                while ((entry = zipInputStream.getNextEntry()) != null) {
                    String entryName = entry.getName();
                    if (entryName.endsWith(".gz")) {
                        // Process each .gz file within the ZIP
                        try {
                            processGzipFile(zipInputStream, sink);
                        } finally {
                            zipInputStream.closeEntry(); // Ensure the entry is properly closed
                        }
                    }
                }
                sink.complete();
            } catch (Exception e) {
                sink.error(e);
            }
        });
    }

    private void processGzipFile(ZipInputStream zipInputStream, FluxSink<String> sink) {
        try {
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(new GZIPInputStream(zipInputStream), StandardCharsets.UTF_8));
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) { // Getting the error here
                count++;
                String registration = getUserId(line);
                sink.next(registration);
            }
            System.out.println("Processed " + count + " lines from GZIP.");
        } catch (Exception e) {
            e.printStackTrace();
            sink.error(e);
        }
    }

    private String getUserId(String jsonLine) {
        // Write logic to extract userId
        return jsonLine;
    }
}
Here I am getting:

java.util.zip.ZipException: unexpected EOF

at this line in the code: while ((line = reader.readLine()) != null)
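(One possible way to keep this approach would be to pipe the entire Flux<DataBuffer> into a single continuous InputStream instead of wrapping each chunk in its own ZipInputStream; a rough sketch, assuming Spring's DataBufferUtils and Reactor's boundedElastic scheduler, not something I have verified against the setup above:)

import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferUtils;
import reactor.core.publisher.Flux;
import reactor.core.scheduler.Schedulers;

import java.io.IOException;
import java.io.InputStream;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

final class DataBufferPipe {

    // Bridge a reactive DataBuffer stream into one blocking InputStream.
    // The returned stream must be consumed on a different thread than the writer.
    static InputStream toInputStream(Flux<DataBuffer> body) throws IOException {
        PipedInputStream in = new PipedInputStream(64 * 1024);
        PipedOutputStream out = new PipedOutputStream(in);
        DataBufferUtils.write(body, out)
                .subscribeOn(Schedulers.boundedElastic()) // write off the event loop
                .doOnTerminate(() -> {
                    try {
                        out.close(); // signals EOF to the reader
                    } catch (IOException ignored) {
                    }
                })
                .subscribe(DataBufferUtils.releaseConsumer()); // release each buffer once written
        return in; // wrap this in a single ZipInputStream and read entry by entry
    }
}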
APPROACH – 2
If I change the approach to use a byte array instead of DataBuffer, it works fine and I am able to process a smaller master file called master2.zip (same structure as master.zip) that is only a few KB in size.
package com.dynamodb.poc.service;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Flux;
import reactor.core.publisher.FluxSink;

import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

@Service
public class LargeZipProcessor {

    private final WebClient webClient;

    @Autowired
    public LargeZipProcessor(WebClient.Builder webClientBuilder) {
        this.webClient = webClientBuilder
                .baseUrl("s3-url")
                .build();
    }

    public Flux<String> processLargeZipFile() {
        return webClient.get()
                .retrieve()
                .bodyToMono(byte[].class)           // Download the entire file as a byte array
                .flatMapMany(this::processZipFile); // Process the ZIP once downloaded
    }

    private Flux<String> processZipFile(byte[] zipFileData) {
        return Flux.create(sink -> {
            try (ZipInputStream zipInputStream = new ZipInputStream(new ByteArrayInputStream(zipFileData))) {
                ZipEntry entry;
                while ((entry = zipInputStream.getNextEntry()) != null) {
                    String entryName = entry.getName();
                    if (entryName.endsWith(".gz")) {
                        // Process each .gz file within the ZIP
                        try {
                            processGzipFile(zipInputStream, sink);
                        } finally {
                            zipInputStream.closeEntry(); // Ensure the entry is properly closed
                        }
                    }
                }
                sink.complete();
            } catch (Exception e) {
                sink.error(e);
            }
        });
    }

    private void processGzipFile(ZipInputStream zipInputStream, FluxSink<String> sink) {
        try {
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(new GZIPInputStream(zipInputStream), StandardCharsets.UTF_8));
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) {
                count++;
                String registration = getUserId(line);
                sink.next(registration);
            }
            System.out.println("Processed " + count + " lines from GZIP.");
        } catch (Exception e) {
            e.printStackTrace();
            sink.error(e);
        }
    }

    private String getUserId(String jsonLine) {
        // Write logic to extract userId
        return jsonLine;
    }
}
The only disadvantage of this byte[] approach is that I have to load the entire file into memory up front. Is there a way to avoid that?
You need to use another web client, one that is capable of giving you an InputStream. Then pass that stream into a ZipInputStream. (Your first approach fails because each DataBuffer chunk gets wrapped in its own ZipInputStream, so the decompressor hits end-of-stream as soon as an entry spans a chunk boundary.)

The simplest client is already in the JDK:

InputStream stream = new URL("").openStream();

I don't recommend using it, though.
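(Expanded into a minimal sketch of that pattern; the URL is a placeholder, and the per-entry handling mirrors the code in the question:)

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class StreamingZipExample {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; substitute the real (presigned) S3 URL.
        InputStream raw = new URL("https://example.com/master.zip").openStream();
        try (ZipInputStream zis = new ZipInputStream(raw)) {
            ZipEntry entry;
            while ((entry = zis.getNextEntry()) != null) {
                if (entry.getName().endsWith(".gz")) {
                    // Do not close this reader: that would close the underlying zip stream.
                    BufferedReader reader = new BufferedReader(
                            new InputStreamReader(new GZIPInputStream(zis), StandardCharsets.UTF_8));
                    String line;
                    while ((line = reader.readLine()) != null) {
                        // extract the userId from `line` here
                    }
                }
                zis.closeEntry();
            }
        }
    }
}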
As per the suggestions from @talex and @MatthewFlaschen, I tried the Eclipse Jetty client and it seems to be working.

Final code:
package com.example.poc;

import org.eclipse.jetty.client.HttpClient;
import org.eclipse.jetty.client.util.InputStreamResponseListener;
import org.eclipse.jetty.util.ssl.SslContextFactory;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Flux;
import reactor.core.publisher.FluxSink;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.GZIPInputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

@Service
public class HttpClientZipProcessor {

    private final HttpClient httpClient;

    public HttpClientZipProcessor() throws Exception {
        SslContextFactory.Client sslContextFactory = new SslContextFactory.Client();
        this.httpClient = new HttpClient(sslContextFactory);
    }

    public Flux<String> processLargeZipFile() {
        return Flux.create(sink -> {
            InputStreamResponseListener listener = new InputStreamResponseListener();
            try {
                this.httpClient.start();
                List<BufferedReader> readers = new ArrayList<>();
                httpClient
                        .newRequest("s3-url")
                        .send(listener);
                // Get the response body as a streaming InputStream
                try (InputStream inputStream = listener.getInputStream();
                     ZipInputStream zipInputStream = new ZipInputStream(inputStream)) {
                    ZipEntry entry;
                    while ((entry = zipInputStream.getNextEntry()) != null) {
                        if (entry.getName().endsWith(".gz")) {
                            // Process each .gz file within the ZIP
                            try {
                                processGzipFile(zipInputStream, sink, readers);
                            } finally {
                                zipInputStream.closeEntry(); // Ensure the entry is properly closed
                            }
                        }
                    }
                    try {
                        System.out.println("Closing HttpClient");
                        readers.forEach(reader -> {
                            try {
                                reader.close();
                            } catch (Exception e) {
                                sink.error(e);
                            }
                        });
                        this.httpClient.stop();
                    } catch (Exception e) {
                        sink.error(e);
                    }
                    sink.complete();
                }
            } catch (Exception e) {
                sink.error(e);
            }
        });
    }

    private void processGzipFile(ZipInputStream zipInputStream, FluxSink<String> sink,
                                 List<BufferedReader> readers) throws IOException {
        try {
            // The reader is not closed here: closing it would close the shared zip stream,
            // so readers are collected and closed once the whole archive is processed.
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(new GZIPInputStream(zipInputStream), StandardCharsets.UTF_8));
            readers.add(reader);
            String line;
            while ((line = reader.readLine()) != null) {
                sink.next(extractRegistration(line));
            }
        } catch (Exception e) {
            sink.error(e);
        }
    }

    private String extractRegistration(String jsonLine) {
        return jsonLine; // Extract the registration/userId field from the JSON line
    }
}
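(An alternative worth noting, since the file lives on S3: the AWS SDK v2 S3Client can return a streaming InputStream directly, so no separate HTTP client is needed. A minimal sketch; the bucket and key names are assumptions:)

import software.amazon.awssdk.core.ResponseInputStream;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class S3StreamingExample {
    public static void main(String[] args) throws Exception {
        try (S3Client s3 = S3Client.create();
             // getObject streams the body; nothing is buffered beyond internal chunks
             ResponseInputStream<GetObjectResponse> body = s3.getObject(
                     GetObjectRequest.builder()
                             .bucket("my-bucket")   // assumption
                             .key("master.zip")     // assumption
                             .build());
             ZipInputStream zis = new ZipInputStream(body)) {
            ZipEntry entry;
            while ((entry = zis.getNextEntry()) != null) {
                // same per-entry GZIP handling as in the answer above
                zis.closeEntry();
            }
        }
    }
}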