In our system, we are trying to reduce the data transfer duration between two applications that communicate through RabbitMQ. To that end, we ran some tests with Protobuf and FlatBuffers. According to our tests, both tools are slower than a plain JSON.stringify() in Node.js.
Protobuf, via the protobufjs library, almost doubled the encoding time compared to JSON in every run in Node.js. Here is an example result:
Total time for encoding 1000 messages with proto: 42ms
Total time for encoding 1000 messages with normal buffer: 26ms
Here is the code we ran:
// load proto schemas and read JSON from a file
// ...
const payload = JSON.parse(stringifiedData);
const MESSAGE_COUNT = 1000;
let TOTAL_TIME_FOR_PROTO = 0;

for (let i = 0; i < MESSAGE_COUNT; i++) {
  const start = Date.now();
  const err = MyMessage.verify(payload);
  if (err) {
    throw Error(err); // verify() returns an error message string, not an Error
  }
  const protoBuffered = MyMessage.encode(payload).finish();
  const end = Date.now();
  // For the JSON test, the verify/encode lines above are replaced with:
  // const jsonBuffer = Buffer.from(JSON.stringify(payload));
  TOTAL_TIME_FOR_PROTO += end - start;
  sendMessageToProtoBuffered(protoBuffered);
}
console.log(`Total time for encoding ${MESSAGE_COUNT} messages with proto: ${TOTAL_TIME_FOR_PROTO}ms`);
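In case the measurement itself is at fault: we also considered a variant of the same loop that hoists verify() out of the timed region, converts the plain object to a message instance once with MyMessage.fromObject(), and times the whole loop with process.hrtime.bigint() instead of per-iteration Date.now() calls. A sketch (MyMessage, payload, and MESSAGE_COUNT are the same as above):

const verifyError = MyMessage.verify(payload);
if (verifyError) {
  throw Error(verifyError);
}
// fromObject() normalizes the plain JSON object into a Message instance once
// (enums, longs, nested types), so the loop below times encoding alone.
const message = MyMessage.fromObject(payload);

const start = process.hrtime.bigint();
for (let i = 0; i < MESSAGE_COUNT; i++) {
  MyMessage.encode(message).finish();
}
const end = process.hrtime.bigint();
console.log(`Encode only: ${Number(end - start) / 1e6}ms`);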
However, when we performed the same test in Go, both tools dramatically reduced the encoding/decoding time. Here is an example test result:
Sent 1000 messages.
Total time for encoding with proto buffer: 5 ms
Total time for encoding with normal buffer: 13 ms
CONSUMED Proto 1000 messages
CONSUMED Normal 1000 messages
Total time for proto consuming: 1.808826ms
Total time for normal consuming: 10.963076ms
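For completeness, the Node-side decode loop we would use to mirror the Go consuming numbers looks roughly like this (a sketch; receivedBuffers is a hypothetical array of Buffers taken from the queue):

let totalDecodeNs = 0n;
for (const buf of receivedBuffers) {
  const start = process.hrtime.bigint();
  MyMessage.decode(buf); // protobufjs runtime decode
  totalDecodeNs += process.hrtime.bigint() - start;
}
console.log(`Total time for proto consuming: ${Number(totalDecodeNs) / 1e6}ms`);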
Why does Protocol Buffers slow the process down so significantly in Node.js, while it has the opposite effect in languages like Go and Java? Is there a way to fix this?