I have a bash script that interacts with a legacy system that converts between file formats (e.g. pptx to pdf).
The remote system takes a JSON object with the pptx base64-encoded and returns another JSON object with the converted pdf base64-encoded.
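For reference, the request body looks like {"key": "value", "binary": "<base64-encoded pptx>"} and the response like {"result": "<base64-encoded pdf>"}; the field names and "url" in the scripts below are placeholders for the real API. The jq line here is only a sketch of an alternative way to build the request body that is valid JSON by construction, even when base64 wraps its output:

# Placeholder field names, as in the scripts below; gsub strips the line
# wrapping that some base64 implementations add by default.
base64 in.pptx | jq -Rs '{key: "value", binary: gsub("\n"; "")}'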
I am trying to decide whether it is a better idea to:
- call curl, pipe the response to jq, pipe that to base64 to decode it, and save the result
#!/bin/bash
(echo -n '{"key": "value", "binary": "'; base64 in.pptx; echo '"}') |
curl -s --location "url" --header 'Content-Type: application/json' --data @- |
jq -r '.result' |
base64 -d > out.pdf
Personally, I like the tidiness of option 1: there is no disk write and the stages of the pipeline run concurrently. It is running okay with several test files. However, I am concerned that it might encounter an error like "(23) Failed writing body" with some other files.
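As far as I understand, curl returns 23 when it cannot write the body it receives to its output, for example when the next command in the pipe exits early. If I stick with option 1, I would at least add set -o pipefail and check the pipeline's exit status so a truncated out.pdf does not go unnoticed; a rough sketch with the same placeholders:

#!/bin/bash
# Sketch only: same pipeline as option 1, but fail loudly if any stage
# (curl, jq, base64) exits non-zero instead of silently leaving a bad out.pdf.
set -o pipefail

if ! (echo -n '{"key": "value", "binary": "'; base64 in.pptx; echo '"}') |
     curl -s --fail --location "url" --header 'Content-Type: application/json' --data @- |
     jq -re '.result' |
     base64 -d > out.pdf
then
    echo "conversion pipeline failed" >&2
    rm -f out.pdf   # do not leave a partial file behind
    exit 1
fi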
- If option 1 doesn't work as expected in some cases, I would probably try something like calling curl with the -o option to save the response to a tmp file (with the JSON assembly wrapped in bwrap), running jq on that file, piping it to base64 to decode, and saving the result
#!/bin/bash
# bwrap needs an explicit command to run, so the subshell is passed via bash -c
bwrap --dev-bind / / --unshare-net bash -c \
  'echo -n "{\"key\": \"value\", \"binary\": \""; base64 in.pptx; echo "\"}"' |
curl -s --location "url" --header 'Content-Type: application/json' --data @- -o /tmp/encoded
jq -r '.result' /tmp/encoded |
base64 -d > out.pdf
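If I end up going with option 2, I would probably also replace the fixed /tmp/encoded path with mktemp and a cleanup trap, roughly like this (the bwrap wrapper is left out for brevity; same placeholders as above):

#!/bin/bash
# Sketch of option 2 with a private temp file that is removed on exit.
set -o pipefail

tmp=$(mktemp) || exit 1
trap 'rm -f "$tmp"' EXIT

(echo -n '{"key": "value", "binary": "'; base64 in.pptx; echo '"}') |
curl -s --fail --location "url" --header 'Content-Type: application/json' --data @- -o "$tmp" ||
    { echo "request failed" >&2; exit 1; }

jq -re '.result' "$tmp" | base64 -d > out.pdf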