I’m developing an application that consumes the output of the command ssh -p29418 [email protected] gerrit stream-events -s comment-added.
This command outputs events as JSON, one event per line.
{"type":"comment-added",change:{"project":"tools/gerrit", ...}, ...}
Here’s example code from my application:
import com.jcraft.jsch.ChannelExec;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
import java.io.BufferedReader;
import java.io.InputStreamReader;

// JSch is the SSH client library
JSch client = new JSch();
Session session = client.getSession("username", "host", 29418); // Gerrit SSH port from the command above
session.setPassword("password");
session.connect();

// The channel type for running a remote command is "exec"
ChannelExec channelExec = (ChannelExec) session.openChannel("exec");
channelExec.setCommand("gerrit stream-events -s comment-added");

// Grab the command's stdout before connecting the channel
InputStreamReader reader = new InputStreamReader(channelExec.getInputStream());
BufferedReader bufferedReader = new BufferedReader(reader);
channelExec.connect();

// Each line read from the channel is one JSON event
String event;
while ((event = bufferedReader.readLine()) != null) {
    handleEvent(event);
}
As a contrast, I started two instances of this application (the first instance’s handleEvent method returns in 1 s, the second instance’s handleEvent returns in 20 s). Imagine that after running for a long time, the first instance has recorded all events sent by the server, but the second instance has randomly lost events. Is this possible?
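To be concrete, the two instances differ only in how long handleEvent takes; a minimal sketch of what I mean, with Thread.sleep standing in for the real work (PROCESSING_MILLIS is just an illustrative constant, not from my real code):

// The two deployments differ only in this delay.
static final long PROCESSING_MILLIS = 1_000; // 20_000 in the slow instance

void handleEvent(String event) {
    try {
        // Placeholder for real processing; only the duration matters for this question.
        Thread.sleep(PROCESSING_MILLIS);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}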
- What happens to the event producer when the consumer (the second app) consumes events more slowly than the producer (gerrit stream-events)?
- I’ve read some pages explaining that a socket sender cannot keep sending bytes until the receiver’s recv_buf has enough space. Will this limitation block the process from writing messages to stdout? (See the small local experiment sketched after this list.)
- What’s the difference between a socket’s write_buf and a Java program’s System.out write buffer? What happens if the Java program’s stdout is redirected to a socket, but both the sender’s write_buf and the receiver’s recv_buf are full?
- How does the ssh server side redirect the stdout of the gerrit stream-events command and deliver it to the client side?
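To make the buffer questions concrete, here is a small, self-contained experiment I have in mind (the class name BlockingWriteDemo and the chunk size are just illustrative, not part of my application): the sender keeps writing to a peer that never reads, and I expect write() to block once the sender’s send buffer and the receiver’s receive buffer are both full. I assume something similar is what would eventually push back on the gerrit process, which is part of what I’m asking.

import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Hypothetical local experiment: write to a peer that never reads.
// Once the local send buffer and the remote receive buffer are full,
// out.write() blocks and the producer stops making progress.
public class BlockingWriteDemo {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0);
             Socket sender = new Socket("127.0.0.1", server.getLocalPort());
             Socket receiver = server.accept()) {          // held open, but never read from
            OutputStream out = sender.getOutputStream();
            byte[] chunk = new byte[64 * 1024];
            long total = 0;
            while (true) {
                out.write(chunk);                          // eventually blocks here
                total += chunk.length;
                System.out.println("bytes written so far: " + total);
            }
        }
    }
}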