I’m looking for a decentralized way to exchange messages (JSON) between two Linux servers. The difficult part is making sure the messages still arrive in case of an outage. It’s crucial that no message gets lost and that the connection is re-established after an outage. So basically both servers need some kind of queue that is persisted to disk. Additionally, each server needs to send an ACK after a message arrives; only once it’s certain that a message has arrived can it be deleted.
Worst-case scenario: server 1 loses its connection while it’s sending messages and gets rebooted before the messages arrive on server 2.
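To make the requirement concrete, here is a minimal sketch of the send side of such a store-and-forward setup: an SQLite outbox on disk, and a sender that deletes a row only after the peer has acknowledged it. Everything specific in it (the peer URL, table name, and the assumption that the peer answers HTTP 200 only after durably storing the message) is made up for illustration, not part of any existing project.

```python
# Sketch: disk-backed outbox with ACK-before-delete semantics.
# Assumption: the peer replies 200 only after it has durably stored
# the message -- that 200 is treated as the ACK.
import json
import sqlite3
import time
import urllib.error
import urllib.request

PEER_URL = "http://server2.example:8080/ingest"  # hypothetical endpoint
DB_PATH = "outbox.db"                            # survives a reboot

def init_db():
    con = sqlite3.connect(DB_PATH)
    con.execute("CREATE TABLE IF NOT EXISTS outbox ("
                "id INTEGER PRIMARY KEY AUTOINCREMENT, payload TEXT NOT NULL)")
    con.commit()
    return con

def enqueue(con, message: dict):
    # The message is on disk before we ever try to send it.
    con.execute("INSERT INTO outbox (payload) VALUES (?)",
                (json.dumps(message),))
    con.commit()

def send_loop(con):
    while True:
        row = con.execute(
            "SELECT id, payload FROM outbox ORDER BY id LIMIT 1").fetchone()
        if row is None:
            time.sleep(1)
            continue
        msg_id, payload = row
        try:
            req = urllib.request.Request(
                PEER_URL, data=payload.encode(),
                headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req, timeout=10) as resp:
                if resp.status == 200:
                    # Delete only after the peer acknowledged receipt.
                    con.execute("DELETE FROM outbox WHERE id = ?", (msg_id,))
                    con.commit()
        except (urllib.error.URLError, OSError):
            # Outage: keep the row and retry later; a reboot in between
            # loses nothing because the row is still in outbox.db.
            time.sleep(5)

if __name__ == "__main__":
    con = init_db()
    enqueue(con, {"example": "payload"})
    send_loop(con)
```

Note this gives at-least-once delivery: if the ACK is lost after the peer stored the message, it will be resent, so the receiver would need to deduplicate (e.g. by a message ID).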
I tried MariaDB master-master replication on a table that stored the messages. This works really well until one of the servers restarts, at which point the binlog resets, replication breaks, and it has to be fixed manually. I was running MariaDB in Docker containers.
I looked into RabbitMQ, which would work in theory, but it needs a server (broker), and if that fails, no messages can be sent or received; there is also no queue in case of a server outage.
Socket.IO would be good if it saved the undelivered messages on the hard drive.
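If I went that route, the missing persistence could be bolted on roughly like this (a sketch only, assuming the python-socketio client; the event name "message", the spool directory, and the server URL are placeholders): every outgoing message is written to a spool file first, removed only inside the acknowledgement callback, and replayed on reconnect.

```python
# Sketch: Socket.IO client with a disk spool for undelivered messages.
import json
import os
import uuid
import socketio

SPOOL_DIR = "spool"                           # persistent across reboots
SERVER_URL = "http://server2.example:5000"    # hypothetical peer

os.makedirs(SPOOL_DIR, exist_ok=True)
sio = socketio.Client()

def send(message: dict):
    # 1. Persist before sending.
    path = os.path.join(SPOOL_DIR, f"{uuid.uuid4()}.json")
    with open(path, "w") as f:
        json.dump(message, f)
    _emit(path, message)

def _emit(path, message):
    def on_ack(*args):
        # 2. Delete only once the receiver has acknowledged.
        if os.path.exists(path):
            os.remove(path)
    try:
        sio.emit("message", message, callback=on_ack)
    except Exception:
        # Connection currently down; the connect handler replays the spool.
        pass

@sio.event
def connect():
    # 3. On (re)connect, replay everything still sitting in the spool.
    for name in sorted(os.listdir(SPOOL_DIR)):
        path = os.path.join(SPOOL_DIR, name)
        with open(path) as f:
            _emit(path, json.load(f))

if __name__ == "__main__":
    sio.connect(SERVER_URL)
    send({"example": "payload"})
    sio.wait()
```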
I’ll probably just extend one of these three solutions, but before I start coding I thought I’d check whether anybody knows of a project like that.