I have a Kafka topic with 20 partitions and a Spring Cloud Stream based Java program that takes input messages and produces output messages to the configured Kafka topics.
At the moment I am running only one instance of the program, so the same input group is consuming from all 20 partitions.
To improve throughput, is it correct to run 4 instances of the program so that each consumes from 5 partitions, by configuring the properties below uniquely per instance?
spring.cloud.stream.bindings.input.group=my-consumer-group
spring.cloud.stream.bindings.input.consumer.concurrency=5
spring.cloud.stream.kafka.binder.consumer.client.id=my-client-id (unique per instance)
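
For illustration, a minimal sketch of what I have in mind for two of the four instances, reusing the property names above (my-consumer-group and my-client-id are just placeholders). The idea is that every instance keeps the same group so Kafka spreads the 20 partitions across them, and only client.id differs:

# instance 1 (hypothetical application.properties)
spring.cloud.stream.bindings.input.group=my-consumer-group
spring.cloud.stream.bindings.input.consumer.concurrency=5
spring.cloud.stream.kafka.binder.consumer.client.id=my-client-id-1

# instance 2
spring.cloud.stream.bindings.input.group=my-consumer-group
spring.cloud.stream.bindings.input.consumer.concurrency=5
spring.cloud.stream.kafka.binder.consumer.client.id=my-client-id-2

With concurrency=5 on each of the 4 instances, my expectation is 20 consumer threads in total, one per partition. Is this the right way to scale, or is there a better-suited set of properties for this?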