Related Content

Tag Archive for apache-flink, flink-streaming

Flink Job Not Consuming All Kafka Partitions with Checkpoint/Savepoint on 1.19.0 and 1.19.1

I’m encountering an issue where my Flink job, deployed on Flink clusters 1.19.0 and 1.19.1, fails to consume from all partitions of a Kafka topic when started from a checkpoint or savepoint. Although the Flink 1.19 documentation doesn’t list an official Kafka connector, I’m using one from Maven whose version suffix indicates it supports Flink 1.19.
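Since the Kafka connector was externalized from the main Flink repository, it is released on its own version line, with a suffix naming the supported Flink version. A hedged example of what the Maven dependency might look like (the exact version number is an assumption; check Maven Central for the latest release targeting your Flink version):

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka</artifactId>
  <!-- Connector releases are versioned separately from Flink core;
       the "-1.19" suffix indicates the supported Flink version.
       Version shown is an illustrative example, not a recommendation. -->
  <version>3.2.0-1.19</version>
</dependency>
```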

How to Persist Offset for Custom Source with Flink Datasource API?

I’m developing a custom source for Apache Flink 1.18 and need to ensure that the offset for each partition is correctly persisted and restored between pipeline restarts. I can’t store the offset in the data source system, so it needs to be correctly checkpointed within Flink. I’m confused about the separation between SourceSplit and SplitState.
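In the FLIP-27 Source API, the split is the immutable, serializable unit that gets checkpointed, while the split state is the mutable per-split object the reader advances as it emits records; on snapshot the state is converted back into a split. A minimal sketch of that round trip, with a local stand-in for Flink's `SourceSplit` interface so it compiles without Flink on the classpath (class and field names here are hypothetical):

```java
// Stand-in for org.apache.flink.api.connector.source.SourceSplit,
// included only so this sketch is self-contained.
interface SourceSplit {
    String splitId();
}

// Immutable split: this is what Flink serializes into a checkpoint.
class PartitionSplit implements SourceSplit {
    private final String partitionId;
    private final long startingOffset; // offset to resume from after restore

    PartitionSplit(String partitionId, long startingOffset) {
        this.partitionId = partitionId;
        this.startingOffset = startingOffset;
    }

    @Override
    public String splitId() { return partitionId; }

    long startingOffset() { return startingOffset; }
}

// Mutable split state: the reader updates this as records are emitted.
class PartitionSplitState {
    private final String partitionId;
    private long currentOffset;

    PartitionSplitState(PartitionSplit split) {
        this.partitionId = split.splitId();
        this.currentOffset = split.startingOffset();
    }

    void advanceTo(long offset) { this.currentOffset = offset; }

    // On checkpoint, the reader converts state back into an immutable split
    // (in SourceReaderBase this is the toSplitType hook).
    PartitionSplit toSplit() {
        return new PartitionSplit(partitionId, currentOffset);
    }
}
```

Because the checkpointed split carries the current offset, a restored reader rebuilds its state from the split and continues from where it left off, without the external system having to remember anything.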

How does the custom partition in flink work?

I’m currently working on a custom partitioner but I’m encountering some issues with my code. I’ve tried to find documentation on custom partitioners but haven’t been able to. If anyone can point me to it, I would be extremely grateful 🙂
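In the DataStream API, a custom partitioner is a class implementing the single-method `Partitioner` interface, which maps a key to a downstream channel index; you attach it with `stream.partitionCustom(partitioner, keySelector)`. A minimal sketch, using a local stand-in for Flink's `Partitioner` interface so it compiles without Flink on the classpath (the real interface in `org.apache.flink.api.common.functions` has the same shape):

```java
// Stand-in for org.apache.flink.api.common.functions.Partitioner,
// included only so this sketch is self-contained.
interface Partitioner<K> {
    int partition(K key, int numPartitions);
}

// Routes each key to a channel by hash. In a real job you would pass this
// to DataStream#partitionCustom together with a KeySelector, e.g.:
//   stream.partitionCustom(new HashPartitioner(), record -> record.key);
class HashPartitioner implements Partitioner<String> {
    @Override
    public int partition(String key, int numPartitions) {
        // Mask the sign bit rather than using Math.abs, which fails
        // for Integer.MIN_VALUE.
        return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }
}
```

The contract is simply that the returned index is in `[0, numPartitions)` and is deterministic for a given key, so all records with the same key land on the same downstream channel.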