I have a Kafka Connect JDBC sink connector with its own database:
{"connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector",
"connection.password":"***",
"connection.user":"postgres",
"topics":"post.public.test",
"name":"kc-sink",
"auto.create":"false",
"connection.url":"***"
"database.hostname":"postgresql-sink",
"auto.create: false"
}
And a Debezium PostgreSQL source connector:
{"connector.class":"io.debezium.connector.postgresql.PostgresConnector",
"database.dbname":"postgres",
"database.user":"postgres",
"topic.prefix":"post",
"database.hostname":"postgresql-source",
"database.password":"***",
"name":"kc-source"
}
Tested separately, each connector works fine: when I insert data into the source database, I can read the change event from the topic, and when I produce a record to a (different) topic, the sink writes the corresponding row to its database.
Now I would like to chain them: when I add a row to the source database (postgresql-source) and it lands in the topic, the sink should read that same event from the topic and write it into its own database (postgresql-sink).
Unfortunately, I get this error:
Caused by: io.confluent.connect.jdbc.sink.TableAlterOrCreateException: Cannot ALTER TABLE "postgresql"."public"."test" to add missing field SinkRecordField{schema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, name='source', isPrimaryKey=false}, as the field is not optional and does not have a default value
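For context, Debezium does not write the bare row to the topic but a change-event envelope, which is presumably what the sink is trying to map onto table columns (hence the complaint about the mandatory source struct). A sketch of the payload shape, where the column names and the version string are only illustrative:

{
  "before": null,
  "after": { "id": 1, "name": "hello" },
  "source": {
    "version": "2.1.0.Final",
    "connector": "postgresql",
    "name": "post",
    "db": "postgres",
    "schema": "public",
    "table": "test"
  },
  "op": "c",
  "ts_ms": 1700000000000
}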
The Kafka Connect worker converter config:
value.converter: org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable: true
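With schemas.enable set to true, the JsonConverter additionally wraps each message in a schema/payload pair, roughly like this (a minimal sketch of the shape with a single assumed id column, not the exact schema Debezium emits):

{
  "schema": {
    "type": "struct",
    "fields": [
      { "type": "int32", "optional": false, "field": "id" }
    ]
  },
  "payload": { "id": 1 }
}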
I created both databases (postgresql-source and postgresql-sink) before deploying Kafka Connect and the connectors.
In short: I want the JDBC sink to consume the records that the Debezium source connector wrote to the topic and load them into the postgresql-sink database.