Failed to deserialize data for topic (Docker)



Issue fixed.

Basically, the Kafka message was successfully persisted to the topic, but when my JDBC sink connector tried to parse it and copy it to our MySQL database, it was unable to connect to the schema registry URL.

Previous connector config:

{
  "name": "zuum-sink-positions",
  "key.converter.schema.registry.url": "http://localhost:8081",
  "value.converter.schema.registry.url": "http://localhost:8081",
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "key.converter.schemas.enable": "false",
  "value.converter.schemas.enable": "true",
  "config.action.reload": "restart",
  "errors.log.enable": "true",
  "errors.log.include.messages": "true",
  "print.key": "true",
  "errors.tolerance": "all",
  "topics": "zuum-positions",
  "connection.url": "jdbc:mysql://ip:3306/zuum_tracking",
  "connection.user": "user",
  "connection.password": "password",
  "auto.create": "true"
}

Updated config with the correct schema registry host:

{
  "name": "zuum-sink-positions",
  "key.converter.schema.registry.url": "http://schema-registry:8081",
  "value.converter.schema.registry.url": "http://schema-registry:8081",
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "key.converter.schemas.enable": "false",
  "value.converter.schemas.enable": "true",
  "config.action.reload": "restart",
  "errors.log.enable": "true",
  "errors.log.include.messages": "true",
  "print.key": "true",
  "errors.tolerance": "all",
  "topics": "zuum-positions",
  "connection.url": "jdbc:mysql://ip:3306/zuum_tracking",
  "connection.user": "user",
  "connection.password": "password",
  "auto.create": "true"
}
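To apply the corrected config to a connector that is already registered, you can use the Kafka Connect REST API rather than recreating the connector. A sketch, assuming the flat config above is saved as zuum-sink-positions.json and the Connect worker's REST interface is reachable on the default port 8083 (host is an assumption):

```shell
# Replace the running connector's configuration in place
# (PUT .../config takes the flat config map, as in the JSON above)
curl -X PUT -H "Content-Type: application/json" \
  --data @zuum-sink-positions.json \
  http://localhost:8083/connectors/zuum-sink-positions/config

# Then confirm the connector and its task are RUNNING again
curl http://localhost:8083/connectors/zuum-sink-positions/status
```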


If the issue had instead been with the key of the message, the alternative answer would be to change the Connect container's environment variables:

CONNECT_KEY_CONVERTER=io.confluent.connect.avro.AvroConverter
CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL=http://schema-registry:8081
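In a Docker Compose setup, those variables would sit in the Connect service's environment block. A minimal sketch, where the service name and image are placeholders (only the two converter variables matter here):

```yaml
# Hypothetical excerpt of a docker-compose.yml for the Connect worker
connect:
  image: confluentinc/cp-kafka-connect:latest
  environment:
    CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
    CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: http://schema-registry:8081
```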

Otherwise, if the issue was only with the value, then you only need to specify converter settings in the connector JSON when you want to override the defaults set in the Compose file / connect-distributed.properties. In other words, you could have removed the localhost values completely.
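For illustration, this is what the config could look like with all converter settings removed, so the sink falls back entirely to the worker defaults (a sketch, assuming the worker's defaults already point at the correct schema-registry host):

```json
{
  "name": "zuum-sink-positions",
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "config.action.reload": "restart",
  "errors.log.enable": "true",
  "errors.log.include.messages": "true",
  "errors.tolerance": "all",
  "topics": "zuum-positions",
  "connection.url": "jdbc:mysql://ip:3306/zuum_tracking",
  "connection.user": "user",
  "connection.password": "password",
  "auto.create": "true"
}
```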