Using JSON Schema and passing datetime fields to Kafka Connect

Has anyone any pointers on using JSON Schema and passing through date/datetime fields to Kafka Connect to build the columns as a datetime and not a string?

What is generating the JSON Schema based messages? Are you using the JDBC sink connector, and what's the destination DB type?

We have a custom Python producer building the schema from a description. This is an example datetime field…

The message is just standard JSON…
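For illustration, a plain-JSON message with a datetime serialized as a string might look like this (the field names and values here are hypothetical, not taken from the thread):

```json
{
  "id": "42",
  "date_added": "2021-06-03T16:20:47.818Z"
}
```

With a plain string type like this, the JDBC sink has no way to know the field is a timestamp, so it will create a text-type column unless the schema or an SMT says otherwise.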


What connector are you using?

JDBC sink going to a MySQL DB

Going to see what a source connector does with JSON Schema; maybe that will help point me in the right direction

Ok, many thanks for your help…

The JDBC source connector creates a schema like this:

```json
{
  "type" : "object",
  "title" : "outbox",
  "properties" : {
    "date_added" : {
      "connect.index" : 1,
      "oneOf" : [ {
        "type" : "null"
      }, {
        "type" : "integer",
        "title" : "",
        "connect.version" : 1,
        "connect.type" : "int64"
      } ]
    },
    "id" : {
      "type" : "string",
      "connect.index" : 0
    }
  }
}
```
Which looks like it uses a millisecond epoch (e.g. 1622737247818)
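You can sanity-check that reading by treating the value as milliseconds since the Unix epoch (a quick sketch, nothing Kafka-specific):

```python
from datetime import datetime, timezone

# The example value from the schema discussion above,
# interpreted as milliseconds since the Unix epoch.
millis = 1622737247818
dt = datetime.fromtimestamp(millis / 1000, tz=timezone.utc)
print(dt.isoformat())  # 2021-06-03T16:20:47.818000+00:00
```

Which lands on a plausible recent date, consistent with an int64 millisecond timestamp.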

If you don't want to encode Kafka-specific stuff in your JSON Schema, I think the next best option is an SMT; you should be able to use the TimestampConverter

SMT is probably the preferred method

Something like this:

```json
"transforms": "TimestampConverter",
"transforms.TimestampConverter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.TimestampConverter.format": "yyyy-MM-dd'T'HH:mm:ss.SSSz",
"transforms.TimestampConverter.target.type": "Timestamp",
"transforms.TimestampConverter.field": "SystemModstamp"
```
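Pulled together into a full sink config, it might look like this (connector name, connection URL, and topic are placeholders I've made up, not values from this thread):

```json
{
  "name": "mysql-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:mysql://localhost:3306/mydb",
    "topics": "outbox",
    "transforms": "TimestampConverter",
    "transforms.TimestampConverter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.TimestampConverter.format": "yyyy-MM-dd'T'HH:mm:ss.SSSz",
    "transforms.TimestampConverter.target.type": "Timestamp",
    "transforms.TimestampConverter.field": "SystemModstamp"
  }
}
```

Note that `target.type` set to `Timestamp` is what tells Connect to hand the sink a timestamp-typed field rather than a string.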

Thanks. Luckily all the tables have system fields, so I can use this on those, but I was hoping not to need a custom option for the non-system fields.