CLI for producing Avro messages to a Kafka topic

Hi all,
Are there any other CLI tools out there for producing Avro messages to a Kafka topic with schema-registry-based encoding, other than the Java-based kafka-avro-console-producer?
Maybe a few tools that can be pipelined together to achieve the same thing.

Not that I know of. I'm not sure exactly what you want to do, but you might be able to leverage https://crates.io/crates/schema_registry_converter.

I want to post to an Avro topic with schema registry encoding, without Java and without writing a program in Python / Node / Java.
Just a CLI that I can run from a shell script.

But how do you get the types in shell? It's easy enough to create something for a specific schema. You don't mention not writing Rust, but I guess that's off the table as well?

  1. What do you mean by "the types in shell"?
  2. I try to avoid Rust, as well as any real programming language, and want to do it in bash and without Java.

In fact, my main intention was to get feedback about existing CLI tools, because I can't find anything appropriate after the last few hours of searching.
Either I need to install a JRE and use the Confluent shell script wrappers, or write some script in Python or Node to produce schema-registry-aware Avro messages.

There are some standalone tools (kcat, kaf) that allow reading from such topics, but no tools to produce to them.
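
For the reading side, something like this should work with kcat, assuming it was built with Avro/Schema Registry (serdes) support; the broker address, topic name, and registry URL are just placeholders:

```bash
# Consume from a topic, decoding Avro-encoded values via the schema registry.
# Requires a kcat build with serdes support (the -s and -r flags).
kcat -b localhost:9092 -t orders -C \
  -s value=avro \
  -r http://localhost:8081
```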

I tend to write bash scripts. This is what I currently have; not ideal for the general use case, but it works well for me:

```
./avro_publisher.sh foo '{"CUSTOMER_ID":"1","ORDER_ID":"A"}' '{"price": { "string" : "Z"}, "qty": {"int": 3}}'
```


```bash
#!/bin/bash
# Usage: ./avro_publisher.sh <topic> <key-json> <value-json>

TOPIC=$1
shift

KEY=$1
shift

VALUE=$1
shift

key_schema=$(cat <<EOF
{
  "type": "record",
  "name": "order_key",
  "namespace": "dev.buesing",
  "fields": [
    {
      "name": "CUSTOMER_ID",
      "type": {
        "type": "string"
      }
    },
    {
      "name": "ORDER_ID",
      "type": {
        "type": "string"
      }
    }
  ]
}
EOF
)

value_schema=$(cat <<EOF
{
  "type": "record",
  "name": "order_value",
  "namespace": "dev.buesing",
  "fields": [
    {
      "name": "price",
      "type": [ "null", "string" ],
      "default" : null
    },
    {
      "name": "qty",
      "type": [ "null", "int" ],
      "default" : null
    }
  ]
}
EOF
)

echo "$KEY|$VALUE" |
kafka-avro-console-producer \
	--broker-list localhost:19092 \
	--property schema.registry.url="[http://localhost:8081](http://localhost:8081)" \
	--topic ${TOPIC} \
	--property value.schema="${value_schema}" \
	--property key.schema="${key_schema}" \
        --property parse.key=true \
        --property key.separator=\|```

The script uses the Confluent shell wrappers over Java classes, which is exactly what I'm trying to avoid, as I wrote several times in the thread.

Also, it seems you can avoid passing the whole schema text to the tool by passing just an ID from the schema registry, as described here:
https://stackoverflow.com/questions/59582230/use-kafka-avro-console-producer-with-a-schema-already-in-the-schema-registry
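
If you go that route, the schema text itself can also be pulled straight from the registry's REST API. A minimal sketch, assuming jq is installed and using a made-up schema ID:

```bash
# GET /schemas/ids/{id} is a standard Schema Registry REST endpoint;
# it returns the schema as an escaped JSON string under the "schema" key.
SCHEMA_ID=1   # hypothetical ID
value_schema=$(curl -s "http://localhost:8081/schemas/ids/${SCHEMA_ID}" | jq -r .schema)
```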

Yes, if you know the ID, you don't need to send the schema along, but you still need the schema to get the bytes right. And it's next to impossible to get the bytes right without using anything from https://github.com/apache/avro/tree/master/lang. So indeed, I think you need to pick your favorite from there and build something yourself, if you don't want to use the Java one.
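
To illustrate why: with the Confluent serializers, each message on the wire is a one-byte magic value (0x00), a 4-byte big-endian schema ID, and then the raw Avro binary encoding of the record. The framing is trivial in bash; it's the Avro binary body that needs a real library. A sketch, with a hypothetical schema ID and assuming a pre-encoded Avro body already exists in record.bin:

```bash
SCHEMA_ID=42   # hypothetical ID registered in the schema registry

printf '\x00' > /tmp/msg.bin                            # magic byte
printf '%08x' "$SCHEMA_ID" | xxd -r -p >> /tmp/msg.bin  # schema ID, 4 bytes big-endian
cat record.bin >> /tmp/msg.bin                          # Avro binary encoding (the hard part)
```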

  • Trying to avoid "shell wrappers over java classes": I thought you were trying to avoid writing your own Java. All of the Apache Kafka commands are Java classes with shell wrappers as well (not just Confluent's tools). Do you have a requirement where you need to run this without a JRE? kcat (kafkacat) is based on librdkafka, so there are possibilities of using that to publish Avro, but that gets quite tricky; see the sketch below.
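
For what it's worth, once you have a correctly framed wire-format message (as in the earlier sketch), kcat can publish it: in producer mode, each file named on the command line is sent as a single message. Broker and topic names here are placeholders:

```bash
# Produce the pre-framed binary message as one Kafka record.
kcat -b localhost:9092 -t orders -P /tmp/msg.bin
```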