Error "Bytes must be provided" on S3 sink

Hello everyone!
I’m trying to use the S3 Sink connector, and after several attempts I got it running. The problem is that shortly after it starts, I see this error in the logs:

Encountered error [requirement failed: Bytes must be provided]

My guess is that the connector encountered a tombstone (a record with a null value, typically used to delete a key from a compacted topic) and is not handling it gracefully.
Can anyone help me understand what is causing this?

Connector version: https://github.com/lensesio/stream-reactor/releases/download/7.3.2/kafka-connect-aws-s3-7.3.2.zip

Connector configuration:

connector.class=io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector
connect.s3.kcql=INSERT INTO aBucket:aPrefix SELECT * FROM `*` STOREAS BYTES PROPERTIES ('flush.count'=1)
topics=list,of,topics
tasks.max=5
connect.s3.seek.max.files=20
name=msk-s3-sink
connect.s3.aws.region=us-east-1
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
errors.log.enable=true
key.converter=org.apache.kafka.connect.converters.ByteArrayConverter

Hi mauro,

Indeed, there is a gap in the sink’s handling of tombstones. It will be addressed in the near future.
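
To confirm that tombstones are the cause, you can check one of the topics for null-valued records. A minimal sketch using kcat (the broker address and topic name are placeholders):

kcat -C -b localhost:9092 -t one-of-your-topics -Z -f 'offset=%o key=%k value=%s\n'

With -Z, null values are printed as NULL, so any tombstones are easy to spot.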

For now, the best option is to filter those records out via an SMT. Here is the configuration in JSON form (a properties-format version follows below):

"transforms"                          : "dropNullRecords",
"transforms.dropNullRecords.type"     : "org.apache.kafka.connect.transforms.Filter",
"transforms.dropNullRecords.predicate": "isNullRecord",

"predicates"                          : "isNullRecord",
"predicates.isNullRecord.type"        : "org.apache.kafka.connect.transforms.predicates.RecordIsTombstone"