Lenses S3 sink connector missing metadata information

Hi,

I am testing the Lenses S3 sink connector for backing up AWS MSK using AWS MSK Connect. I am able to back up the Kafka topics to S3, however the files in S3 are missing the metadata information. Here is my configuration:

connector.class=io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector
tasks.max=2
topics=sampletopic
connect.s3.aws.auth.mode=Default
schema.enable=false
security.protocol=SASL_SSL
connect.s3.kcql=INSERT INTO `test-msk-backup:backup` SELECT * FROM `sampletopic` STOREAS `JSON` WITH_FLUSH_COUNT = 1 WITH_FLUSH_SIZE = 10000 WITH_FLUSH_INTERVAL = 60 PROPERTIES('store.envelope'=true)
sasl.mechanism=AWS_MSK_IAM
sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
value.converter.schemas.enable=false
sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
connect.s3.aws.region=ap-southeast-1
value.converter=org.apache.kafka.connect.json.JsonConverter
errors.log.enable=true
key.converter=org.apache.kafka.connect.storage.StringConverter

I am using Stream Reactor 6.0.1. Please advise what is wrong with my configuration.

I have found that specifying the topic explicitly, as below, does output the metadata information:

connect.s3.kcql=INSERT INTO `test-msk-backup:backup` SELECT * FROM `sampletopic` STOREAS `JSON` WITH_FLUSH_INTERVAL = 60 PROPERTIES('store.envelope'=true)

However, if I use FROM `*`, it won't include the metadata. Is this a bug?

connect.s3.kcql=INSERT INTO `test-msk-backup:backup` SELECT * FROM `*` STOREAS `JSON` WITH_FLUSH_INTERVAL = 60 PROPERTIES('store.envelope'=true)

Hi Rudy,

Indeed, as per the docs, the ‘store.envelope’ property backs up the entire Kafka message, not just the value.
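For context, an envelope record written to S3 wraps the key, value, headers, and metadata in a single JSON object, roughly like this (a sketch; the field values below are illustrative, not real output):

```json
{
  "key": "some-key",
  "value": { "field1": "..." },
  "headers": { "header1": "..." },
  "metadata": {
    "timestamp": 1693126789000,
    "topic": "sampletopic",
    "partition": 0,
    "offset": 42
  }
}
```

The `metadata` block is what goes missing when the envelope is not applied.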

When it comes to storing in envelope mode, at the moment FROM `*`, as in

connect.s3.kcql=INSERT INTO `test-msk-backup:backup` SELECT * FROM `*` STOREAS `JSON` WITH_FLUSH_INTERVAL = 60 PROPERTIES('store.envelope'=true)

will default to non-envelope output. Envelope support for FROM `*` is a feature we haven’t yet added.
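As a workaround until then, you can list each topic explicitly. KCQL accepts multiple semicolon-separated statements in a single `connect.s3.kcql`, so a sketch for two topics (topic names here are illustrative) would be:

```properties
topics=sampletopic,othertopic
connect.s3.kcql=INSERT INTO `test-msk-backup:backup` SELECT * FROM `sampletopic` STOREAS `JSON` WITH_FLUSH_INTERVAL = 60 PROPERTIES('store.envelope'=true); INSERT INTO `test-msk-backup:backup` SELECT * FROM `othertopic` STOREAS `JSON` WITH_FLUSH_INTERVAL = 60 PROPERTIES('store.envelope'=true)
```

Each statement names its topic, so the envelope is applied to every listed topic.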

Stefan