Location of stored data in S3 with the Lenses connector

My sink connector started successfully with the configuration below. The SinkRecordSendRate and SinkRecordReadRate metrics showed data flowing through the connector to the S3 destination, but when I checked S3 I did not see the data. I expected to see a folder named after the topic being backed up (see the layout sketch after the config).

connector.class = io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector
s3.region = us-east-1
key.converter.schemas.enable = false
connect.s3.kcql = INSERT INTO $BUCKET_NAME SELECT * FROM lensestestTopic STOREAS JSON
connect.s3.aws.region = us-east-1
schema.compatibility = NONE
tasks.max = 2
topics = lensestestTopic
schema.enabled = false
errors.log.enable = true
errors.tolerance = all
errors.log.include.messages = true
config.action.reload = restart
errors.deadletterqueue.topic.name = dlq_file_sink
errors.deadletterqueue.topic.replication.factor = 1
value.converter = org.apache.kafka.connect.storage.StringConverter
key.converter = org.apache.kafka.connect.storage.StringConverter
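
From my reading of the docs, the sink should write objects under a key of the form <prefix>/<topic>/<partition>/<offset>.json, so with this config I expected a layout roughly like the following (bucket name is a placeholder and the exact file naming may differ):

    my-backup-bucket/
    └── lensestestTopic/
        └── <partition>/
            └── <offset>.json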

The service execution role had these permissions.

    {
        "Action": [
            "s3:ListBucket",
            "s3:GetBucketLocation",
            "s3:DeleteObject",
            "s3:PutObject",
            "s3:GetObject",
            "s3:AbortMultipartUpload",
            "s3:ListMultipartUploadParts",
            "s3:ListBucketMultipartUploads"
        ],
        "Effect": "Allow",
        "Resource": [
            "arn:aws:s3:::*",
            "arn:aws:s3:::*/*"
        ]
    }
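(The Resource here is wildcarded across all buckets. A version scoped to just the backup bucket would look like the sketch below, with my-backup-bucket as a placeholder for the real name.)

    {
        "Action": [
            "s3:ListBucket",
            "s3:GetBucketLocation",
            "s3:DeleteObject",
            "s3:PutObject",
            "s3:GetObject",
            "s3:AbortMultipartUpload",
            "s3:ListMultipartUploadParts",
            "s3:ListBucketMultipartUploads"
        ],
        "Effect": "Allow",
        "Resource": [
            "arn:aws:s3:::my-backup-bucket",
            "arn:aws:s3:::my-backup-bucket/*"
        ]
    }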
I did see the three .indexes objects in S3 for the connector, but that was all.

Is additional configuration required for the data to be placed in the folder mentioned above?

Where did the data get stored in S3? There is no sign of any new data being written to the bucket.
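
For anyone debugging the same thing: listing the bucket recursively shows every key the connector has written, including the hidden .indexes entries (bucket name below is a placeholder):

    aws s3 ls s3://my-backup-bucket/ --recursive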

Thanks in advance.

I checked the S3 bucket today and saw the data. I am not sure why it took so long for the results to appear, since there were only 260 lines of text in that particular topic.
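
Following up for anyone who hits the same delay: I believe this was the connector's flush behaviour. The sink buffers records and only writes an object to S3 once a flush threshold (size in bytes or a time interval, I believe in seconds) is reached, and the defaults are fairly large, so a small topic can sit in the buffer for a long while. For testing, the thresholds can be lowered in the KCQL; a sketch with hypothetical bucket and prefix names:

connect.s3.kcql = INSERT INTO my-backup-bucket:backups SELECT * FROM lensestestTopic STOREAS JSON WITH_FLUSH_SIZE = 1000 WITH_FLUSH_INTERVAL = 60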

Hello, your config is incorrect, which is why you are seeing this behavior. Please refer to Kafka to AWS S3 open source connector | Lenses.io Documentation for examples.
For example, the target bucket in your KCQL should be your actual bucket name, not the literal placeholder $BUCKET_NAME. Another example: you need to configure the converters correctly, since you have text data to convert to JSON, etc.
I hope this helps.

A working config example from my lab is:

connector.class=io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector
connect.s3.kcql=INSERT INTO mybucket:myprefix SELECT * FROM mytopic STOREAS JSON WITH_FLUSH_SIZE = 1000000 WITH_FLUSH_INTERVAL = 3600 PROPERTIES ('store.envelope' = true)
topics=mytopic
tasks.max=1
name=demo_sink_s3_toJSON
value.converter=org.apache.kafka.connect.storage.StringConverter
key.converter=org.apache.kafka.connect.storage.StringConverter
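
One note on the PROPERTIES clause: with 'store.envelope' = true, the sink stores each record wrapped in an envelope containing the key, value, headers and Kafka metadata, instead of just the raw value. From memory of the Lenses docs, the stored JSON looks roughly like this (values hypothetical; double-check the documentation for the exact field layout):

    {
      "key": "some-key",
      "value": "some-value",
      "headers": {},
      "metadata": {
        "topic": "mytopic",
        "partition": 0,
        "offset": 42,
        "timestamp": 1700000000000
      }
    }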