Tolerance exceeded in error / Invalid schema type for ByteArrayConverter error with S3 Connector for restoring a topic

When configuring a topic restore with Lenses 5.3 and the Lenses S3 Source Connector, the connector fails with the following error:

org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:223)
	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:149)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.convertTransformedRecord(AbstractWorkerSourceTask.java:474)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.sendRecords(AbstractWorkerSourceTask.java:387)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:354)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:189)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:244)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:72)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.kafka.connect.errors.DataException: Invalid schema type for ByteArrayConverter: STRUCT
	at org.apache.kafka.connect.converters.ByteArrayConverter.fromConnectData(ByteArrayConverter.java:55)
	at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:64)
	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.lambda$convertTransformedRecord$5(AbstractWorkerSourceTask.java:474)
	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:173)
	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:207)
	... 12 more

The connector config is set to:

connector.class=io.lenses.streamreactor.connect.aws.s3.source.S3SourceConnector
connect.s3.kcql=INSERT INTO `adamTest2` SELECT * FROM `shippositions:adamTest` STOREAS `AVRO` PROPERTIES ('store.envelope' = true)
tasks.max=1
name=adamTest2-R-2023-11-09-22-02-47-783
connect.s3.partition.search.continuous=true
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
key.converter=org.apache.kafka.connect.converters.ByteArrayConverter

The root cause is the converter: with `store.envelope = true` the source connector produces structured (STRUCT) records, which the ByteArrayConverter cannot serialise, hence the "Invalid schema type for ByteArrayConverter: STRUCT" exception. If you're restoring to AVRO in your restore topic, make sure you update the key/value schema for the topic to AVRO before you configure the restore:

[Screenshot: updating the topic's key/value schema to AVRO in Lenses]

You should then be able to update your connector configuration to remove the ByteArrayConverter entries:

connector.class=io.lenses.streamreactor.connect.aws.s3.source.S3SourceConnector
connect.s3.kcql=INSERT INTO `adamTest2` SELECT * FROM `shippositions:adamTest` STOREAS `AVRO` PROPERTIES ('store.envelope' = true)
tasks.max=1
name=adamTest2-R-2023-11-09-22-02-47-783
connect.s3.partition.search.continuous=true
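With the converter lines removed, the Connect worker's default converters are used, so make sure those defaults can handle structured data. As a sketch of an alternative, you could instead set the converters explicitly to Avro in the connector config; the schema registry URL below is an assumption and should be replaced with your own endpoint:

```properties
# Sketch: explicitly use the Avro converter instead of relying on worker defaults.
# The schema registry URL is a placeholder — substitute your own.
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry:8081
```

Either approach works as long as the converter in effect can serialise STRUCT records; the ByteArrayConverter only accepts raw bytes.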