Connect Header Conversion Error

Hello - I started rolling out the Lenses S3 sink (4.2.0) more widely across our cluster today and have hit the following error:

Caused by: org.apache.kafka.connect.errors.ConnectException: org.apache.kafka.connect.errors.ConnectException: Unsupported record 5.920217488E+7826:java.math.BigDecimal
  at com.datamountaineer.streamreactor.common.errors.ThrowErrorPolicy.handle(ErrorPolicy.scala:61)
  at com.datamountaineer.streamreactor.common.errors.ErrorHandler.handleError(ErrorHandler.scala:84)
  at com.datamountaineer.streamreactor.common.errors.ErrorHandler.handleTry(ErrorHandler.scala:63)
  at com.datamountaineer.streamreactor.common.errors.ErrorHandler.handleTry$(ErrorHandler.scala:44)
  at io.lenses.streamreactor.connect.aws.s3.sink.S3SinkTask.handleTry(S3SinkTask.scala:46)
  at io.lenses.streamreactor.connect.aws.s3.sink.S3SinkTask.put(S3SinkTask.scala:134)
  at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:601)
  ... 10 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Unsupported record 5.920217488E+7826:java.math.BigDecimal
  at io.lenses.streamreactor.connect.aws.s3.sink.conversion.ValueToSinkDataConverter$.apply(ValueToSinkDataConverter.scala:46)
  at io.lenses.streamreactor.connect.aws.s3.sink.conversion.HeaderToStringConverter$.headerValueToString(HeaderToStringConverter.scala:31)
  at io.lenses.streamreactor.connect.aws.s3.sink.conversion.HeaderToStringConverter$.$anonfun$apply$1(HeaderToStringConverter.scala:27)
  at scala.collection.StrictOptimizedIterableOps.map(StrictOptimizedIterableOps.scala:100)
  at scala.collection.StrictOptimizedIterableOps.map$(StrictOptimizedIterableOps.scala:87)
  at scala.collection.convert.JavaCollectionWrappers$JIterableWrapper.map(JavaCollectionWrappers.scala:62)
  at io.lenses.streamreactor.connect.aws.s3.sink.conversion.HeaderToStringConverter$.apply(HeaderToStringConverter.scala:26)
  at io.lenses.streamreactor.connect.aws.s3.sink.S3SinkTask.$anonfun$put$4(S3SinkTask.scala:153)
  at io.lenses.streamreactor.connect.aws.s3.sink.S3SinkTask.$anonfun$put$4$adapted(S3SinkTask.scala:144)
  at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:575)
  at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:573)
  at scala.collection.AbstractIterable.foreach(Iterable.scala:933)
  at io.lenses.streamreactor.connect.aws.s3.sink.S3SinkTask.$anonfun$put$1(S3SinkTask.scala:144)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
  at scala.util.Try$.apply(Try.scala:210)

We have an SMT that adds the header field to the message body itself when it is stored, so we don't need Lenses to convert the headers at all. With that in mind, I tried setting store.envelope.fields.headers: false, but the error still occurs.
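For reference, a sketch of the connector config (bucket, prefix, and topic names are placeholders; I'm showing the property in the KCQL PROPERTIES clause, which is one place it can go):

  connector.class=io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector
  topics=my-topic
  # envelope enabled, but header storage disabled
  connect.s3.kcql=INSERT INTO my-bucket:my-prefix SELECT * FROM my-topic STOREAS `AVRO` PROPERTIES('store.envelope'=true, 'store.envelope.fields.headers'=false)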
Any insight into why it’s happening or what I could do to fix/avoid it would be appreciated 🙂

Hi James,

This connector doesn’t currently support these extended logical types. We have added it to our plan and will update you when it has been implemented.
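In the meantime, since your SMT already copies the header into the message body, one possible workaround is to drop the offending header before it reaches the sink, for example with Kafka’s built-in DropHeaders transform (available in Apache Kafka 2.7+; the header name below is a placeholder):

  transforms=dropDecimalHeader
  transforms.dropDecimalHeader.type=org.apache.kafka.connect.transforms.DropHeaders
  # placeholder: list the header(s) that carry BigDecimal values
  transforms.dropDecimalHeader.headers=my-decimal-header

If the SMT that copies the header into the body runs in the same connector, list it before dropDecimalHeader in transforms so the value is copied before the header is removed.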

Hi @James_F,

Since November, we have added BigDecimal support to the S3 sink connector.
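For illustration, a header like the one in your stack trace carries Connect’s Decimal logical type (a schema plus a java.math.BigDecimal value), which the sink can now convert. A minimal sketch of attaching one (the object and header names are hypothetical):

  import java.math.BigDecimal
  import org.apache.kafka.connect.header.ConnectHeaders

  object DecimalHeaderExample extends App {
    // Attach a header of the same shape that previously failed conversion:
    // Connect models it with the Decimal logical type (schema + BigDecimal).
    val headers = new ConnectHeaders()
    headers.addDecimal("amount", new BigDecimal("5.920217488E+7826"))

    // Print each header's key, value, and schema.
    headers.forEach { h =>
      println(s"${h.key} -> ${h.value} (schema: ${h.schema})")
    }
  }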

Hope you find this useful!