AWS MSK Connector Question

Is there a way to restart a running S3 sink connector so it captures any changes made to the topic and saves them to S3?

If not, is a new connector required to capture the data changes in the topic?

Hi cwoodruff,

Can you please explain your scenario? A Kafka topic is immutable: once data is written, it is not overwritten. So as long as the connector is running, it will consume the data from the oldest stored record to the newest.

Best,
Stefan

@stheppi I created a sink connector to back up a topic to S3 and left the connector running. If new data is sent to the topic after the initial backup to S3, will it also be backed up to S3, or does another sink connector have to be created?

A Kafka Connect sink connector functions as a consumer of the Kafka topic: as long as it remains running, it keeps polling the topic and writes any new records to Amazon S3. You do not need to restart it or create a new connector; the existing connector will automatically pick up data produced after the initial backup, so nothing is lost or missed and no manual intervention is required.
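
If you are running your own Kafka Connect workers (rather than the fully managed MSK Connect, which does not expose the Connect REST API directly), you can confirm the connector is still active via the Connect REST API. A minimal sketch is below; the worker URL and connector name are placeholders for your own setup:

```python
import requests

# Placeholder values: substitute your Connect worker URL and connector name.
CONNECT_URL = "http://localhost:8083"
CONNECTOR_NAME = "s3-sink-backup"

# The Kafka Connect REST API reports the state of the connector and its tasks.
status = requests.get(f"{CONNECT_URL}/connectors/{CONNECTOR_NAME}/status").json()

print(status["connector"]["state"])       # expected: RUNNING
for task in status["tasks"]:
    print(task["id"], task["state"])      # each task should also be RUNNING
```

Behind the scenes, a sink connector's progress is tracked as consumer group offsets (by default the group is named connect-<connector-name>), so even if the connector or its tasks are restarted, it resumes from the last committed offset rather than re-reading or skipping data.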