
Event Streaming

Join our forum to discuss your ideas with the Aiven community, or check out our public roadmap.


94 results found

  1. As a Kafka Administrator
    I want the ability to scale a Kafka cluster without an upgrade, since the upgrade increases how long scaling takes.
    In production this can become critical and lead to downtime. For example, the last two scaling operations took 12 hours each.

    1 vote

  2. Our customer Jago wants to monitor the status of connectors and tasks but currently can't find relevant metrics to do so. They want to be able to monitor the status of connectors and tasks on a dashboard and also get notified whenever a connector has not been running for X minutes.

Jago has a connector running but cannot find the metrics for kafka.connect:type=connector-metrics,connector=*.

    The specific metric they are looking for is the one related to the status of a connector. For example, in the customer's current self-managed Kafka Connect deployment, they have the following metrics. This is convenient because they…

    1 vote
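
    One way to approximate the requested alerting today is to poll the Kafka Connect REST API, which does expose connector and task state even though the JMX connector-metrics are missing. A minimal sketch, assuming a reachable Connect REST endpoint; the URL, credentials handling, connector name, and threshold are placeholders:

    ```python
    import time
    import requests

    CONNECT_URL = "https://my-connect.example.com"   # placeholder; add auth as your setup requires
    CONNECTOR = "my-connector"                        # placeholder connector name
    ALERT_AFTER_SECONDS = 300                         # the "X minutes" from the idea

    not_running_since = None
    while True:
        # GET /connectors/{name}/status returns the connector state and per-task states.
        status = requests.get(f"{CONNECT_URL}/connectors/{CONNECTOR}/status").json()
        states = [status["connector"]["state"]] + [t["state"] for t in status.get("tasks", [])]

        if all(s == "RUNNING" for s in states):
            not_running_since = None
        else:
            not_running_since = not_running_since or time.time()
            if time.time() - not_running_since >= ALERT_AFTER_SECONDS:
                print(f"ALERT: {CONNECTOR} not fully RUNNING, states={states}")  # wire up paging/Slack here

        time.sleep(30)
    ```

    Native kafka.connect:type=connector-metrics MBeans, as requested, would let this live in the existing metrics and dashboard pipeline instead of a side script.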

  3. As a developer who uses Aiven's S3 sink connector,
    I want to be able to set offset.flush.interval.ms only for my specific connector from the connector's configuration
    so that I can avoid configuring it at the cluster level (for all connectors).

    1 vote
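
    For illustration only: offset.flush.interval.ms is currently a Connect worker (cluster-level) setting, so the key shown inside the connector config below is hypothetical and simply sketches what the requested per-connector override could look like. Endpoint, names, and values are placeholders:

    ```python
    import requests

    s3_sink_config = {
        "connector.class": "io.aiven.kafka.connect.s3.AivenKafkaConnectS3SinkConnector",
        "topics": "my-topic",
        # Hypothetical per-connector override requested by this idea;
        # today this property only exists at the worker level.
        "offset.flush.interval.ms": "10000",
    }

    # PUT /connectors/{name}/config is the standard Connect REST call for (re)configuring a connector.
    requests.put("https://my-connect.example.com/connectors/my-s3-sink/config", json=s3_sink_config)
    ```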

  4. As a non-technical user
    I want to be able to aggregate and join different streams of data
    without the need for developers

    1 vote

    Shelved  ·  Jonah Kowall responded

    This function will largely be replaced by ChatGPT or other LLMs which can generate clear code and instructions, making a visual builder unnecessary. 

  5. As a developer
    I want to know when a connector is paused or resumed
    so that I have timestamps and know if anybody is doing something they are not supposed to do.

    1 vote
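
    A rough workaround sketch: poll the Connect REST status endpoint and record timestamped state transitions (RUNNING/PAUSED). This shows when the state changed but not who changed it, which is why a platform-side audit event is the actual ask. The URL and connector name are placeholders:

    ```python
    import time
    import requests

    CONNECT_URL = "https://my-connect.example.com"   # placeholder; add auth as your setup requires
    CONNECTOR = "my-connector"                        # placeholder connector name

    last_state = None
    while True:
        state = requests.get(f"{CONNECT_URL}/connectors/{CONNECTOR}/status").json()["connector"]["state"]
        if state != last_state:
            # Log every transition, e.g. RUNNING -> PAUSED, with a timestamp.
            print(f"{time.strftime('%Y-%m-%dT%H:%M:%S%z')} {CONNECTOR}: {last_state} -> {state}")
            last_state = state
        time.sleep(15)
    ```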

  6. As an application developer
    I want to compress my Kafka messages, but be able to decompress them using a transform before sinking them into a destination
    so that I can save on storage costs.
    In addition, I'd like to use ZSTD, but more common libraries might be enough.

Note: Confluent has something similar:
    https://docs.confluent.io/platform/current/connect/transforms/gzipdecompress.html

    1 vote
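
    For context, a minimal sketch of the producer side under the assumption that the payload is compressed in the application with zstd; the broker, topic, and library choice are illustrative. The missing piece this idea asks for is a matching decompression transform on the sink connector, analogous to the linked GzipDecompress SMT:

    ```python
    import zstandard as zstd            # pip install zstandard
    from confluent_kafka import Producer

    producer = Producer({"bootstrap.servers": "kafka.example.com:9092"})  # placeholder broker

    payload = b'{"example": "payload"}'
    compressed = zstd.ZstdCompressor().compress(payload)   # application-level ZSTD compression

    # A sink-side "ZstdDecompress"-style SMT (the subject of this idea) would restore
    # the original bytes before the connector writes them to the destination.
    producer.produce("my-topic", value=compressed)
    producer.flush()
    ```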

  7. As a developer I want to be able to read data from and write data to my S3 object storage, in order to easily integrate Flink into my existing data architecture. Using Flink to read data from S3, transform it, and then write it to another S3 location allows easy consolidation and data quality management in a common reference data architecture.

    1 vote
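
    A minimal sketch of the requested pattern using Flink's filesystem connector via PyFlink, assuming the S3 filesystem plugin and credentials are already configured on the cluster; bucket paths, schema, and the filter are placeholders:

    ```python
    from pyflink.table import EnvironmentSettings, TableEnvironment

    t_env = TableEnvironment.create(EnvironmentSettings.in_batch_mode())

    # Source table backed by one S3 prefix.
    t_env.execute_sql("""
        CREATE TABLE raw_events (
            id STRING,
            amount DOUBLE
        ) WITH (
            'connector' = 'filesystem',
            'path' = 's3://my-bucket/raw/',
            'format' = 'json'
        )
    """)

    # Sink table backed by another S3 prefix.
    t_env.execute_sql("""
        CREATE TABLE curated_events (
            id STRING,
            amount DOUBLE
        ) WITH (
            'connector' = 'filesystem',
            'path' = 's3://my-bucket/curated/',
            'format' = 'json'
        )
    """)

    # Read from S3, transform, and write back to another S3 location.
    t_env.execute_sql("""
        INSERT INTO curated_events
        SELECT id, amount FROM raw_events WHERE amount > 0
    """).wait()
    ```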

  8. As a Cloud platform engineer, I need an automated way to set up and update quota configurations on a cluster, taking into account changes in resource consumption patterns amongst producers and consumers.

    1 vote
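
    One possible shape for such automation, sketched by wrapping the stock kafka-configs.sh tool; the bootstrap server, credentials file, client id, and byte rates are placeholders and would come from whatever system observes consumption patterns:

    ```python
    import subprocess

    def set_client_quota(client_id: str, producer_byte_rate: int, consumer_byte_rate: int) -> None:
        """Apply per-client produce/consume quotas with Kafka's own CLI tooling."""
        subprocess.run(
            [
                "kafka-configs.sh",
                "--bootstrap-server", "kafka.example.com:9092",   # placeholder broker
                "--command-config", "admin.properties",            # placeholder client credentials
                "--alter",
                "--entity-type", "clients",
                "--entity-name", client_id,
                "--add-config",
                f"producer_byte_rate={producer_byte_rate},consumer_byte_rate={consumer_byte_rate}",
            ],
            check=True,
        )

    # Example: tighten a noisy producer's quota based on observed usage.
    set_client_quota("billing-service", producer_byte_rate=1_048_576, consumer_byte_rate=2_097_152)
    ```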

  9. TL;DR:
    As an Aiven customer
    I want to be able to encrypt any form of sensitive data (PII or PCI) so that I can manage sensitive data in a legally compliant and privacy-respecting manner.

    Detailed description of the proposal:
    Hi Aiven!
    Hope all is well with you. I have a feature/service suggestion which I believe will make your existing product portfolio even stronger!

    It’s a thing I call “Aiven for Privacy FTW!” and it’s basically a standalone “field-level encryption” service used for managing PII and PCI fields/properties in the event payload in a legally (e.g. GDPR and/or CCPA)…

    1 vote

1 comment  ·  Other
    Shelved  ·  Jonah Kowall responded

I will be closing this out, but the idea is valid. We are looking at building a proxy service for Kafka, and this could be part of the roadmap for that component, as you have described. We suggest that those who want to use encryption on Kafka do this on the producer and consumer sides, so that the data is encrypted end to end.
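
    In line with the suggestion above to encrypt on the producer and consumer sides, a minimal sketch of producer-side field-level encryption; the key handling, broker, topic, and field names are placeholders, and a real deployment would source keys from a KMS or secret store:

    ```python
    import json
    from cryptography.fernet import Fernet   # pip install cryptography
    from confluent_kafka import Producer

    key = Fernet.generate_key()               # placeholder: fetch from a KMS/secret store in practice
    fernet = Fernet(key)

    producer = Producer({"bootstrap.servers": "kafka.example.com:9092"})  # placeholder broker

    event = {"order_id": "123", "card_number": "4111111111111111"}
    # Encrypt only the sensitive (PCI) field before the event leaves the application.
    event["card_number"] = fernet.encrypt(event["card_number"].encode()).decode()

    producer.produce("orders", value=json.dumps(event).encode())
    producer.flush()

    # Consumers holding the key call fernet.decrypt() on the field to recover the value.
    ```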

  10. It would be great if the io.debezium.transforms.partitions.PartitionRouting SMT were available for use when configuring a Kafka Debezium source connector on the Aiven platform.

    1 vote
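
    A sketch of how the SMT would be used if it were available on the platform; the connector class, hostname, field values, and endpoint are placeholders, and the transform option names should be checked against the Debezium documentation for the version in use:

    ```python
    import requests

    debezium_config = {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",  # placeholder source
        "database.hostname": "db.example.com",                                     # placeholder
        "transforms": "partitionRouting",
        "transforms.partitionRouting.type": "io.debezium.transforms.partitions.PartitionRouting",
        # Option names as per the Debezium docs (verify for your version):
        "transforms.partitionRouting.partition.payload.fields": "change.name",
        "transforms.partitionRouting.partition.topic.num": "4",
    }

    requests.put(
        "https://my-connect.example.com/connectors/my-debezium-source/config",  # placeholder endpoint
        json=debezium_config,
    )
    ```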

  11. As a DevOps developer
    I want to change the replication policy separator
    so that I can use our existing topic separators without conflicting with MM2.
    In addition, I find it extremely inconvenient to have to change our existing schema because we cannot configure the replication.policy.separator in MM2.

    1 vote


We can see this as a useful configuration option to expose. At the same time, it will need to gather a bit more interest before it is taken into development.

    Thanks!
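
    For reference, the property already exists in stock MirrorMaker 2: with DefaultReplicationPolicy, remote topics are named <source-alias><separator><topic>, and the separator is configurable. A sketch of the setting as it appears on MirrorSourceConnector, with placeholder alias and separator; the ask here is simply to expose it in Aiven's MM2 configuration:

    ```python
    # Sketch of the stock MirrorMaker 2 / MirrorSourceConnector setting this idea asks Aiven to expose.
    mirror_source_config = {
        "connector.class": "org.apache.kafka.connect.mirror.MirrorSourceConnector",
        "source.cluster.alias": "source",      # placeholder alias
        "replication.policy.separator": "_",   # remote topic "source_orders" instead of "source.orders"
    }
    ```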

  12. As a developer
    I want to be able to publish schemaless JSON messages to BigQuery and have the BigQuery table schema be updated to reflect changes in the message structure
    so that I can evolve my message schema without breaking my pipeline.

    1 vote

  13. As a data engineer,
    I want to write data to Rockset,
    so that I can run my analytical workloads.

    1 vote

  14. As a developer
    I want to use the Apache NiFi Kafka Connector
    so that I can move data in and out of Apache NiFi to orchestrate data flow.

    1 vote
