
Event Streaming

  • Aiven for Apache Kafka® - Apache Kafka as a fully managed service, deployed in the cloud of your choice, with a full set of capabilities to build your streaming data pipelines. Find more info in our documentation or developer center.
  • Aiven for Apache Kafka® Connect - Seamlessly transport data with Kafka and integrate with external systems using Kafka Connect.
  • Aiven for Apache Kafka® MirrorMaker 2 - Replicate data with MirrorMaker 2.
  • Karapace - Easily manage schemas with Karapace.
Join our forum to discuss your ideas with the Aiven community, or check out our public roadmap.

55 results found

  1. As a developer / SRE,
    I want my Apache Kafka cluster to run across multiple regions and potentially even multiple clouds,
    so that I can ensure a highly available setup with near-zero RTO and RPO in case of disaster recovery, i.e. a network, region, or provider failure.
    In addition, I want to benefit from the geo-distributed cluster setup and read/write from/to the geographically closest broker to optimise my network latency and cost.

    56 votes
  2. As a developer,
    I want to be able to upload my custom code (JARs),
    so that I can use it to build sophisticated or non-standard use cases in Flink and Kafka Connect.
    In addition, I can keep my custom implementation private.

    52 votes
  3. As a developer,
    I want to programmatically manage my consumers and consumer groups,
    so that I can see their status, perform CRUD operations, list group members, reset offsets, and similar.
    In addition, I want to be able to do this in the Aiven Console as well.

    26 votes
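Until such a management API is exposed, consumer-group health is typically approximated client-side from end offsets and committed offsets. A minimal sketch in Python; the function and the offset maps are illustrative, not an Aiven API:

```python
def consumer_group_lag(end_offsets, committed_offsets):
    """Compute per-partition lag for a consumer group.

    Both arguments map (topic, partition) -> offset, e.g. as returned by a
    Kafka client's end_offsets() and committed() calls. A partition with no
    committed offset is treated as fully unconsumed.
    """
    lag = {}
    for tp, end in end_offsets.items():
        committed = committed_offsets.get(tp, 0)
        lag[tp] = max(0, end - committed)
    return lag

ends = {("orders", 0): 120, ("orders", 1): 80}
commits = {("orders", 0): 100, ("orders", 1): 80}
print(consumer_group_lag(ends, commits))  # {('orders', 0): 20, ('orders', 1): 0}
```

The same per-partition lag map is what a console view or a reset-offsets tool would be built on.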
  4. As a developer,
    I want to make sure the schema is validated not just on the client but also on the broker side,
    so that I can make sure all messages in a topic conform to the same schema and do not contain mixed schemas.

    22 votes
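Until broker-side validation exists, a client-side pre-publish check is the usual stopgap. A toy sketch, assuming a simple field-name-to-type schema; a real pipeline would validate against the Avro/JSON schema registered in Karapace:

```python
def conforms(record, schema):
    """Toy structural check: every field in `schema` must be present in
    `record` with the expected Python type. Extra fields are tolerated."""
    return all(
        name in record and isinstance(record[name], expected)
        for name, expected in schema.items()
    )

# Hypothetical schema for illustration only.
ORDER_SCHEMA = {"order_id": int, "customer": str}

assert conforms({"order_id": 1, "customer": "acme"}, ORDER_SCHEMA)
assert not conforms({"order_id": "1", "customer": "acme"}, ORDER_SCHEMA)
```

A producer would call such a check (or a proper schema-registry serializer) before sending, rejecting records the broker would otherwise accept blindly.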
  5. As a developer,
    I want to use a public CA (e.g. Let's Encrypt),
    so that I can connect to my Aiven for Apache Kafka service without installing any additional certificates.
    In addition, I can trust a public authority issuing certificates instead of validating 3rd-party certificates.

    17 votes
  6. As a developer,
    I want to temporarily disable a service user,
    so that I can test whether the service user is still in use by any of my applications, or temporarily ban abusive clients.

    12 votes
  7. As a developer / SRE,
    I want to know when service users were last used,
    so that I can tell whether a service user is still in use or has been abandoned and can be removed.

    12 votes
  8. As a platform engineer,
    I want to group multiple users based on their role (OAuth2/OIDC claim),
    so that I can reduce the number of Kafka users and ACL entries that need to be managed.

    Currently, every user / identity connecting via OAuth2/OIDC has a 1:1 mapping to a Kafka user (the username is taken from the sub claim). This is cumbersome and leads to significant overhead if, for example, multiple identities / users with the same permissions want to access the Kafka service. Kafka users and ACLs need to be created for every single identity, even though they share…

    11 votes
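The requested grouping boils down to a mapping from token claims to a shared principal instead of the per-identity sub claim. A sketch; the "roles" claim name and the mapping table are assumptions, not existing Aiven behaviour:

```python
def principal_for(token_claims, role_to_principal):
    """Map an OIDC token to a shared Kafka principal based on a role claim,
    instead of a 1:1 mapping from the `sub` claim.

    role_to_principal is operator-defined, e.g. {"kafka-writer": "writers"}.
    Falls back to the per-identity `sub` claim when no role matches.
    """
    for role in token_claims.get("roles", []):
        if role in role_to_principal:
            return role_to_principal[role]
    return token_claims["sub"]

mapping = {"kafka-writer": "writers", "kafka-reader": "readers"}
print(principal_for({"sub": "alice", "roles": ["kafka-writer"]}, mapping))  # writers
print(principal_for({"sub": "bob", "roles": []}, mapping))                  # bob
```

With such a mapping, ACLs are granted once per shared principal ("writers", "readers") rather than once per identity.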
  9. As a developer / security engineer,
    I want to be able to authenticate (e.g. via Azure AD) and authorize my service users using OAuth2,
    so that I can manage all users in a central place and don't need to worry about missing something during on- and off-boarding.

    11 votes
    1 comment  ·  Karapace
  10. As a developer,
    I want support for the Databricks driver in the Apache Kafka Connect JDBC sink connector,
    so that I can write data from Aiven for Apache Kafka to Databricks Spark for further processing, analysis, and consumption.

    11 votes
  11. As a developer,
    I want to enable audit logging for my service,
    so that I can keep track of all changes made to the service and ensure compliance.

    11 votes
  12. As a developer / security engineer,
    I want to be able to define the lifetime of certificates (e.g. 30-90 days) used to authenticate my service users,
    so that I can put certificate rotation policies in place to ensure compliance and security best practices.

    11 votes
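A rotation policy like the one requested is mostly date arithmetic: rotate some lead time before the certificate's lifetime runs out. A small sketch; the lifetimes and lead time below are illustrative:

```python
from datetime import date, timedelta

def rotation_due(issued_on, lifetime_days=90, lead_days=14):
    """Return the date by which a certificate should be rotated:
    `lead_days` before it expires."""
    return issued_on + timedelta(days=lifetime_days - lead_days)

issued = date(2024, 1, 1)
print(rotation_due(issued))                                  # 2024-03-17
print(rotation_due(issued, lifetime_days=30, lead_days=7))   # 2024-01-24
```

Automation would compare `rotation_due(...)` against today's date and re-issue the service user's certificate when the deadline is reached.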
  13. As a developer,
    I want to read and write data in AWS SQS,
    so that I can enable more architectural solutions and use SQS with other AWS services.

    11 votes
  14. As a data engineer / developer,
    I want to be able to use schema references in AVRO schemas in Karapace,
    so that I can define and reuse complex data structures or types within other schemas.

    10 votes
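In the Confluent-compatible registry API that Karapace implements, schema references are expressed as a `references` list alongside the schema in the registration payload. A hypothetical payload, assuming that API shape; subject names and versions are illustrative:

```python
import json

# Hypothetical body for POST /subjects/<subject>/versions: the Address
# record is registered once under its own subject and referenced by name
# from the Customer schema instead of being inlined.
customer_schema = {
    "schemaType": "AVRO",
    "schema": json.dumps({
        "type": "record",
        "name": "Customer",
        "fields": [
            {"name": "name", "type": "string"},
            {"name": "address", "type": "Address"},  # reference by name
        ],
    }),
    "references": [
        {"name": "Address", "subject": "address-value", "version": 1}
    ],
}
print(json.dumps(customer_schema, indent=2))
```

The registry resolves "Address" through the reference entry, so the shared type is maintained in one place.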
  15. As an SRE / Apache Kafka operator / developer,
    I want to make sure all my topics adhere to a naming convention,
    so that I can ensure consistency across my topics.
    In addition, I can use the naming convention to better identify, group, locate, and categorise my topics.

    10 votes
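Pending a managed guardrail, a convention can be enforced in CI or in topic-creation tooling with a simple pattern check. The convention below is an example, not an Aiven standard:

```python
import re

# Example convention (an assumption): <domain>.<dataset>.<visibility>,
# e.g. "payments.orders.public"
TOPIC_RE = re.compile(r"^[a-z]+\.[a-z0-9-]+\.(public|private)$")

def valid_topic(name):
    """Return True when the topic name matches the example convention."""
    return bool(TOPIC_RE.fullmatch(name))

assert valid_topic("payments.orders.public")
assert not valid_topic("Payments_Orders")
```

Running such a check before every topic-creation request keeps the dotted segments usable for grouping and categorising topics later.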
  16. As a developer,
    I want to run a standalone Apache Kafka Connect cluster against a 3rd-party Apache Kafka service,
    so that I can benefit from the managed Apache Kafka Connect service and read/write data from/to an Apache Kafka service running outside Aiven.

    10 votes
  17. As a data engineer,
    I want Aiven for Apache Kafka Connect to offer the option to use the Protocol Buffers data format when serializing the events it sends to a Kafka broker.
    In my specific case, I need this to be possible in the Debezium connector for PostgreSQL.
    Additionally, it would be good for the user to have the option to define the protobuf schema used for serialization.

    7 votes
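For reference, a connector definition along the requested lines might look as follows. The converter class and option names follow Confluent's Protobuf converter and are assumptions about what a managed offering would expose; hostnames are placeholders:

```python
import json

# Hypothetical Kafka Connect config for a Debezium PostgreSQL source that
# serializes record values as Protobuf via a schema-registry-aware converter.
connector_config = {
    "name": "pg-source-protobuf",
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "pg.example.internal",  # placeholder
    "value.converter": "io.confluent.connect.protobuf.ProtobufConverter",
    "value.converter.schema.registry.url": "https://karapace.example.internal",
}
print(json.dumps(connector_config, indent=2))
```

Letting the user pin a specific protobuf schema (rather than one derived from the table structure) would be an additional converter option on top of this.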
  18. As a developer,
    I want to tag Apache Kafka ACLs,
    so that I can group, search, and identify what my ACLs are used for.

    7 votes
  19. As a developer,
    I want to know the latest offset and max timestamp for a given topic/partition,
    so that I can identify when a topic/partition was last used and clean up stale topics/partitions that are no longer in use.

    6 votes
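Given a per-partition max timestamp, the staleness decision itself is simple. A sketch with an assumed 30-day idle threshold; how the timestamp is obtained is left to the client:

```python
from datetime import datetime, timedelta, timezone

def is_stale(max_timestamp_ms, now=None, max_idle=timedelta(days=30)):
    """Flag a topic/partition as stale when its newest record is older
    than `max_idle`. `max_timestamp_ms` is an epoch timestamp in ms,
    as Kafka reports per record."""
    now = now or datetime.now(timezone.utc)
    newest = datetime.fromtimestamp(max_timestamp_ms / 1000, tz=timezone.utc)
    return now - newest > max_idle

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
old = datetime(2024, 1, 1, tzinfo=timezone.utc).timestamp() * 1000
print(is_stale(old, now=now))  # True
```

A cleanup job would iterate over all topic/partitions, collect their max timestamps, and delete (or report) those flagged stale.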
  20. As a developer,
    I want to integrate my services in the streaming domain, i.e. Apache Kafka, Apache Kafka Connect, Apache MirrorMaker 2, etc., with other services in any project and/or VPC,
    so that I can build data streaming pipelines and benefit from the use of the aforementioned services.

    6 votes