
Event Streaming

Join our forum to discuss your ideas with the Aiven community, or check out our public roadmap.


10 results found

  1. Kafka consumer clients on Aiven for Apache Kafka should be able to use the client.rack configuration introduced in KIP-392

    3 votes
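
    For context, KIP-392 (follower fetching) lets a consumer fetch from a replica in its own availability zone once the broker side permits it; the client opts in by setting client.rack. A minimal consumer sketch, assuming a hypothetical bootstrap address and topic name, and omitting the TLS settings an Aiven service would also need:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class RackAwareConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical bootstrap address; an Aiven service would also need SSL settings here.
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka.example.aivencloud.com:12345");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "rack-aware-group");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                      "org.apache.kafka.common.serialization.StringDeserializer");
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                      "org.apache.kafka.common.serialization.StringDeserializer");
            // KIP-392: advertise the consumer's "rack" (availability zone) so the cluster can
            // serve fetches from a replica in the same zone, if follower fetching is enabled.
            props.put(ConsumerConfig.CLIENT_RACK_CONFIG, "europe-west1-b"); // placeholder zone

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("events")); // placeholder topic
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                records.forEach(r -> System.out.printf("%s-%d@%d: %s%n",
                        r.topic(), r.partition(), r.offset(), r.value()));
            }
        }
    }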


  2. As a developer / security engineer,
    I want to be able to authenticate (e.g. via Azure AD) and authorize my service users using OAuth2,
    so that I can manage all users in a central place and don't need to worry about missing something during on- and off-boarding.

    57 votes
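
    For illustration only: with the OAUTHBEARER support added to Apache Kafka clients in KIP-768, OAuth 2.0 client credentials can be configured roughly as sketched below. The token endpoint, client id and secret are placeholders, and this sketch does not imply the capability is available on Aiven for Apache Kafka today:

    import java.util.Properties;

    public class OAuthClientConfig {

        // All URLs and credentials below are placeholders.
        static Properties oauthProps() {
            Properties props = new Properties();
            props.put("security.protocol", "SASL_SSL");
            props.put("sasl.mechanism", "OAUTHBEARER");
            // Token endpoint of the identity provider (e.g. Azure AD); placeholder tenant id.
            props.put("sasl.oauthbearer.token.endpoint.url",
                      "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token");
            // Built-in handler from KIP-768; in Kafka 3.1-3.3 it lives under the
            // org.apache.kafka.common.security.oauthbearer.secured package instead.
            props.put("sasl.login.callback.handler.class",
                      "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginCallbackHandler");
            // Client-credentials grant: id and secret registered with the identity provider.
            props.put("sasl.jaas.config",
                      "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required"
                      + " clientId=\"<client-id>\" clientSecret=\"<client-secret>\";");
            return props;
        }
    }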


  3. As a developer / user,
    I want to know which Karapace version is currently running,
    so that I can adopt new features and behaviours faster.
    In addition, I would like to have the version exposed in logs, HTTP responses and the Aiven Console.

    8 votes


    0 comments  ·  Karapace  ·  Admin response:

    This is now available if you use the service API, and will be available in the Console shortly.


    A CLI example:


    avn service get $SERVICE_NAME --json


    Alternatively, you can use the GET service information endpoint of the REST API, documented at https://api.aiven.io/doc/#tag/Service/operation/ServiceGet


    This will return the Karapace version in the metadata section of the response


    "metadata": {

    "kafka_version": "3.6.2",

    "karapace_version": "3.12.0",

    "service_version_end_of_life": "2024-10-18T00:00:00+00:00"

    },


  4. As a developer,
    I want to know how much time it will take for my consumers to catch up when lagging behind,
    so that I can programmatically scale my workloads for faster data processing.

    7 votes
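
    One way to approximate this today, sketched below under assumed names (the group id, bootstrap address and consumption rate are placeholders): read the group's committed offsets and the latest log end offsets through the Admin API, sum the difference as total lag, and divide by the rate at which the group drains that lag:

    import java.util.Map;
    import java.util.Properties;
    import java.util.concurrent.ExecutionException;
    import java.util.stream.Collectors;
    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.ListOffsetsResult;
    import org.apache.kafka.clients.admin.OffsetSpec;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;

    public class CatchUpEstimate {
        public static void main(String[] args) throws ExecutionException, InterruptedException {
            Properties props = new Properties();
            // Hypothetical bootstrap address; real services also need SSL/auth settings.
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka.example.aivencloud.com:12345");

            try (Admin admin = Admin.create(props)) {
                // Committed offsets of the consumer group (group id is a placeholder).
                Map<TopicPartition, OffsetAndMetadata> committed =
                    admin.listConsumerGroupOffsets("my-group").partitionsToOffsetAndMetadata().get();

                // Latest log end offsets for the same partitions.
                Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> latest =
                    admin.listOffsets(committed.keySet().stream()
                            .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()))).all().get();

                long totalLag = committed.entrySet().stream()
                    .mapToLong(e -> latest.get(e.getKey()).offset() - e.getValue().offset())
                    .sum();

                // Assumed net drain rate (records/s consumed minus records/s newly produced),
                // e.g. taken from client metrics; placeholder value here.
                double netRatePerSecond = 5_000.0;
                System.out.printf("total lag: %d records, estimated catch-up: %.1f s%n",
                                  totalLag, totalLag / netRatePerSecond);
            }
        }
    }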


  5. As a developer / SRE,
    I want to configure quotas for producers and consumers,
    so that I can make sure there are no rogue Apache Kafka clients exhausting the network bandwidth available to the cluster.
    In addition, I can throttle certain clients to simulate and test different network situations and application use-cases, e.g. consumer group auto-scaling.

    29 votes
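
    For reference, this is how produce/fetch byte-rate quotas are expressed through the Apache Kafka Admin API on clusters where the caller is allowed to alter them; the client id and byte rates are placeholders, and exposing an equivalent control on Aiven for Apache Kafka is what this idea asks for:

    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import java.util.concurrent.ExecutionException;
    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.common.quota.ClientQuotaAlteration;
    import org.apache.kafka.common.quota.ClientQuotaEntity;

    public class ClientQuotas {
        public static void main(String[] args) throws ExecutionException, InterruptedException {
            Properties props = new Properties();
            // Hypothetical bootstrap address; real services also need SSL/auth settings.
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka.example.aivencloud.com:12345");

            try (Admin admin = Admin.create(props)) {
                // Target a specific client.id; a default quota would use an entity with a null value.
                ClientQuotaEntity entity = new ClientQuotaEntity(
                    Map.of(ClientQuotaEntity.CLIENT_ID, "analytics-consumer")); // placeholder client id

                // Throttle producers of that client to 1 MiB/s and its consumers to 2 MiB/s.
                ClientQuotaAlteration alteration = new ClientQuotaAlteration(entity, List.of(
                    new ClientQuotaAlteration.Op("producer_byte_rate", 1_048_576.0),
                    new ClientQuotaAlteration.Op("consumer_byte_rate", 2_097_152.0)));

                admin.alterClientQuotas(List.of(alteration)).all().get();
            }
        }
    }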


  6. As a developer,
    I want to read data from and write data to Azure Blob Storage,
    so that I can use that data for stream processing and analytics, and back up / restore data in Apache Kafka.

    17 votes


  7. As a developer,
    I want to sync my data into ClickHouse,
    so that I can store large volumes of data and run analytics on top of it.

    10 votes


  8. As a DevOps/SecOps engineer,
    I want to securely manage my credentials in Kafka Connect connectors,
    so that I can make sure no credentials are leaked in connector configuration.
    In addition, I can manage my credentials in HashiCorp Vault and rotate credentials without changing my connector configuration.

    27 votes


  9. As a Service Owner,
    I want to enable Schema/REST ACLs,
    so that I can control who has read and write access to my data models.

    28 votes


  10. I would like the option of using low-cost storage to keep my historical Kafka data, so that I can retain recoverability and lower my costs.

    17 votes

