-
Support for "stretched" cluster across multi regions/clouds for Aiven for Apache Kafka
As a developer / SRE,
I want my Apache Kafka cluster to run across multiple regions and potentially even multiple clouds,
so that I can ensure a highly available setup and have near-zero RTO and RPO in case of DR, i.e. a network, region or provider failure.
In addition, I want to benefit from the geo-distributed cluster setup and read/write from/to the geographically closest broker to optimise my network latency and cost.
55 votes
-
Allow uploading custom code (JARs) in Apache Flink and Apache Kafka Connect
As a developer,
I want to be able to upload my custom code (JARs),
so that I can use it to build sophisticated or non-standard use cases in Flink and Kafka Connect.
In addition, I can keep my custom implementation private.
52 votes
-
Certificate file support for Kafka connectors
As a developer / security engineer,
I want to be able to authenticate my Apache Kafka connectors via mTLS,
so that I can connect to my external services in a secure way.
24 votes
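For illustration only, a minimal sketch of what mTLS credentials look like for a Kafka client using the open-source kafka-python library; a connector would carry equivalent certificate, key and CA files in its own configuration. The broker address and file paths are hypothetical.

```python
from kafka import KafkaProducer

# Hypothetical paths: with mTLS the client presents its own certificate and key
# in addition to trusting the CA that signed the broker certificate.
producer = KafkaProducer(
    bootstrap_servers="kafka.example.com:9093",
    security_protocol="SSL",
    ssl_cafile="ca.pem",        # CA used to verify the broker
    ssl_certfile="client.crt",  # client certificate presented to the broker
    ssl_keyfile="client.key",   # private key matching the client certificate
)
producer.send("example-topic", b"hello")
producer.flush()
```
-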
Programmatic Apache Kafka Consumer (group) management
As a developer,
I want to programmatically manage my consumers and consumer groups,
so that I can see their status, perform CRUD operations, list the members of a group, reset offsets, and similar.
In addition, I want to be able to do so in the Aiven Console as well.
19 votes
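As a hedged sketch of what such programmatic management can look like, the snippet below uses the open-source kafka-python admin client to list groups, describe members and inspect committed offsets; the bootstrap address and group name are hypothetical.

```python
from kafka.admin import KafkaAdminClient

# Hypothetical bootstrap address.
admin = KafkaAdminClient(bootstrap_servers="kafka.example.com:9092")

# List all consumer groups known to the cluster.
for group_id, protocol_type in admin.list_consumer_groups():
    print(group_id, protocol_type)

# Show the state and members of one (hypothetical) group.
for desc in admin.describe_consumer_groups(["example-group"]):
    print(desc.group, desc.state, [m.member_id for m in desc.members])

# Inspect committed offsets; resetting them would be done by committing
# new offsets with a consumer or via CLI tooling.
print(admin.list_consumer_group_offsets("example-group"))
```
-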
Schema Validation on Apache Kafka broker side
As a developer,
I want to make sure the schema is validated not just on the client side but also on the broker side,
so that I can be sure all messages in the topic conform to the same schema and the topic does not contain any mixed schemas.
18 votes
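To make the client-versus-broker distinction concrete, here is a minimal, hedged sketch of the client-side validation that exists today: the producer checks a payload against a hypothetical JSON schema before serializing, but nothing stops a misbehaving producer from skipping this check, which is what broker-side validation would address.

```python
import json
from jsonschema import validate

# Hypothetical schema; only the producing client enforces it today.
order_schema = {
    "type": "object",
    "properties": {"id": {"type": "string"}, "amount": {"type": "number"}},
    "required": ["id", "amount"],
}

def serialize_if_valid(message: dict) -> bytes:
    # Raises jsonschema.ValidationError if the message does not match.
    validate(instance=message, schema=order_schema)
    return json.dumps(message).encode("utf-8")

serialize_if_valid({"id": "o-1", "amount": 9.99})
```
-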
Run Karapace as a dedicated managed service
As a developer,
I want to run Karapace as a fully managed, dedicated service,
so that I can use it with Apache Kafka running both on and outside of Aiven.
In addition, I can use the same Karapace service against multiple Apache Kafka services.
17 votes
-
Support for public CA for SASL for Aiven for Apache Kafka
As a developer,
I want to use a public CA (e.g. Let's Encrypt),
so that I can connect to my Aiven for Apache Kafka service without installing any additional certificates.
In addition, I can trust the public authority issuing the certificates instead of having to validate 3rd-party certificates.
16 votes
-
Support for AWS S3 Source Kafka Connector
As a developer / data engineer,
I want to be able to read data stored in an AWS S3 bucket,
so that I can transfer, process and transform that data for other applications.
In addition, I can use data stored in an S3 bucket as a backup and rehydrate my Apache Kafka cluster from it.
16 votes
-
Custom Kafka Connectors on Aiven (Bring your own Connector)
As a developer, I want to use custom connectors on Aiven,
so that I can integrate my proprietary data systems and custom applications with Apache Kafka without having to manage the underlying infrastructure. In addition, this allows developers to concentrate on building business-critical applications instead of getting tied up with infrastructure tasks.
15 votes
-
Kafka versioned updates
As a platform engineer,
I want service updates to be versioned and to be able to select a specific version (Kafka, Karapace, etc.) to update to,
so that I can perform correct change promotion from non-prod to production (instead of being forced to always apply the latest), and so that a new version released while an update is in flight does not lead to different versions running across the brokers (as is currently the case, which we have experienced can lead to incompatibilities). Additionally, I would like it to be clear what version(s) is currently running in…
12 votes
This is partially being realised: users are able to view which version is available and what update will take place.
-
Support Grouping of OAuth2/OIDC Users
As a platform engineer,
I want to group multiple users based on their role (an OAuth2/OIDC claim),
so that I can reduce the number of Kafka users and ACL entries that need to be managed.
Currently, every user / identity connecting via OAuth2/OIDC has a 1:1 mapping to a Kafka user (the username is taken from the sub claim). This is cumbersome and leads to significant overhead if, for example, multiple identities / users with the same permissions want to access the Kafka service. Kafka users and ACLs need to be created for every single identity, even though they share…
11 votes
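For illustration, a small hedged sketch of the mapping this idea asks for; the claim names and values below are hypothetical, but they contrast the current sub-based 1:1 mapping with a group-based one.

```python
# Hypothetical decoded OIDC access-token claims.
token_claims = {
    "sub": "user-42",                  # today: becomes the Kafka username, 1:1
    "groups": ["payments-consumers"],  # requested: map a role/group claim instead
}

# Current behaviour: each identity needs its own Kafka user and ACL entries.
principal_today = token_claims["sub"]

# Requested behaviour: all members of the group share one Kafka user / ACL set.
principal_requested = token_claims["groups"][0]
print(principal_today, principal_requested)
```
-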
Support Databricks driver for Apache Kafka Connect JDBC sink connector
As a developer,
I want support for a Databricks driver in the Apache Kafka Connect JDBC sink connector,
so that I can write data from Aiven for Apache Kafka to Databricks Spark for further processing, analysis and consumption.
11 votes
-
Audit logging as self-service option
As a developer,
I want to enable audit logging for my service,
so that I can keep track of the breadcrumbs left by all the changes made to a service and ensure compliance.
11 votes
-
Temporarily disable service user on Aiven for Apache Kafka
As a developer,
I want to temporarily disable a service user,
so that I can test whether the service user is still in use by any of my applications, or temporarily ban abusive clients.
11 votes
-
Capture and expose "last used" date for service users
As a developer / SRE,
I want to know when service users were last used,
so that I can tell whether a service user is still in use or has been abandoned and can be removed.
11 votes
-
AWS SQS Source and Sink connector
As a developer,
I want to read and write data in AWS SQS,
so that I can enable more architectural solutions and use SQS with other AWS services.
11 votes
-
Karapace OAuth2 support
As a developer / security engineer,
I want to be able to authenticate (e.g. via Azure AD) and authorize my service users using OAuth2,
so that I can manage all users in a central place and don't need to worry about missing something during on- and off-boarding.
10 votes
-
Configurable maximum validity for service user certificates
As a developer / security engineer,
I want to be able to define the lifetime of the certificates (e.g. 30-90 days) used to authenticate my service users,
so that I can have certificate rotation policies in place to ensure compliance and security best practices.
10 votes
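A minimal sketch, assuming the cryptography library and a hypothetical certificate path, of the kind of expiry check such a rotation policy implies; it is illustrative only, not Aiven functionality.

```python
from datetime import datetime
from cryptography import x509

# Hypothetical path to a service user certificate.
with open("service-user.crt", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

# How long until the certificate expires (naive UTC datetimes).
remaining = cert.not_valid_after - datetime.utcnow()
if remaining.days < 14:
    print(f"Rotate soon: certificate expires in {remaining.days} days")
```
-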
Enforcing naming convention for Apache Kafka topics
As an SRE / Apache Kafka operator / developer,
I want to make sure all my topics adhere to a naming convention,
so that I can ensure consistency of my topics.
In addition, I can use the naming convention to better identify, group, locate and categorise my topics.
10 votes
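As a hedged illustration of what enforcement could look like, the sketch below validates topic names against a hypothetical <domain>.<dataset>.<version> convention before they are created; the pattern itself is an assumption, not an Aiven feature.

```python
import re

# Hypothetical convention: <domain>.<dataset>.<version>, e.g. "billing.invoices.v1".
TOPIC_NAME_PATTERN = re.compile(r"^[a-z]+\.[a-z][a-z0-9-]*\.v\d+$")

def is_valid_topic_name(name: str) -> bool:
    """Check a topic name against the (hypothetical) convention before creating it."""
    return TOPIC_NAME_PATTERN.fullmatch(name) is not None

assert is_valid_topic_name("billing.invoices.v1")
assert not is_valid_topic_name("Misc_Topic")
```
-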
Support for Apache Kafka Connect to be run standalone against a 3rd party Apache Kafka service
As a developer,
I want to run a standalone Apache Kafka Connect cluster against a 3rd-party Apache Kafka service,
so that I can benefit from the managed Apache Kafka Connect service and read/write data from/to an Apache Kafka service running outside Aiven.
10 votes