55 results found
-
Support for "stretched" cluster across multi regions/clouds for Aiven for Apache Kafka
As a developer / SRE,
I want my Apache Kafka cluster to run across multiple regions and potentially even clouds,
so that I can ensure a highly available setup and have near-zero RTO and RPO in case of disaster recovery, i.e. a network, region or provider failure.
In addition, I want to benefit from the geo-distributed cluster setup and read from / write to the geographically closest broker to optimise my network latency and cost.
55 votes
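A minimal sketch of the geo-local read part of this idea, using the follower-fetching mechanism that already exists in Apache Kafka (KIP-392). The bootstrap address, topic and rack name are illustrative, and the brokers would need replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector set:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class NearestReplicaConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-stretched.example.aivencloud.com:12345"); // illustrative
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "geo-local-readers");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // With follower fetching enabled on the brokers (KIP-392), client.rack tells the
        // cluster which region this consumer sits in, so it reads from the closest replica.
        props.put(ConsumerConfig.CLIENT_RACK_CONFIG, "europe-west1");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("clickstream"));
            consumer.poll(Duration.ofSeconds(1)).forEach(r -> System.out.println(r.value()));
        }
    }
}
```

-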
Allow uploading of inspected custom code (JARs) in Apache Flink and Apache Kafka Connect
As a developer,
I want to be able to upload my custom code (JARs),
so that I can use it to build sophisticated or non-standard use cases in Flink and Kafka Connect.
In addition, I can keep my custom implementation private.
52 votes
-
Programmatic Apache Kafka Consumer (group) management
As a developer,
I want to programmatically manage my consumers and consumer groups,
so that I can see their status, perform CRUD operations, show the members of a group, reset offsets, and so on.
In addition, I want to be able to do the same in the Aiven Console.
19 votes
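A sketch of what the programmatic side could look like with the standard Apache Kafka Admin API, which already covers listing, describing and offset resets; the bootstrap address and group name are illustrative:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ConsumerGroupDescription;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class ConsumerGroupAdmin {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.aivencloud.com:12345"); // illustrative

        try (Admin admin = Admin.create(props)) {
            // List every consumer group known to the cluster.
            admin.listConsumerGroups().all().get()
                 .forEach(g -> System.out.println(g.groupId()));

            // Describe one group: its state and current members.
            ConsumerGroupDescription group = admin
                .describeConsumerGroups(List.of("my-group"))
                .describedGroups().get("my-group").get();
            System.out.println(group.state() + ", members: " + group.members().size());

            // Read the group's committed offsets ...
            Map<TopicPartition, OffsetAndMetadata> committed = admin
                .listConsumerGroupOffsets("my-group")
                .partitionsToOffsetAndMetadata().get();

            // ... and rewind every partition to offset 0 (only allowed while the group is empty).
            Map<TopicPartition, OffsetAndMetadata> rewound = new HashMap<>();
            committed.keySet().forEach(tp -> rewound.put(tp, new OffsetAndMetadata(0L)));
            admin.alterConsumerGroupOffsets("my-group", rewound).all().get();
        }
    }
}
```

-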
Schema Validation on Apache Kafka broker side
As a developer,
I want to make sure the schema is validated not just on the client side but also on the broker side,
so that I can make sure all messages in a topic correspond to the same schema and the topic does not contain mixed schemas.
18 votes
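For context, a sketch of where validation happens today: only in schema-registry-aware clients, such as the Avro serializer below (service addresses are illustrative). Nothing prevents a differently configured producer from writing non-conforming bytes to the same topic, which is the gap broker-side validation would close:

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ClientSideValidationOnly {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.aivencloud.com:12345");        // illustrative
        props.put("schema.registry.url", "https://karapace.example.aivencloud.com"); // illustrative
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // The schema-registry-aware serializer validates records here, in the client.
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");

        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");
        GenericRecord order = new GenericData.Record(schema);
        order.put("id", "42");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // A differently configured client could still write arbitrary bytes to this topic;
            // only the broker can guarantee every accepted record matches the schema.
            producer.send(new ProducerRecord<>("orders", "42", order));
        }
    }
}
```

-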
Support for public CA for SASL for Aiven for Apache Kafka
As a developer,
I want to use a public CA (e.g. Let's Encrypt),
so that I can connect to my Aiven for Apache Kafka service without installing any additional certificates.
In addition, I can trust the public authority issuing the certificates instead of validating 3rd-party certificates.
16 votes
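A sketch of the client configuration this would enable, assuming a SCRAM user and a broker certificate issued by a publicly trusted CA; note the absence of any ssl.truststore.* settings (hostname and credentials are illustrative):

```java
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.config.SaslConfigs;

public class PublicCaSaslClient {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.aivencloud.com:12345"); // illustrative
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "SCRAM-SHA-512");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
            "org.apache.kafka.common.security.scram.ScramLoginModule required "
            + "username=\"svc-user\" password=\"<secret>\";");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // No ssl.truststore.location / ssl.truststore.password: with a certificate from a
        // publicly trusted CA, the JVM's default trust store is enough to verify the broker.

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("healthcheck", "ping"));
        }
    }
}
```

-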
Support Grouping of OAuth2/OIDC Users
As a platform engineer,
I want to group multiple users based on their role (OAuth2/OIDC claim),
so that I can reduce the number of required Kafka users and ACL entries that need to be managed.
Currently, every user / identity connecting via OAuth2/OIDC has a 1:1 mapping to a Kafka user (the username is taken from the sub claim). This is cumbersome and leads to significant overhead if, for example, multiple identities / users with the same permissions want to access the Kafka service. Kafka users and ACLs need to be created for every single identity, even though they share…
11 votes
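A sketch of the per-identity overhead described above, expressed with Kafka's native ACL Admin API (principal names, topic prefix and addresses are illustrative; on Aiven the ACLs may instead be managed through the Console/API, but the 1:1 growth is the same):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class PerIdentityAcls {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.aivencloud.com:12345"); // illustrative

        // Today: one ACL entry per OIDC identity (sub claim), even when they share a role.
        List<String> subClaims = List.of("svc-analytics-01", "svc-analytics-02", "svc-analytics-03");
        List<AclBinding> bindings = new ArrayList<>();
        for (String sub : subClaims) {
            bindings.add(new AclBinding(
                new ResourcePattern(ResourceType.TOPIC, "analytics.", PatternType.PREFIXED),
                new AccessControlEntry("User:" + sub, "*", AclOperation.READ, AclPermissionType.ALLOW)));
        }
        try (Admin admin = Admin.create(props)) {
            admin.createAcls(bindings).all().get();
        }
        // With claim-based grouping, a single ACL for the shared role would replace all of these.
    }
}
```

-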
Support Databricks driver for Apache Kafka Connect JDBC sink connector
As a developer,
I want support for the Databricks driver in the Apache Kafka Connect JDBC sink connector,
so that I can write data from Aiven for Apache Kafka to Databricks Spark for further processing, analysis and consumption.
11 votes
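Roughly the connector configuration this request would enable, assuming the Databricks JDBC driver (com.databricks.client.jdbc.Driver) were available on the Connect workers; the connector class, JDBC URL and credentials below are illustrative:

```java
import java.util.Map;

public class DatabricksSinkConfig {
    // Illustrative JDBC sink configuration that could be submitted to the Connect REST API
    // if the Databricks driver were present on the workers.
    public static Map<String, String> config() {
        return Map.of(
            "connector.class", "io.aiven.connect.jdbc.JdbcSinkConnector",
            "topics", "orders",
            "connection.url",
            "jdbc:databricks://dbc-example.cloud.databricks.com:443;httpPath=/sql/1.0/warehouses/abc123",
            "connection.user", "token",
            "connection.password", "<personal-access-token>",
            "auto.create", "true",
            "insert.mode", "insert");
    }
}
```

-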
Audit logging as self-service option
As a developer,
I want to enable audit logging for my service,
so that I can keep track of breadcrumbs left by all the changes made to a service to ensure compliance.
11 votes
-
Temporarily disable service user on Aiven for Apache Kafka
As a developer,
I want to temporarily disable a service user,
so that I can test whether the service user is still in use by any of the applications, or temporarily ban abusive clients.
11 votes
-
Capture and expose "last used" date for service users
As a developer / SRE,
I want to know when service users were last used,
so that I can tell whether a service user is still in use or has been abandoned and can be removed.
11 votes
-
AWS SQS Source and Sink connector
As a developer,
I want to read and write data in AWS SQS
so that I can enable more architectural solutions and use SQS with other AWS services.
11 votes
-
Karapace OAuth2 support
As a developer / security engineer,
I want to be able to authenticate (e.g. against Azure AD) and authorize (OAuth) my service users using OAuth2,
so that I can manage all users in a central place and don't need to worry about missing something during on- and off-boarding.
10 votes
-
Configurable maximum validity for service user certificates
As a developer / security engineer,
I want to be able to define the lifetime of the certificates (e.g. 30-90 days) used to authenticate my service users,
so that I can have certificate rotation policies in place to ensure compliance and security best practices.
10 votes
-
Enforcing naming convention for Apache Kafka topics
As an SRE / Apache Kafka operator / developer,
I want to make sure all my topics adhere to a naming convention,
so that I can ensure consistency of my topics.
In addition, I can use the naming convention to better identify, group, locate and categorise my topics.
10 votes
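A sketch of what broker-side enforcement could look like using Apache Kafka's existing CreateTopicPolicy plug-in point (create.topic.policy.class.name); the naming pattern is only an example, and on a managed service this would need to be exposed as a configuration option rather than a custom plug-in:

```java
import java.util.Map;
import org.apache.kafka.common.errors.PolicyViolationException;
import org.apache.kafka.server.policy.CreateTopicPolicy;

public class NamingConventionPolicy implements CreateTopicPolicy {
    // Example convention: <domain>.<dataset>.<format>, e.g. payments.orders.avro
    private static final String PATTERN = "^[a-z]+\\.[a-z0-9-]+\\.(avro|json)$";

    @Override
    public void configure(Map<String, ?> configs) { }

    @Override
    public void validate(RequestMetadata requestMetadata) throws PolicyViolationException {
        // Reject topic creation requests whose name does not match the convention.
        if (!requestMetadata.topic().matches(PATTERN)) {
            throw new PolicyViolationException(
                "Topic name '" + requestMetadata.topic() + "' does not match <domain>.<dataset>.<format>");
        }
    }

    @Override
    public void close() { }
}
```

-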
Support for Apache Kafka Connect to be run standalone against a 3rd party Apache Kafka service
As a developer,
I want to run a standalone Apache Kafka Connect cluster against a 3rd-party Apache Kafka service,
so that I can benefit from the managed Apache Kafka Connect service and read/write data from/to an Apache Kafka service running outside Aiven.
10 votes
-
Schema references support for AVRO schema in Karapace
As a data engineer / developer,
I want to be able to use schema references in AVRO schemas in Karapace,
so that I can define and reuse complex data structures or types within other schemas.
8 votes
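A sketch of how a schema with a reference is registered through the Schema Registry-compatible REST API that Karapace implements; the subject names, reference payload and host are illustrative:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterSchemaWithReference {
    public static void main(String[] args) throws Exception {
        // An Order record referring to a previously registered Address type via "references".
        String payload = """
            {
              "schemaType": "AVRO",
              "schema": "{\\"type\\":\\"record\\",\\"name\\":\\"Order\\",\\"fields\\":[{\\"name\\":\\"shipping\\",\\"type\\":\\"com.example.Address\\"}]}",
              "references": [
                {"name": "com.example.Address", "subject": "address-value", "version": 1}
              ]
            }""";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://karapace.example.aivencloud.com/subjects/order-value/versions")) // illustrative
            .header("Content-Type", "application/vnd.schemaregistry.v1+json")
            .POST(HttpRequest.BodyPublishers.ofString(payload))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

-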
Support for Protobuf serialization of events (and keys) in Aiven Kafka Connect
As a data engineer,
I want Aiven Kafka Connect to offer the option to use the protocol buffer data format when serializing the events that it sends to a Kafka broker.
In my specific case, I need it to be possible in a Debezium connector for PostgreSQL.
Additionally, it would be good for the user to have the option to define the protobuf schema used for serialization themselves.
7 votes
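A sketch of the Debezium connector configuration this request describes, with keys and values converted to Protobuf through a schema-registry-backed converter, assuming the converter is available on the workers; host names and credentials are illustrative:

```java
import java.util.Map;

public class DebeziumProtobufConfig {
    // Illustrative Debezium PostgreSQL source configuration with Protobuf converters.
    public static Map<String, String> config() {
        return Map.ofEntries(
            Map.entry("connector.class", "io.debezium.connector.postgresql.PostgresConnector"),
            Map.entry("database.hostname", "pg.example.aivencloud.com"),
            Map.entry("database.port", "5432"),
            Map.entry("database.user", "debezium"),
            Map.entry("database.password", "<secret>"),
            Map.entry("database.dbname", "shop"),
            Map.entry("topic.prefix", "shop"),
            Map.entry("key.converter", "io.confluent.connect.protobuf.ProtobufConverter"),
            Map.entry("key.converter.schema.registry.url", "https://karapace.example.aivencloud.com"),
            Map.entry("value.converter", "io.confluent.connect.protobuf.ProtobufConverter"),
            Map.entry("value.converter.schema.registry.url", "https://karapace.example.aivencloud.com"));
    }
}
```

-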
Tagging functionality for Apache Kafka ACLs
As a developer,
I want to tag Apache Kafka ACLs,
so that I can group, search and identify what my ACLs are used for.
7 votes
-
Add MaxTimestampSpec data to the topic/partition details from the Aiven API
As a developer,
I want to know the latest offset and max timestamp for a given topic/partition,
so that I can identify when a topic/partition was last used and clean up stale topics/partitions that are no longer in use.
6 votes
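For reference, the Apache Kafka Admin API already exposes this information via OffsetSpec.maxTimestamp() (KIP-734); a sketch of the lookup the Aiven API could surface, with an illustrative bootstrap address and topic:

```java
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.common.TopicPartition;

public class MaxTimestampLookup {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.aivencloud.com:12345"); // illustrative

        try (Admin admin = Admin.create(props)) {
            TopicPartition tp = new TopicPartition("orders", 0);
            ListOffsetsResult.ListOffsetsResultInfo info = admin
                .listOffsets(Map.of(tp, OffsetSpec.maxTimestamp()))
                .partitionResult(tp)
                .get();
            // timestamp() is the max record timestamp in the partition, offset() the offset holding it;
            // a very old timestamp suggests the partition is stale and a candidate for cleanup.
            System.out.println("max timestamp=" + info.timestamp() + ", offset=" + info.offset());
        }
    }
}
```

-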
Support for cross-project and cross-vpc service integration for all services in streaming domain
As a developer,
I want to integrate my services in the streaming domain (i.e. Apache Kafka, Apache Kafka Connect, Apache MirrorMaker 2, etc.) with other services in any project and/or VPC,
so that I can build data streaming pipelines and benefit from the use of the aforementioned services.
6 votes