Data Protection
Unblock Revenue. Protect Cash Flow. Prove Compliance.
Conduktor protects data at the wire level and governs access control centrally, so security is a feature, not a blocker.

Kafka Scales. Security Doesn't Keep Up.
Data flows through Kafka in plaintext. Access controls stop at the topic level. Teams build their own encryption. And when an auditor asks who accessed what, nobody has a clear answer.
Revenue stuck on security approvals
There's no centralized way to prove data is encrypted or access is governed. InfoSec ends up manually reviewing every new application, delaying data initiatives by months.
Margins shrink from duplicated work
Each team builds its own encryption and manages its own access controls. Over time, implementations drift, fields get missed, and a single gap can escalate into an organization-wide incident.
Cash flow at risk from fines
GDPR fines reach up to 4% of global annual revenue, and HIPAA violations run up to $2M per violation category. Meanwhile, Kafka gives you topic-level ACLs and no audit trail of who accessed what data.
Encryption at every level
Full-payload, field-level, and header-level encryption with eight algorithms including AES-GCM and ChaCha20-Poly1305. One policy, every application, zero code changes. Plug in your own KMS.
Crypto-shredding for GDPR
Encrypt each record with a per-subject key. When a user invokes the right to erasure, destroy the key and every record becomes permanently unreadable. No topic deletion, no reprocessing.
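The key-destruction mechanic can be sketched in a few lines of Python. This is a toy cipher for illustration only (a real deployment would use AES-GCM with per-subject keys held in your KMS); the point is that destroying the key, not the records, is what makes the data unreadable.

```python
import hashlib
import secrets

# Toy keystream cipher -- NOT real encryption, illustration only.
def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256-derived keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

subject_keys = {}  # one key per subject (e.g. per customer ID)

def encrypt_record(subject_id: str, payload: bytes) -> bytes:
    key = subject_keys.setdefault(subject_id, secrets.token_bytes(32))
    return keystream_xor(key, payload)

def decrypt_record(subject_id: str, ciphertext: bytes) -> bytes:
    key = subject_keys.get(subject_id)
    if key is None:
        raise KeyError("key shredded: record is permanently unreadable")
    return keystream_xor(key, ciphertext)

ct = encrypt_record("customer-42", b"email=jane@example.com")
assert decrypt_record("customer-42", ct) == b"email=jane@example.com"

# Right to erasure: destroy the key, leave the append-only log alone.
del subject_keys["customer-42"]
# decrypt_record("customer-42", ct) now raises KeyError
```

The records stay in the topic untouched; only the key material is deleted.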
Dynamic data masking
Mask fields per consumer at the wire level. A support team sees masked credit card numbers. A fraud team sees full data.
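A minimal sketch of per-consumer masking, with a hypothetical policy table (in practice the policy lives in Gateway configuration, not application code):

```python
# Hypothetical masking policy: which fields each consumer group sees masked.
MASK_POLICIES = {
    "support-team": {"card_number"},  # masked for support
    "fraud-team": set(),              # fraud analysts see full data
}

def mask_record(record: dict, consumer_group: str) -> dict:
    masked_fields = MASK_POLICIES.get(consumer_group, set())
    out = {}
    for field, value in record.items():
        if field in masked_fields and isinstance(value, str):
            # keep the last 4 characters, mask the rest
            out[field] = "*" * max(len(value) - 4, 0) + value[-4:]
        else:
            out[field] = value
    return out

record = {"card_number": "4111111111111111", "amount": 42}
assert mask_record(record, "support-team")["card_number"] == "************1111"
assert mask_record(record, "fraud-team")["card_number"] == "4111111111111111"
```

Both groups consume the same topic; the view differs per identity.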
Tokenization
Format-preserving tokens via HashiCorp Vault Transform. Run analytics on protected fields without exposing the real values.
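A toy sketch of the tokenization idea: tokens keep the original field's length and digit-only shape, so analytics (joins, group-bys) still work on them. Real deployments use Vault's Transform Engine with format-preserving encryption; the in-memory vault mapping below is purely illustrative.

```python
import secrets

# Illustrative token vault: token -> original value, held only by the vault.
vault = {}

def tokenize(card_number: str) -> str:
    """Replace a card number with a same-shape random token."""
    token = "".join(secrets.choice("0123456789") for _ in card_number)
    vault[token] = card_number
    return token

tok = tokenize("4111111111111111")
assert len(tok) == 16 and tok.isdigit()  # same length, same shape
assert tok != "4111111111111111"         # real value never exposed
```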
Cryptographic signing
HMAC-SHA256 message signing at the gateway. Consumers verify integrity on fetch. Tampered payloads get flagged, meeting chain-of-custody requirements.
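The sign-on-produce, verify-on-fetch flow maps directly onto Python's standard hmac module. The key below is illustrative; in practice it would come from your KMS.

```python
import hashlib
import hmac

SIGNING_KEY = b"shared-gateway-secret"  # illustrative; sourced from a KMS

def sign(payload: bytes) -> bytes:
    """HMAC-SHA256 signature, attached as a header on produce."""
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, signature: bytes) -> bool:
    """Constant-time comparison avoids timing side channels."""
    return hmac.compare_digest(sign(payload), signature)

msg = b'{"order_id": 7, "total": 99.5}'
sig = sign(msg)
assert verify(msg, sig)                    # untouched payload passes
assert not verify(msg + b"tampered", sig)  # modified payload is flagged
```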
Schema-aware protection
Mark sensitive fields in Avro, JSON, or Protobuf schemas. Gateway encrypts them automatically on produce and decrypts on authorized fetch, no client changes.
Data quality enforcement
Validate payloads against CEL or SQL rules at the proxy. Block or redirect non-compliant messages before they reach consumers.
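A rough sketch of the block-or-redirect flow, with a plain Python predicate standing in for a CEL or SQL rule (topic names are hypothetical):

```python
def rule(payload: dict) -> bool:
    """Stand-in for a CEL/SQL rule: amount positive, currency present."""
    return payload.get("amount", 0) > 0 and "currency" in payload

def route(payload: dict) -> str:
    # Non-compliant messages never reach downstream consumers:
    # they are blocked or redirected to a dead-letter topic.
    return "orders" if rule(payload) else "orders.dlq"

assert route({"amount": 10, "currency": "EUR"}) == "orders"
assert route({"amount": -5}) == "orders.dlq"
```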
Application audit trail
Every application action logged: which service produced to which topic and which consumer read what, with IP, timestamp, and payload.
RBAC & SSO
SSO via OIDC and LDAP. User and group-based roles. Assign permissions at the resource level.
Data masking
Mask PII and sensitive fields per user group in the Console UI. Different teams see different data.
User audit trail
Every user action logged with IP, timestamp, and payload. 70+ event types across Kafka, IAM, and governance actions.
ACL & service account management
Create and manage Kafka ACLs and service accounts directly. View, edit, and delete access rules across clusters.
How they work together
Console governs who and what can access your resources. Gateway governs what happens to the data as it flows.
Console manages access for people (RBAC, SSO, masking) and applications (ACLs, service accounts). Gateway protects data in motion with encryption, tokenization, crypto-shredding, and cryptographic signing.
Console's audit trail covers user and admin actions. Gateway's covers application-level activity. Together you can show who accessed what and that data is encrypted and validated.
Encryption comparison
| Feature | Cluster-side encryption | Client-side field level | Conduktor encryption |
|---|---|---|---|
| Encryption type | In-transit and at-rest (not end-to-end) | End-to-end (in-transit and at-rest) | End-to-end (in-transit and at-rest) |
| Granularity | Entire payload | Field-level | Full-payload, field-level, or header-level |
| Implementation | Requires configuration changes | Requires changes on each client | Configure once, applied everywhere |
| Regulatory compliance | Limited (not end-to-end) | Better for PII and sensitive data | PCI-DSS, GDPR, HIPAA |
| Multi-cluster | Depends on cluster setup | Limited to clusters on MSP | Works across any Kafka vendor |
| Audit readiness | Minimal | Moderate | Centralized policy visibility |
| Latency impact | Low | Moderate, depends on client processing | Low, optimized at the proxy |
HashiCorp Vault
Secret management, key rotation, and tokenization via Transform Engine.
AWS KMS
Native integration with AWS Key Management Service.
Azure Key Vault
Microsoft Azure's managed key solution.
Google Cloud KMS
Google Cloud Key Management for encryption keys.
Fortanix DSM
HSM-backed key management with confidential computing for regulated workloads.
85% faster security implementation
Per project vs. custom build. InfoSec signs off once, not once per team.
$200K+ annual savings
No more custom encryption builds. Fewer incidents from bad data. Less time on audit prep.
500+ hours saved per year
Across encryption implementation, access management, and audit preparation.
75% faster team onboarding
SSO, RBAC, and self-service ACLs mean new teams get access in minutes, not weeks of ticket queues.
vs. Custom encryption per team
Implementations drift. Fields get missed. Conduktor encrypts once at the wire level across all applications.
vs. Native Kafka ACLs
Topic-level only. No field masking, no encryption, no audit trail of who accessed what. Conduktor adds all three.
vs. Managing access manually
SSH access, shared credentials, and ticket queues for every permission change. Conduktor gives you SSO, RBAC, and self-service ACL management.
Frequently asked questions
What's the difference between Conduktor encryption and Kafka's built-in TLS?
Kafka's TLS only encrypts data in transit between brokers and clients. Once data reaches the broker it's decrypted, and it sits in plaintext on disk. Anyone with broker or disk access can read it. Conduktor encrypts the payload (or specific fields) before it reaches Kafka, using your KMS keys. Data at rest stays as ciphertext. TLS protects the wire; Conduktor protects the data itself.
Does Conduktor encryption require code changes?
No. Gateway encrypts at the wire level by intercepting messages before they reach Kafka. Producers and consumers keep using standard Kafka clients. Nothing changes on their end.
Can different consumers see different data from the same topic?
Yes. Gateway masks fields dynamically based on consumer identity. A support team sees masked credit card numbers. A fraud team sees the full data. Same topic.
How do I integrate with my existing KMS?
Gateway supports HashiCorp Vault, AWS KMS, Azure Key Vault, Google Cloud KMS, and Fortanix DSM. Configure the connection once. Gateway handles key retrieval from there. Vault's Transform Engine is also supported for tokenization.
How do you handle GDPR right-to-erasure when Kafka is append-only?
Crypto-shredding. Gateway encrypts records with a per-subject key (say, one key per customer ID). When a user invokes their right to be forgotten, you destroy that key. Every record encrypted with it becomes permanently unreadable. No topic deletion, no reprocessing, no reading every log to scrub PII.
What compliance frameworks does this cover?
Console's audit logs and RBAC address SOC2, ISO 27001, and GDPR access control requirements. Gateway's encryption and masking address PCI-DSS, HIPAA, and GDPR encryption mandates. One covers who did what. The other covers how data is protected.
Does this work with Confluent Cloud, AWS MSK, and self-managed Kafka?
Yes. Works with any Kafka distribution. The security layer sits in front of your cluster regardless of the vendor.
How does access control work?
Console manages access for people (SSO, RBAC by user or group, UI masking) and applications (ACLs, service accounts). Gateway adds virtual cluster isolation, centralized authentication, and per-tenant policies. Both produce their own audit trails.
Have other questions?
Drop us a line and we'll get back to you.
Ready to protect your Kafka data?
See how Conduktor handles encryption, access controls, and audit trails across your Kafka infrastructure.
