SSO Guide (Deprecated)

Deprecated. See the OAuth2 guides instead.


How to configure SSO

SSO additionally requires TLS to be configured for the application. In this example we will use a self-signed certificate; if you are using a CA-signed certificate, skip step 1.

Step 1

In this step we will generate a self-signed certificate in a PKCS12 keystore.

mkdir cert
keytool -genkeypair -alias ui-for-apache-kafka -keyalg RSA -keysize 2048 \
  -storetype PKCS12 -keystore cert/ui-for-apache-kafka.p12 -validity 3650
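
If you want to double-check what was generated, you can list the keystore entries (keytool will prompt for the password you chose above):

keytool -list -v -storetype PKCS12 -keystore cert/ui-for-apache-kafka.p12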

Step 2

Create a new application in your SSO provider of choice; we will continue with Auth0.

After that you need to provide the callback URL; in our case we will use https://127.0.0.1:8080/login/oauth2/code/auth0 (the last path segment must match the registration name, auth0, used in the environment variables below).

These are the main parameters required for enabling SSO.
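
As a quick sanity check of the issuer you are about to configure, you can fetch the provider's OpenID Connect discovery document (shown here for the example Auth0 tenant used below; substitute your own domain):

curl https://dev-a63ggcut.auth0.com/.well-known/openid-configuration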

Step 3

To launch Kafbat UI with TLS and SSO enabled, run the following:

docker run -p 8080:8080 -v `pwd`/cert:/opt/cert -e AUTH_TYPE=OAUTH2 \
  -e SERVER_SSL_KEY_STORE_TYPE=PKCS12 \
  -e SERVER_SSL_KEY_STORE=/opt/cert/ui-for-apache-kafka.p12 \
  -e SERVER_SSL_KEY_STORE_PASSWORD=123456 \
  -e SERVER_SSL_KEY_ALIAS=ui-for-apache-kafka \
  -e SERVER_SSL_ENABLED=true \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_CLIENTID=uhvaPKIHU4ZF8Ne4B6PGvF0hWW6OcUSB \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_CLIENTSECRET=YXfRjmodifiedTujnkVr7zuW9ECCAK4TcnCio-i \
  -e SPRING_SECURITY_OAUTH2_CLIENT_PROVIDER_AUTH0_ISSUER_URI=https://dev-a63ggcut.auth0.com/ \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_SCOPE=openid \
  -e TRUST_STORE=/opt/cert/ui-for-apache-kafka.p12 \
  -e TRUST_STORE_PASSWORD=123456 \
ghcr.io/kafbat/kafka-ui
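
Once the container is running, you can confirm that the keystore is being picked up and TLS is served; the -k flag is needed because the certificate from step 1 is self-signed:

curl -vk https://127.0.0.1:8080/ -o /dev/null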

If you have a trusted CA-signed SSL certificate and SSL termination happens somewhere outside the application, you can pass only the SSO-related environment variables:

docker run -p 8080:8080 -v `pwd`/cert:/opt/cert -e AUTH_TYPE=OAUTH2 \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_CLIENTID=uhvaPKIHU4ZF8Ne4B6PGvF0hWW6OcUSB \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_CLIENTSECRET=YXfRjmodifiedTujnkVr7zuW9ECCAK4TcnCio-i \
  -e SPRING_SECURITY_OAUTH2_CLIENT_PROVIDER_AUTH0_ISSUER_URI=https://dev-a63ggcut.auth0.com/ \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_SCOPE=openid \
ghcr.io/kafbat/kafka-ui
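
With OAuth2 enabled, an unauthenticated request to the UI should be sent into the login flow rather than served directly; a rough way to check this is to look at the response status code (expect a redirect such as 302, not 200):

curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:8080/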

Step 4 (Load Balancer HTTP) (optional)

If you're using a load balancer or proxy and HTTP is used between the proxy and the application, you might want to set server.forward-headers-strategy to native as well (SERVER_FORWARDHEADERSSTRATEGY=native); for more info refer to this issue.
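
For example, combined with the SSO-only variables from step 3 (a sketch reusing the same placeholder credentials):

docker run -p 8080:8080 -e AUTH_TYPE=OAUTH2 \
  -e SERVER_FORWARDHEADERSSTRATEGY=native \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_CLIENTID=uhvaPKIHU4ZF8Ne4B6PGvF0hWW6OcUSB \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_CLIENTSECRET=YXfRjmodifiedTujnkVr7zuW9ECCAK4TcnCio-i \
  -e SPRING_SECURITY_OAUTH2_CLIENT_PROVIDER_AUTH0_ISSUER_URI=https://dev-a63ggcut.auth0.com/ \
  -e SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_AUTH0_SCOPE=openid \
ghcr.io/kafbat/kafka-ui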

Step 5 (Azure) (optional)

For Azure AD (Office 365) OAuth2, you'll want to add the following additional environment variables:

docker run -p 8080:8080 \
        -e KAFKA_CLUSTERS_0_NAME="${cluster_name}" \
        -e KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS="${kafka_listeners}" \
        -e KAFKA_CLUSTERS_0_ZOOKEEPER="${zookeeper_servers}" \
        -e KAFKA_CLUSTERS_0_KAFKACONNECT_0_ADDRESS="${kafka_connect_servers}" \
        -e AUTH_TYPE=OAUTH2 \
        -e AUTH_OAUTH2_CLIENT_AZURE_CLIENTID="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" \
        -e AUTH_OAUTH2_CLIENT_AZURE_CLIENTSECRET="somesecret" \
        -e AUTH_OAUTH2_CLIENT_AZURE_SCOPE="openid" \
        -e AUTH_OAUTH2_CLIENT_AZURE_CLIENTNAME="azure" \
        -e AUTH_OAUTH2_CLIENT_AZURE_PROVIDER="azure" \
        -e AUTH_OAUTH2_CLIENT_AZURE_ISSUERURI="https://login.microsoftonline.com/{tenant_id}/v2.0" \
        -e AUTH_OAUTH2_CLIENT_AZURE_JWKSETURI="https://login.microsoftonline.com/{tenant_id}/discovery/v2.0/keys" \
        -d ghcr.io/kafbat/kafka-ui

Note that the scope is created by default when the application registration is done in the Azure portal. You'll need to update the application registration manifest to include "accessTokenAcceptedVersion": 2.
