Not able to deploy Keycloak into Kubernetes

I’m deploying a Keycloak service into a Kubernetes cluster, but I’m not able to open the Keycloak login page that the cluster exposes.

To do that I’m using Docker Desktop and creating a local cluster with minikube. Once the cluster was up, I deployed a PostgreSQL service into it using Helm. Finally, I applied the deployment and service files to Kubernetes. After a while everything was created successfully (as I could see in the Kubernetes Dashboard). Then I ran the command “minikube service keycloak --url” to open the service in my browser. However, the page keeps loading for a while and then shows an error saying that the URL is not reachable.

I don’t know if the problem is in one of the .yml files, or whether I have to create a proxy or something similar to access the internal network that minikube creates. I hope someone knows what is wrong, because I’ve tried a lot of things and none of them has worked.

These are my deployment and service files:

__

apiVersion: apps/v1
kind: Deployment
metadata:
  name: keycloak
  labels:
    name: keycloak
    app: keycloak
spec:
  replicas: 1
  selector:
    matchLabels:
      app: keycloak
  template:
    metadata:
      name: keycloak
      labels:
        app: keycloak
        name: keycloak
    spec:
      restartPolicy: Always
      containers:
        - name: keycloak
          image: jboss/keycloak:11.0.2
          imagePullPolicy: IfNotPresent
          ports:
            - containerPort: 8080
              protocol: TCP
          resources:
            requests:
              cpu: 200m
              memory: 256Mi
            limits:
              cpu: 400m
              memory: 512Mi
          env:
            - name: DB_VENDOR
              value: "postgres"
            - name: KEYCLOAK_LOGLEVEL
              value: "DEBUG"
            - name: PROXY_ADDRESS_FORWARDING
              value: "true"
            - name: KEYCLOAK_USER
              value: "admin"
            - name: KEYCLOAK_PASSWORD
              value: "password"
            - name: DB_USER
              value: "admin"
            - name: DB_PASSWORD
              value: "password"
            - name: DB_ADDR
              value: "keycloak-db-postgresql"
            - name: DB_PORT
              value: "5432"
            - name: DB_DATABASE
              value: "keycloak-db"

__

apiVersion: v1
kind: Service
metadata:
  name: keycloak
  labels:
    app: keycloak
    name: keycloak
spec:
  type: NodePort
  ports:
    - name: http
      protocol: TCP
      port: 8080
      nodePort: 30080
  selector:
    app: keycloak
    name: keycloak

I ran into a similar problem, and it turned out to be caused by the following:
Modern web browsers block mixed http/https content in an HTML page, and unfortunately the admin console page (served over https) tries to load “http://xxx.xxx.xxx/auth/js/keycloak.js” over plain http. This happens when Kubernetes Ingress TLS is used and the ingress does L7 proxying to Keycloak on port 80, rather than L4 proxying to Keycloak’s 8443 port.
I regard this as either a bug, or a sign that I missed some configuration to tell Keycloak that TLS is terminated externally.

I think the key is to make Keycloak aware that it sits behind a gateway that terminates the TLS connection, so that it generates https URLs instead of http ones, but how? I’m new to the source code and have no idea how to track the root cause further.
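
To make the setup concrete, the kind of Ingress I mean looks roughly like this sketch (host, secret and port are just placeholders for illustration); TLS stops at the ingress controller and Keycloak itself only ever sees plain HTTP:

apiVersion: networking.k8s.io/v1beta1
kind: Ingress
metadata:
  name: keycloak
spec:
  tls:
    - hosts:
        - keycloak.example.com          # placeholder hostname
      secretName: keycloak-tls          # placeholder TLS certificate secret
  rules:
    - host: keycloak.example.com
      http:
        paths:
          - path: /auth
            backend:
              serviceName: keycloak     # L7 proxy to the plain-HTTP service port
              servicePort: 80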

Hi @ivan,

can you try adding these two arguments to Keycloak on startup: -Dkeycloak.adminUrl=http://{your_node_base_url}/auth and -Dkeycloak.frontendUrl=http://{your_node_base_url}/auth?

I saw that the official Docker image has an environment variable, KEYCLOAK_FRONTEND_URL, to set keycloak.frontendUrl. For keycloak.adminUrl, I think you have to pass it as a Docker argument.
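
In a plain Deployment manifest that would look roughly like the snippet below. The URL is only a placeholder for whatever address your browser uses to reach the node, and passing -Dkeycloak.adminUrl through args assumes the image’s entrypoint forwards extra arguments to standalone.sh (which the jboss/keycloak image should do):

containers:
  - name: keycloak
    image: jboss/keycloak:11.0.2
    # placeholder URL - replace with the address you actually reach Keycloak on
    args: ["-Dkeycloak.adminUrl=http://<node-ip>:30080/auth"]
    env:
      - name: KEYCLOAK_FRONTEND_URL
        value: "http://<node-ip>:30080/auth"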

Hope it helps.


@ivan @zak hi, I solved this issue by adding PROXY_ADDRESS_FORWARDING to the Helm values.yaml. You may need to adapt the configuration to your own deployment.
This is related to running Keycloak in Docker behind a reverse proxy.

extraEnv: |
  - name: KEYCLOAK_USER
    value: *****
  - name: KEYCLOAK_PASSWORD
    value: *****
  - name: PROXY_ADDRESS_FORWARDING
    value: "true"


@davinwang, I tried the solution you suggested, adding the extraEnv variables to the values.yml file, but it still doesn’t work. I also tried several example deployment files for Keycloak that I found on the Internet.

Could it be related to the fact that I’m using Docker Desktop for Windows? If so, I will install Linux on my laptop and try again.