Troubleshooting

How to address commonly encountered KEDA issues

How can I use KEDA in a proxy network?

While setting up KEDA, you may get an error for the v1beta1.external.metrics.k8s.io API service with status FailedDiscoveryCheck and a message such as: no response from https://ip:443: Get https://ip:443: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers).

One of the possible reasons for this is that you are behind a proxy network.

Check the status:

Find the API service name for the service keda/keda-metrics-apiserver:

kubectl get apiservice --all-namespaces

Check the status of the API service found in the previous step:

kubectl get apiservice <apiservicename> -o yaml

Example:

kubectl get apiservice v1beta1.external.metrics.k8s.io -o yaml

If the status is False, there is an issue, and a proxy network is likely the reason for it.
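
For reference, the status section of a failing API service typically reports an Available condition set to False, similar to the illustrative snippet below (the IP address in the message is a placeholder and will differ in your cluster):

status:
  conditions:
  - type: Available
    status: "False"
    reason: FailedDiscoveryCheck
    message: 'no response from https://<ip>:443: Get https://<ip>:443: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)'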

Solution for a self-managed Kubernetes cluster:

Find the cluster IP for the keda-metrics-apiserver and keda-operator-metrics:

kubectl get services --all-namespaces

In /etc/kubernetes/manifests/kube-apiserver.yaml, add the cluster IPs found in the previous step to the no_proxy variable.
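
As an illustrative sketch (not a complete manifest), the kube-apiserver container section could end up with an environment entry like the one below; the angle-bracket values are placeholders for the cluster IPs you found in the previous step, and any existing no_proxy entries should be kept:

spec:
  containers:
  - name: kube-apiserver
    env:
    - name: no_proxy
      value: "127.0.0.1,localhost,<keda-metrics-apiserver cluster IP>,<keda-operator-metrics cluster IP>"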

Reload systemd manager configuration:

sudo systemctl daemon-reload

Restart kubelet:

sudo systemctl restart kubelet

Check the API service status and the pods again; the API service should now report as available.
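
For example, assuming KEDA is installed in the keda namespace, you can verify with:

kubectl get apiservice v1beta1.external.metrics.k8s.io -o yaml

kubectl get pods -n keda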

Solution for managed Kubernetes services:

In managed Kubernetes services, you might be able to solve the issue by updating the firewall rules of your cluster.

For example, in a GKE private cluster, add port 6443 (kube-apiserver) to the allowed ports in the master node firewall rules.
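
As a rough sketch using the gcloud CLI (the rule name is arbitrary, and the network, master CIDR range, and node tag are placeholders for your cluster's values):

gcloud compute firewall-rules create allow-master-to-keda \
  --network <cluster-network> \
  --allow tcp:6443 \
  --source-ranges <master-ipv4-cidr> \
  --target-tags <node-tag>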

Why does Google Kubernetes Engine (GKE) 1.16 fail to fetch external metrics?

If you are running Google Kubernetes Engine (GKE) version 1.16, and are receiving the following error:

unable to fetch metrics from external metrics API: <METRIC>.external.metrics.k8s.io is forbidden: User "system:vpa-recommender" cannot list resource "<METRIC>" in API group "external.metrics.k8s.io" in the namespace "<NAMESPACE>": RBAC: clusterrole.rbac.authorization.k8s.io "external-metrics-reader" not found

You are almost certainly running into a known issue.

The workaround is to recreate the external-metrics-reader role using the following YAML:

apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: external-metrics-reader
rules:
- apiGroups:
  - "external.metrics.k8s.io"
  resources:
  - "*"
  verbs:
  - list
  - get
  - watch
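
For example, if you save the manifest above as external-metrics-reader.yaml (the filename is arbitrary), you can apply it with:

kubectl apply -f external-metrics-reader.yaml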

The GKE team is currently working on a fix that they expect to have out in version >= 1.16.13.