Kubernetes logs not found in default locations?


Docker logs only contain what is written to STDOUT (and STDERR) by your container's process with PID 1 (the container's entrypoint or cmd process).

If you want to see the logs via kubectl logs or docker logs, you should redirect your application logs to STDOUT instead of the file /tmp/spring.log. Here's an excellent example of how this can be achieved with minimal effort.
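For instance, one common trick (the same one the official nginx image uses for its own log files) is to symlink the log file to the container's standard output in the Dockerfile. A minimal sketch, assuming the application keeps writing to /tmp/spring.log:

# Dockerfile: forward the file log to the container's STDOUT so that
# `kubectl logs` / `docker logs` can see it
RUN ln -sf /dev/stdout /tmp/spring.log

For Spring Boot specifically, simply removing the logging.file / logging.file.name property is usually the cleaner option, since the application then logs to the console by default.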


Alternatively, you can use a hostPath volume mount. This way, the log file is written to a path on the host and you can read it directly there.
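A minimal sketch of such a Pod spec (the image name, mount path, and host path below are illustrative placeholders, not taken from the question):

apiVersion: v1
kind: Pod
metadata:
  name: spring-app
spec:
  containers:
  - name: app
    image: my-spring-app:latest      # placeholder image
    volumeMounts:
    - name: app-logs
      mountPath: /tmp                # directory the app writes its log file into
  volumes:
  - name: app-logs
    hostPath:
      path: /var/log/spring-app      # directory on the host node
      type: DirectoryOrCreate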

Warning when using a hostPath volumeMount

If the pod is rescheduled onto another host for any reason, your logs will not move along with it; a new log file will be created on the new host at the same path.


If you are searching for the actual location of the logs outside the containers (on the host nodes of the cluster), this depends on a couple of things. I assume you are using Docker to run your containers under Kubernetes, which is the most common setup.

On each node of your Kubernetes cluster, you can use the following command to check which logging driver is currently in use:

docker info | grep -i logging

The default value should be json-file, which means the containers' logs are written as JSON files to a location on the host node.
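On a node with the default configuration, the command above prints something along these lines (exact formatting varies with the Docker version):

Logging Driver: json-file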

If you find another driver, such as journald, then the Docker logging driver is sending logs directly to the systemd journal. There are many logging drivers, so as a first check you should make sure that all your Kubernetes nodes are configured to log as JSON files (or in whatever way you need to harvest them).
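If a node needs to be switched back to the json-file driver, that is normally done in Docker's daemon configuration file, /etc/docker/daemon.json, followed by a restart of the Docker daemon (e.g. sudo systemctl restart docker). A minimal sketch, where the rotation options are just illustrative:

{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}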


Once this is done, you can start checking where your containers are writing their logs. Choose a Pod to analyze, then:

Identify which Kubernetes node it is running on:

kubectl get pod pod-name -owide
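The NODE column in the output tells you where the Pod is scheduled; illustrative output (all names and values here are placeholders):

NAME       READY   STATUS    RESTARTS   AGE   IP            NODE     NOMINATED NODE   READINESS GATES
pod-name   1/1     Running   0          3h    10.244.1.17   node-2   <none>           <none>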

Grab the container ID with something like the following:

kubectl get pod pod-name -ojsonpath='{.status.containerStatuses[0].containerID}'

The ID should be something in the shape of docker://f834508490bd2b248a2bbc1efc4c395d0b8086aac4b6ff03b3cc8fd16d10ce2c

Remove the docker:// prefix, SSH into the Kubernetes node on which this container is running, then run:

docker inspect container-id | grep -i logpath

This should give you the log location for that particular container. You can tail the file to check whether the logs are really there.
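Equivalently, docker inspect supports a --format option, so on that node you can jump straight from the container ID (docker:// prefix removed) to tailing the file; a sketch, noting that reading files under /var/lib/docker usually requires root:

docker inspect --format '{{.LogPath}}' container-id
sudo tail -f "$(docker inspect --format '{{.LogPath}}' container-id)"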


In my case, the container I tried this procedure on was logging to:

/var/lib/docker/containers/289271086d977dc4e2e0b80cc28a7a6aca32c888b7ea5e1b5f24b28f7601ff63/289271086d977dc4e2e0b80cc28a7a6aca32c888b7ea5e1b5f24b28f7601ff63-json.log