Access Kubernetes service from vm in same GCE project


It's not entirely clear what you mean by "make the service accessible" and "without exposing the service publicly". Do you mean that you want to reach your service from a GCE VM outside the GKE cluster, but not from outside GCP?

If that's what you mean, take a look at the `externalIPs` field in the Service spec: http://kubernetes.io/docs/user-guide/services/#external-ips. You can assign an external IP address to your Service and then install the appropriate firewall rules.
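As an illustrative sketch (the service name, labels, ports, and IP below are placeholders I've chosen, not values from the question), a Service using that field might look like:

```yaml
# Hypothetical example: route traffic sent to a project-internal IP
# to this Service, without creating a public cloud load balancer.
apiVersion: v1
kind: Service
metadata:
  name: my-service          # placeholder name
spec:
  selector:
    app: my-app             # placeholder pod label
  ports:
    - protocol: TCP
      port: 80              # port the VM clients connect to
      targetPort: 8080      # port the pods listen on
  externalIPs:
    - 10.240.0.100          # placeholder IP reachable from your VMs
```

Traffic arriving at a node on `10.240.0.100:80` is then forwarded to the Service's pods; you still need GCE firewall rules permitting that traffic from your VMs to the nodes.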


Maybe it's too late, but I am also facing the same issue and I have a partial solution.

I didn't want my service to be exposed publicly, so no Ingress or NodePort Service for me.

In GKE, the Kubernetes Service virtual IP range is only accessible from within the cluster by default.
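To see what that range is for your cluster, you can ask GKE directly; a sketch assuming an authenticated gcloud SDK, where the cluster name and zone are placeholders:

```shell
# Print the Service (cluster IP) CIDR of a GKE cluster.
# "my-cluster" and "us-central1-a" are placeholders; substitute your own.
gcloud container clusters describe my-cluster \
  --zone us-central1-a \
  --format='value(servicesIpv4Cidr)'
```

The printed CIDR is the `--destination-range` you would use in the route command below.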

But you can make it reachable from outside the cluster by creating a GCE route that forwards the Service virtual IP range to any one of the Kubernetes nodes, like below:

gcloud compute routes create k8s-service-route --destination-range 10.x.x.x/20 --next-hop-instance kube-node-0

(The route name `k8s-service-route` is arbitrary; `gcloud compute routes create` requires a name as its first argument. Depending on your gcloud configuration you may also need `--next-hop-instance-zone`.)

This is obviously a hack, because it will fail whenever the node that all Service traffic is routed through goes down for any reason (cluster upgrade, kube version upgrade, etc.).

The real solution, it seems to me, is to have an external node that auto-discovers the Kubernetes nodes and load-balances the external Service traffic across them, though I haven't gotten that to work yet.

I found this talk extremely useful for understanding Kubernetes and GKE networking.