Why can't a pod connect to another network? (in newer versions of Kubernetes)
Solution 1
I reported this issue to Google here: https://issuetracker.google.com/issues/111986281
They replied that this is expected behavior starting with Kubernetes 1.9:
Beginning with Kubernetes version 1.9.x, automatic firewall rules have changed such that workloads in your Kubernetes Engine cluster cannot communicate with other Compute Engine VMs that are on the same network, but outside the cluster. This change was made for security reasons.
The solution is described at the following link: https://cloud.google.com/kubernetes-engine/docs/troubleshooting#autofirewall
Basically:
First, find your cluster's network:
gcloud container clusters describe [CLUSTER_NAME] --format='get(network)'
Then get the cluster's IPv4 CIDR used for the containers:
gcloud container clusters describe [CLUSTER_NAME] --format='get(clusterIpv4Cidr)'
Finally create a firewall rule for the network, with the CIDR as the source range, and allow all protocols:
gcloud compute firewall-rules create "[CLUSTER_NAME]-to-all-vms-on-network" --network="[NETWORK]" --source-ranges="[CLUSTER_IPV4_CIDR]" --allow=tcp,udp,icmp,esp,ah,sctp
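The three steps above can be combined into one script. This is a minimal sketch: the cluster name, network, and CIDR are placeholder example values (in practice the latter two would come from the two `describe` commands), and the script only prints the final command so it can be reviewed before running:

```shell
#!/bin/sh
# Sketch: assemble the firewall rule from the cluster's network and pod CIDR.
# CLUSTER_NAME is a placeholder; NETWORK and CLUSTER_IPV4_CIDR are hard-coded
# example values standing in for the output of the two `gcloud container
# clusters describe` calls shown above.
CLUSTER_NAME="my-cluster"
NETWORK="default"                 # from: --format='get(network)'
CLUSTER_IPV4_CIDR="10.52.0.0/14"  # from: --format='get(clusterIpv4Cidr)'

RULE_NAME="${CLUSTER_NAME}-to-all-vms-on-network"

# Print the command instead of executing it, so it can be reviewed first;
# drop the leading `echo` to actually create the rule.
echo gcloud compute firewall-rules create "$RULE_NAME" \
  --network="$NETWORK" \
  --source-ranges="$CLUSTER_IPV4_CIDR" \
  --allow=tcp,udp,icmp,esp,ah,sctp
```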
Solution 2
Since you have two different database servers in GCP, they might have different configurations. Are you using Cloud SQL or database servers installed on GCE VMs?

For Cloud SQL, make sure the external IP addresses of your cluster nodes are whitelisted on the authorized networks of the Cloud SQL instance.

If running your database on GCE VMs, I'd recommend checking the firewall rules to make sure they allow incoming connections to the server on the right port and protocol. You might also verify the binding address of your database process to see if it accepts incoming connections from external IP addresses. (This can be done by running "sudo netstat -plnt" to see processes and their binding addresses). This link may help.
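To illustrate the binding-address check: `sudo netstat -plnt` prints one line per listening socket, and a local address of 127.0.0.1 means the process only accepts loopback connections. A small sketch (the sample netstat line, PID, and mongod port are invented examples) that flags loopback-only listeners:

```shell
#!/bin/sh
# Hypothetical line as `sudo netstat -plnt` might print it for a MongoDB
# server bound only to loopback (PID and port are made-up examples).
SAMPLE="tcp 0 0 127.0.0.1:27017 0.0.0.0:* LISTEN 1234/mongod"

# Column 4 is the local (binding) address; flag it if it is 127.0.0.1,
# since such a listener cannot accept connections from other hosts.
RESULT=$(echo "$SAMPLE" | awk '$4 ~ /^127\.0\.0\.1:/ { print "loopback-only: " $4 }')
echo "$RESULT"
```

A server that should accept external connections would instead show 0.0.0.0:27017 (or a specific non-loopback address) in that column.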
kurkop
Updated on September 18, 2022
Comments
-
kurkop over 1 year
I have two projects in GCP:
- With Kubernetes nodes v1.8.8-gke.0 and a database outside of Kubernetes but on the default network. All pods can connect to this server on all ports.
- With Kubernetes nodes v1.9.7-gke.3 and a database outside of Kubernetes but on the default network. No pod can connect to this server; a traceroute test fails.
Why can't this pod connect? Any ideas?
Thanks.
-
ALex_hha almost 6 years: Is the DB the same in both cases?
-
kurkop almost 6 years: @ALex_hha No, there is a separate DB for each project in GCP.
-
kurkop almost 6 years: I use Mongo 3.4 with Docker. I can connect from other servers, but it fails from the Kubernetes Docker network (inside Docker).
-
mehdi sharifi almost 6 years: Do both databases and the Kubernetes Docker deployments (for versions v1.8.8-gke.0 and v1.9.7-gke.3) have the same configuration? If yes, you can submit an issue here under 'Compute' as a Kubernetes Engine issue.