Hello everyone,
I'm submitting my problem to you, and I hope you can help me.
I created a Kubernetes cluster on EKS through Rancher; the Rancher server itself runs (and works) on my local Kubernetes cluster.
My local network is the classic 192.168.1.x range behind a Vodafone Station.
After deploying the EKS cluster through Rancher, my problem is that the Rancher agent deployed on EKS does not connect back to the Rancher server.
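In case it helps, the agent's pods and logs can be inspected on the EKS side with something like the commands below (I'm assuming the standard cattle-cluster-agent deployment in the cattle-system namespace, which is what Rancher creates by default):

# check the Rancher agent pods on the EKS cluster and read their logs
kubectl -n cattle-system get pods
kubectl -n cattle-system logs deployment/cattle-cluster-agent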
My local Rancher server's service looks like this:
rancher   ClusterIP   10.100.174.213   192.168.1.210   80/TCP,443/TCP   32d
so its cluster IP is 10.100.174.213 and, as you can see, its external IP is 192.168.1.210.
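If I understand correctly, the address the agent tries to reach comes from Rancher's server-url setting, which can be checked on the local cluster where Rancher runs (server-url is the default setting name, as far as I know):

# show the URL that Rancher hands out to downstream cluster agents
kubectl get settings.management.cattle.io server-url -o jsonpath='{.value}'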
The problem is that the agent running on EKS has 10.100.174.213 as its destination IP, and I don't know how to tell the agent that it must use the IP 192.168.1.210 on port 443 instead, which I think is the only way to make the agent on EKS communicate with my local Rancher, because the cluster IP is not reachable from outside my local cluster.
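The only idea I have is to patch the CATTLE_SERVER environment variable on the agent deployment on EKS, roughly as sketched below, but I don't know if that is the right approach or whether Rancher would simply overwrite it again (cattle-cluster-agent and cattle-system are the default names, and the URL is just my guess):

# point the EKS agent at the external IP of my local Rancher server
kubectl -n cattle-system set env deployment/cattle-cluster-agent \
  CATTLE_SERVER=https://192.168.1.210:443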
Thanks a lot.