best-wall-17038
09/08/2022, 8:34 AM
kubectl get pods -n comnext
I0908 10:32:01.731836 12084 versioner.go:56] Remote kubernetes server unreachable
Unable to connect to the server: net/http: TLS handshake timeout
best-wall-17038
09/08/2022, 2:32 PM
kubectl get ns
I0908 16:33:31.033046 31140 versioner.go:56] Remote kubernetes server unreachable
wide-mechanic-33041
09/08/2022, 3:04 PM
kubectl --v=9 get ns?

best-wall-17038
09/08/2022, 3:05 PM
--v=9?

best-wall-17038
09/08/2022, 3:06 PM
kubectl --v=9 get ns
I0908 17:05:08.891214 32288 versioner.go:56] Remote kubernetes server unreachable
I0908 17:05:09.110461 32288 loader.go:372] Config loaded from file: /Users/uralsem/.kube/config
I0908 17:05:09.130193 32288 round_trippers.go:466] curl -v -XGET -H "Accept: application/json, */*" -H "User-Agent: kubectl/v1.23.3 (darwin/amd64) kubernetes/816c97a" 'https://127.0.0.1:6443/apis/metrics.k8s.io/v1beta1?timeout=32s'
I0908 17:05:09.136732 32288 round_trippers.go:510] HTTP Trace: Dial to tcp:127.0.0.1:6443 succeed
I0908 17:05:19.149190 32288 round_trippers.go:570] HTTP Statistics: DNSLookup 0 ms Dial 1 ms TLSHandshake 10007 ms Duration 10018 ms
I0908 17:05:19.149268 32288 round_trippers.go:577] Response Headers:
I0908 17:05:19.151381 32288 cached_discovery.go:78] skipped caching discovery info due to Get "https://127.0.0.1:6443/apis/metrics.k8s.io/v1beta1?timeout=32s": net/http: TLS handshake timeout
I0908 17:05:19.162021 32288 round_trippers.go:466] curl -v -XGET -H "Accept: application/json, */*" -H "User-Agent: kubectl/v1.23.3 (darwin/amd64) kubernetes/816c97a" 'https://127.0.0.1:6443/apis/metrics.k8s.io/v1beta1?timeout=32s'
I0908 17:05:19.164376 32288 round_trippers.go:510] HTTP Trace: Dial to tcp:127.0.0.1:6443 succeed
I0908 17:05:29.169252 32288 round_trippers.go:570] HTTP Statistics: DNSLookup 0 ms Dial 2 ms TLSHandshake 10004 ms Duration 10007 ms
I0908 17:05:29.169354 32288 round_trippers.go:577] Response Headers:
I0908 17:05:29.169516 32288 cached_discovery.go:78] skipped caching discovery info due to Get "https://127.0.0.1:6443/apis/metrics.k8s.io/v1beta1?timeout=32s": net/http: TLS handshake timeout
I0908 17:05:29.170125 32288 shortcut.go:89] Error loading discovery information: unable to retrieve the complete list of server APIs: metrics.k8s.io/v1beta1: Get "https://127.0.0.1:6443/apis/metrics.k8s.io/v1beta1?timeout=32s": net/http: TLS handshake timeout
I0908 17:05:29.187323 32288 round_trippers.go:466] curl -v -XGET -H "Accept: application/json, */*" -H "User-Agent: kubectl/v1.23.3 (darwin/amd64) kubernetes/816c97a" 'https://127.0.0.1:6443/apis/metrics.k8s.io/v1beta1?timeout=32s'
I0908 17:05:29.192199 32288 round_trippers.go:510] HTTP Trace: Dial to tcp:127.0.0.1:6443 succeed
I0908 17:05:39.193968 32288 round_trippers.go:570] HTTP Statistics: DNSLookup 0 ms Dial 1 ms TLSHandshake 10001 ms Duration 10006 ms
I0908 17:05:39.194086 32288 round_trippers.go:577] Response Headers:
I0908 17:05:39.194288 32288 cached_discovery.go:78] skipped caching discovery info due to Get "https://127.0.0.1:6443/apis/metrics.k8s.io/v1beta1?timeout=32s": net/http: TLS handshake timeout
I0908 17:05:39.206093 32288 round_trippers.go:466] curl -v -XGET -H "Accept: application/json;as=Table;v=v1;g=meta.k8s.io,application/json;as=Table;v=v1beta1;g=meta.k8s.io,application/json" -H "User-Agent: kubectl/v1.23.3 (darwin/amd64) kubernetes/816c97a" 'https://127.0.0.1:6443/api/v1/namespaces?limit=500'
I0908 17:05:39.209387 32288 round_trippers.go:510] HTTP Trace: Dial to tcp:127.0.0.1:6443 succeed
I0908 17:05:49.211215 32288 round_trippers.go:570] HTTP Statistics: DNSLookup 0 ms Dial 1 ms TLSHandshake 10001 ms Duration 10003 ms
I0908 17:05:49.211260 32288 round_trippers.go:577] Response Headers:
I0908 17:05:49.214209 32288 helpers.go:237] Connection error: Get https://127.0.0.1:6443/api/v1/namespaces?limit=500: net/http: TLS handshake timeout
F0908 17:05:49.214775 32288 helpers.go:118] Unable to connect to the server: net/http: TLS handshake timeout
goroutine 1 [running]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.stacks(0x1)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1038 +0x8a
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).output(0x3c83d20, 0x3, 0x0, 0xc0009a0070, 0x2, {0x31f383e, 0x10}, 0xc000188480, 0x0)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:987 +0x5fd
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).printDepth(0xc0000401e0, 0x41, 0x0, {0x0, 0x0}, 0x0, {0xc000436dd0, 0x1, 0x1})
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:735 +0x1ae
k8s.io/kubernetes/vendor/k8s.io/klog/v2.FatalDepth(...)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1518
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.fatal({0xc0000401e0, 0x41}, 0xc000436d20)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:96 +0xc5
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.checkErr({0x2bed1e0, 0xc0006d4ae0}, 0x2a788e8)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:180 +0x69a
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.CheckErr(...)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:118
k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/get.NewCmdGet.func2(0xc0003a1680, {0xc000519540, 0x1, 0x2})
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/get/get.go:181 +0xc8
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).execute(0xc0003a1680, {0xc000519520, 0x2, 0x2})
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:860 +0x5f8
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc0009b4500)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:974 +0x3bc
k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).Execute(...)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:902
k8s.io/kubernetes/vendor/k8s.io/component-base/cli.run(0xc0009b4500)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/cli/run.go:146 +0x325
k8s.io/kubernetes/vendor/k8s.io/component-base/cli.RunNoErrOutput(...)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/cli/run.go:84
main.main()
	_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubectl/kubectl.go:30 +0x1e
goroutine 6 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).flushDaemon(0x0)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1181 +0x6a
created by k8s.io/kubernetes/vendor/k8s.io/klog/v2.init.0
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:420 +0xfb
goroutine 10 [select]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0, {0x2bea760, 0xc000814000}, 0x1, 0xc000104360)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x13b
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0, 0x12a05f200, 0x0, 0x0, 0x0)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0x89
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(...)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Forever(0x0, 0x0)
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:81 +0x28
created by k8s.io/kubernetes/vendor/k8s.io/component-base/logs.InitLogs
	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/component-base/logs/logs.go:179 +0x8
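[editor's note] The verbose trace above shows the TCP dial to 127.0.0.1:6443 succeeding but the TLS handshake timing out after roughly 10 seconds. One way to reproduce that outside of kubectl is to probe the same endpoint directly; this is a hedged sketch, with the host and port taken from the trace:

```shell
# The trace shows Dial succeeds, so test the TLS layer itself with
# short timeouts; -k skips certificate verification for a bare probe.
curl -vk --connect-timeout 5 --max-time 10 https://127.0.0.1:6443/version

# Watch the handshake directly; a wedged apiserver typically hangs here
# before sending its certificate.
openssl s_client -connect 127.0.0.1:6443 </dev/null
```

If `openssl s_client` also stalls before printing a certificate chain, the problem is on the server side of the handshake, not in kubectl or the kubeconfig.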
wide-mechanic-33041
09/08/2022, 3:12 PM
rdctl shell
can you get any responses?

best-wall-17038
09/08/2022, 3:13 PM
I0908 17:11:11.920375 32634 versioner.go:56] Remote kubernetes server unreachable
Starting to serve on 127.0.0.1:8001
E0908 17:13:05.925598 32634 proxy_server.go:147] Error while proxying request: net/http: TLS handshake timeout
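[editor's note] The proxy starts fine because it only binds a local port; every forwarded request then hits the same TLS timeout upstream, which points at the apiserver inside the VM rather than at kubectl. A minimal check against the proxy port, assuming the proxy from the log above is still listening on 8001:

```shell
# A wedged apiserver means the proxy accepts the request but the
# upstream call times out, so curl exits nonzero with no body.
curl -s --max-time 15 http://127.0.0.1:8001/api/v1/namespaces \
  || echo "request via proxy failed: upstream apiserver unreachable"
```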
fast-garage-66093
09/08/2022, 4:56 PM
"$HOME/Library/Application Support/rancher-desktop/lima" directory and then do a Factory Reset to see if that solves the issue. That way you know if the problem is with your host or the VM, and then take it from there.
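[editor's note] The suggestion above can be sketched as the following steps on macOS. The `.bak` rename is just one way to preserve the old VM data rather than deleting it, and `rdctl factory-reset` is assumed to be available (it ships with recent Rancher Desktop releases; the reset is also reachable from the GUI):

```shell
# Quit Rancher Desktop first, then move the Lima VM data aside
# so it can be restored later if needed.
mv "$HOME/Library/Application Support/rancher-desktop/lima" \
   "$HOME/Library/Application Support/rancher-desktop/lima.bak"

# Then perform the Factory Reset from the CLI.
rdctl factory-reset
```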
fast-garage-66093
09/08/2022, 9:46 PM
DOCKER_HOST setting, and would only affect docker commands run internally by RD. You were showing errors from kubernetes...