Result   | FAILURE
Tests    | 48 failed / 823 succeeded
Started  |
Elapsed  | 42m3s
Revision | master
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\sExternal\sStorage\s\[Driver\:\sebs\.csi\.aws\.com\]\s\[Testpattern\:\sDynamic\sPV\s\(default\sfs\)\]\svolumes\sshould\sstore\sdata$'
test/e2e/framework/pod/delete.go:47

k8s.io/kubernetes/test/e2e/framework/pod.DeletePodOrFail({0x7ca2818, 0xc00262d800}, {0xc002dca940, 0xb}, {0xc002dda3d8, 0x11})
    test/e2e/framework/pod/delete.go:47 +0x270
k8s.io/kubernetes/test/e2e/framework/volume.runVolumeTesterPod({0x7ca2818, 0xc00262d800}, 0xc0013f9b80, {{0xc002c84c30, 0xb}, {0x7367383, 0x8}, {0x0, 0x0}, {0x0, ...}, ...}, ...)
    test/e2e/framework/volume/fixtures.go:523 +0x813
k8s.io/kubernetes/test/e2e/framework/volume.InjectContent(0xc001415ce0, {{0xc002c84c30, 0xb}, {0x7367383, 0x8}, {0x0, 0x0}, {0x0, 0x0, 0x0}, ...}, ...)
    test/e2e/framework/volume/fixtures.go:626 +0x193
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumesTestSuite).DefineTests.func3()
    test/e2e/storage/testsuites/volumes.go:188 +0x457

from junit_01.xml
{"msg":"FAILED External Storage [Driver: ebs.csi.aws.com] [Testpattern: Dynamic PV (default fs)] volumes should store data","completed":1,"skipped":6,"failed":1,"failures":["External Storage [Driver: ebs.csi.aws.com] [Testpattern: Dynamic PV (default fs)] volumes should store data"]}

[BeforeEach] [Testpattern: Dynamic PV (default fs)] volumes
    test/e2e/storage/framework/testsuite.go:51
[BeforeEach] [Testpattern: Dynamic PV (default fs)] volumes
    test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:41.055
Jan 14 23:10:41.056: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename volume 01/14/23 23:10:41.056
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:41.485
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:41.769
[It] should store data
    test/e2e/storage/testsuites/volumes.go:161
Jan 14 23:10:42.054: INFO: Creating resource for dynamic PV
Jan 14 23:10:42.054: INFO: Using claimSize:1Gi, test suite supported size:{ 1Mi}, driver(ebs.csi.aws.com) supported size:{ 1Mi}
STEP: creating a StorageClass volume-2647-e2e-scg4w9v 01/14/23 23:10:42.054
STEP: creating a claim 01/14/23 23:10:42.2
Jan 14 23:10:42.200: INFO: Warning: Making PVC: VolumeMode specified as invalid empty string, treating as nil
STEP: starting external-injector 01/14/23 23:10:42.486
Jan 14 23:10:42.634: INFO: Waiting up to 5m0s for pod "external-injector" in namespace "volume-2647" to be "running"
Jan 14 23:10:42.783: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 149.593537ms
Jan 14 23:10:44.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 2.293131847s
Jan 14 23:10:46.926: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 4.292510177s
Jan 14 23:10:48.929: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 6.29475593s
Jan 14 23:10:50.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 8.292959728s
Jan 14 23:10:52.930: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 10.296601295s
Jan 14 23:10:54.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 12.293303206s
Jan 14 23:10:56.926: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 14.292538638s
Jan 14 23:10:58.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 16.293641544s
Jan 14 23:11:00.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 18.29361187s
Jan 14 23:11:02.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 20.293629089s
Jan 14 23:11:04.930: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 22.296311894s
Jan 14 23:11:06.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 24.293621998s
Jan 14 23:11:08.936: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 26.301853531s
Jan 14 23:11:10.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 28.292951038s
Jan 14 23:11:12.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 30.293484199s
Jan 14 23:11:14.928: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 32.294305045s
Jan 14 23:11:16.928: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 34.294375662s
Jan 14 23:11:18.928: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 36.293833714s
Jan 14 23:11:20.929: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 38.294905344s
Jan 14 23:11:22.931: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 40.297332375s
Jan 14 23:11:24.928: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 42.293848201s
Jan 14 23:11:26.928: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 44.293880587s
Jan 14 23:11:28.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 46.293081327s
Jan 14 23:11:30.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 48.293454343s
Jan 14 23:11:32.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 50.292987111s
Jan 14 23:11:34.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 52.292763483s
Jan 14 23:11:36.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 54.293724612s
Jan 14 23:11:38.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 56.293730267s
Jan 14 23:11:40.928: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 58.294222693s
Jan 14 23:11:42.927: INFO: Pod "external-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 1m0.29348538s
Jan 14 23:11:44.938: INFO: Encountered non-retryable error while getting pod volume-2647/external-injector: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/pods/external-injector": dial tcp 52.67.139.60:443: connect: connection refused
STEP: Deleting pod external-injector in namespace volume-2647 01/14/23 23:11:44.938
Jan 14 23:11:45.089: INFO: Unexpected error occurred: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/pods/external-injector": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:11:45.089: FAIL: failed to delete pod external-injector in namespace volume-2647
Unexpected error:
    <*url.Error | 0xc002ef3560>: {
        Op: "Delete",
        URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/pods/external-injector",
        Err: <*net.OpError | 0xc002cedb80>{
            Op: "dial",
            Net: "tcp",
            Source: nil,
            Addr: <*net.TCPAddr | 0xc00392a2a0>{IP: "4C\x8b<", Port: 443, Zone: ""},
            Err: <*os.SyscallError | 0xc002e92f40>{
                Syscall: "connect",
                Err: <syscall.Errno>0x6f,
            },
        },
    }
Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/pods/external-injector": dial tcp 52.67.139.60:443: connect: connection refused
occurred
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework/pod.DeletePodOrFail({0x7ca2818, 0xc00262d800}, {0xc002dca940, 0xb}, {0xc002dda3d8, 0x11})
    test/e2e/framework/pod/delete.go:47 +0x270
k8s.io/kubernetes/test/e2e/framework/volume.runVolumeTesterPod({0x7ca2818, 0xc00262d800}, 0xc0013f9b80, {{0xc002c84c30, 0xb}, {0x7367383, 0x8}, {0x0, 0x0}, {0x0, ...}, ...}, ...)
    test/e2e/framework/volume/fixtures.go:523 +0x813
k8s.io/kubernetes/test/e2e/framework/volume.InjectContent(0xc001415ce0, {{0xc002c84c30, 0xb}, {0x7367383, 0x8}, {0x0, 0x0}, {0x0, 0x0, 0x0}, ...}, ...)
    test/e2e/framework/volume/fixtures.go:626 +0x193
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumesTestSuite).DefineTests.func3()
    test/e2e/storage/testsuites/volumes.go:188 +0x457
STEP: cleaning the environment after external 01/14/23 23:11:45.09
STEP: Deleting pvc 01/14/23 23:11:45.09
Jan 14 23:11:45.242: INFO: Deleting PersistentVolumeClaim "ebs.csi.aws.comzj42n"
STEP: Deleting sc 01/14/23 23:11:45.4
Jan 14 23:11:45.554: INFO: Unexpected error: while cleaning up resource:
    <errors.aggregate | len:1, cap:1>: [
        <errors.aggregate | len:3, cap:4>[
            <*fmt.wrapError | 0xc002fd2e40>{
                msg: "failed to find PVC ebs.csi.aws.comzj42n: Get \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/persistentvolumeclaims/ebs.csi.aws.comzj42n\": dial tcp 52.67.139.60:443: connect: connection refused",
                err: <*url.Error | 0xc00392a7b0>{
                    Op: "Get",
                    URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/persistentvolumeclaims/ebs.csi.aws.comzj42n",
                    Err: <*net.OpError | 0xc003910a00>{
                        Op: "dial",
                        Net: "tcp",
                        Source: nil,
                        Addr: <*net.TCPAddr | 0xc0038e2b40>{IP: "4C\x8b<", Port: 443, Zone: ""},
                        Err: <*os.SyscallError | 0xc002fd2e00>{
                            Syscall: "connect",
                            Err: <syscall.Errno>0x6f,
                        },
                    },
                },
            },
            <*fmt.wrapError | 0xc002e932e0>{
                msg: "failed to delete PVC ebs.csi.aws.comzj42n: PVC Delete API error: Delete \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/persistentvolumeclaims/ebs.csi.aws.comzj42n\": dial tcp 52.67.139.60:443: connect: connection refused",
                err: <*errors.errorString | 0xc000f40690>{
                    s: "PVC Delete API error: Delete \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/persistentvolumeclaims/ebs.csi.aws.comzj42n\": dial tcp 52.67.139.60:443: connect: connection refused",
                },
            },
            <*fmt.wrapError | 0xc0025f9460>{
                msg: "failed to delete StorageClass volume-2647-e2e-scg4w9v: Delete \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/storage.k8s.io/v1/storageclasses/volume-2647-e2e-scg4w9v\": dial tcp 52.67.139.60:443: connect: connection refused",
                err: <*url.Error | 0xc0038e3800>{
                    Op: "Delete",
                    URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/storage.k8s.io/v1/storageclasses/volume-2647-e2e-scg4w9v",
                    Err: <*net.OpError | 0xc0038de910>{
                        Op: "dial",
                        Net: "tcp",
                        Source: nil,
                        Addr: <*net.TCPAddr | 0xc0039ac750>{IP: "4C\x8b<", Port: 443, Zone: ""},
                        Err: <*os.SyscallError | 0xc0025f9420>{
                            Syscall: "connect",
                            Err: <syscall.Errno>0x6f,
                        },
                    },
                },
            },
        ],
    ]
Jan 14 23:11:45.554: FAIL: while cleaning up resource: [failed to find PVC ebs.csi.aws.comzj42n: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/persistentvolumeclaims/ebs.csi.aws.comzj42n": dial tcp 52.67.139.60:443: connect: connection refused, failed to delete PVC ebs.csi.aws.comzj42n: PVC Delete API error: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/persistentvolumeclaims/ebs.csi.aws.comzj42n": dial tcp 52.67.139.60:443: connect: connection refused, failed to delete StorageClass volume-2647-e2e-scg4w9v: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/storage.k8s.io/v1/storageclasses/volume-2647-e2e-scg4w9v": dial tcp 52.67.139.60:443: connect: connection refused]
Full Stack Trace
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumesTestSuite).DefineTests.func2()
    test/e2e/storage/testsuites/volumes.go:157 +0x22e
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumesTestSuite).DefineTests.func3.1()
    test/e2e/storage/testsuites/volumes.go:165 +0x165
panic({0x6ea2520, 0xc003986480})
    /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
    test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc00056e700})
    /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc00317ea80, 0x373}, {0xc002d4d150?, 0x735bfcc?, 0xc002d4d170?})
    test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc00317e700, 0x35e}, {0xc003166660?, 0xc00317e380?, 0x2568967?})
    test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/vendor/github.com/onsi/gomega/internal.(*Assertion).match(0xc0039862c0, {0x7c51de8, 0xa9a6360}, 0x0, {0xc002ef35c0, 0x3, 0x3})
    vendor/github.com/onsi/gomega/internal/assertion.go:105 +0x1f0
k8s.io/kubernetes/vendor/github.com/onsi/gomega/internal.(*Assertion).NotTo(0xc0039862c0, {0x7c51de8, 0xa9a6360}, {0xc002ef35c0, 0x3, 0x3})
    vendor/github.com/onsi/gomega/internal/assertion.go:73 +0xb2
k8s.io/kubernetes/test/e2e/framework/pod.expectNoErrorWithOffset(0x1?, {0x7c34da0?, 0xc002ef3560?}, {0xc002ef35c0, 0x3, 0x3})
    test/e2e/framework/pod/resource.go:64 +0xf3
k8s.io/kubernetes/test/e2e/framework/pod.expectNoError(...)
    test/e2e/framework/pod/resource.go:54
k8s.io/kubernetes/test/e2e/framework/pod.DeletePodOrFail({0x7ca2818, 0xc00262d800}, {0xc002dca940, 0xb}, {0xc002dda3d8, 0x11})
    test/e2e/framework/pod/delete.go:47 +0x270
k8s.io/kubernetes/test/e2e/framework/volume.runVolumeTesterPod({0x7ca2818, 0xc00262d800}, 0xc0013f9b80, {{0xc002c84c30, 0xb}, {0x7367383, 0x8}, {0x0, 0x0}, {0x0, ...}, ...}, ...)
    test/e2e/framework/volume/fixtures.go:523 +0x813
k8s.io/kubernetes/test/e2e/framework/volume.InjectContent(0xc001415ce0, {{0xc002c84c30, 0xb}, {0x7367383, 0x8}, {0x0, 0x0}, {0x0, 0x0, 0x0}, ...}, ...)
    test/e2e/framework/volume/fixtures.go:626 +0x193
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumesTestSuite).DefineTests.func3()
    test/e2e/storage/testsuites/volumes.go:188 +0x457
[AfterEach] [Testpattern: Dynamic PV (default fs)] volumes
    test/e2e/framework/framework.go:187
STEP: Collecting events from namespace "volume-2647". 01/14/23 23:11:45.555
Jan 14 23:11:45.709: INFO: Unexpected error: failed to list events in namespace "volume-2647":
    <*url.Error | 0xc003a441e0>: {
        Op: "Get",
        URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/events",
        Err: <*net.OpError | 0xc0038dec80>{
            Op: "dial",
            Net: "tcp",
            Source: nil,
            Addr: <*net.TCPAddr | 0xc00392b380>{IP: "4C\x8b<", Port: 443, Zone: ""},
            Err: <*os.SyscallError | 0xc0025f99e0>{
                Syscall: "connect",
                Err: <syscall.Errno>0x6f,
            },
        },
    }
Jan 14 23:11:45.709: FAIL: failed to list events in namespace "volume-2647": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc002d51590, {0xc002c84c30, 0xb})
    test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc00262d800}, {0xc002c84c30, 0xb})
    test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc001415ce0, 0x2?)
    test/e2e/framework/framework.go:181 +0x8b
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc001415ce0)
    test/e2e/framework/framework.go:435 +0x21d
STEP: Destroying namespace "volume-2647" for this suite. 01/14/23 23:11:45.709
Jan 14 23:11:45.863: FAIL: Couldn't delete ns: "volume-2647": Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647": dial tcp 52.67.139.60:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-2647", Err:(*net.OpError)(0xc0038df130)})
Full Stack Trace
panic({0x6ea2520, 0xc0038b9f00})
    /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
    test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc0000249a0})
    /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc0038faf00, 0xf9}, {0xc002d51048?, 0x735bfcc?, 0xc002d51068?})
    test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc00204c2d0, 0xe4}, {0xc002d510e0?, 0xc003a42480?, 0xc002d51108?})
    test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c34da0, 0xc003a441e0}, {0xc0025f9a20?, 0x0?, 0x0?})
    test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
    test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc002d51590, {0xc002c84c30, 0xb})
    test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc00262d800}, {0xc002c84c30, 0xb})
    test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc001415ce0, 0x2?)
    test/e2e/framework/framework.go:181 +0x8b
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc001415ce0)
    test/e2e/framework/framework.go:435 +0x21d
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\sExternal\sStorage\s\[Driver\:\sebs\.csi\.aws\.com\]\s\[Testpattern\:\sGeneric\sEphemeral\-volume\s\(default\sfs\)\s\(immediate\-binding\)\]\sephemeral\sshould\screate\sread\/write\sinline\sephemeral\svolume$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605

from junit_01.xml
{"msg":"FAILED External Storage [Driver: ebs.csi.aws.com] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral should create read/write inline ephemeral volume","completed":0,"skipped":8,"failed":1,"failures":["External Storage [Driver: ebs.csi.aws.com] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral should create read/write inline ephemeral volume"]}

[BeforeEach] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral
    test/e2e/storage/framework/testsuite.go:51
[BeforeEach] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral
    test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:07.525
Jan 14 23:10:07.525: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename ephemeral 01/14/23 23:10:07.527
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:07.955
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:08.237
[It] should create read/write inline ephemeral volume
    test/e2e/storage/testsuites/ephemeral.go:200
Jan 14 23:10:08.838: INFO: Pod inline-volume-2l5g8 has the following logs:
Jan 14 23:10:09.000: INFO: Deleting pod "inline-volume-2l5g8" in namespace "ephemeral-8505"
Jan 14 23:10:09.155: INFO: Wait up to 5m0s for pod "inline-volume-2l5g8" to be fully deleted
Jan 14 23:10:11.448: INFO: Creating resource for dynamic PV
Jan 14 23:10:11.448: INFO: Using claimSize:1Gi, test suite supported size:{ }, driver(ebs.csi.aws.com) supported size:{ }
STEP: creating a StorageClass ephemeral-8505-e2e-sckrld5 01/14/23 23:10:11.448
STEP: checking the requested inline volume exists in the pod running on node {Name: Selector:map[] Affinity:nil} 01/14/23 23:10:11.592
Jan 14 23:10:11.736: INFO: Waiting up to 15m0s for pod "inline-volume-tester-zmmj8" in namespace "ephemeral-8505" to be "running"
Jan 14 23:10:11.878: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 141.886641ms
Jan 14 23:10:14.020: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 2.284077844s
Jan 14 23:10:16.025: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 4.288864891s
Jan 14 23:10:18.021: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 6.284775182s
Jan 14 23:10:20.023: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 8.286951223s
Jan 14 23:10:22.021: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 10.284752148s
Jan 14 23:10:24.022: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 12.285671272s
Jan 14 23:10:26.020: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 14.284581113s
Jan 14 23:10:28.021: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 16.284924242s
Jan 14 23:10:30.021: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 18.28459598s
Jan 14 23:10:32.020: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 20.283958892s
Jan 14 23:10:34.020: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 22.284574559s
Jan 14 23:10:36.021: INFO: Pod "inline-volume-tester-zmmj8": Phase="Pending", Reason="", readiness=false. Elapsed: 24.285202638s
Jan 14 23:10:38.021: INFO: Pod "inline-volume-tester-zmmj8": Phase="Running", Reason="", readiness=true. Elapsed: 26.284593858s
Jan 14 23:10:38.021: INFO: Pod "inline-volume-tester-zmmj8" satisfied condition "running"
Jan 14 23:10:38.163: INFO: ExecWithOptions {Command:[/bin/sh -c mount | grep /mnt/test | grep rw,] Namespace:ephemeral-8505 PodName:inline-volume-tester-zmmj8 ContainerName:csi-volume-tester Stdin:<nil> CaptureStdout:true CaptureStderr:true PreserveWhitespace:false Quiet:false}
Jan 14 23:10:38.163: INFO: >>> kubeConfig: /root/.kube/config
Jan 14 23:10:38.164: INFO: ExecWithOptions: Clientset creation
Jan 14 23:10:38.164: INFO: ExecWithOptions: execute(POST https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-8505/pods/inline-volume-tester-zmmj8/exec?command=%2Fbin%2Fsh&command=-c&command=mount+%7C+grep+%2Fmnt%2Ftest+%7C+grep+rw%2C&container=csi-volume-tester&container=csi-volume-tester&stderr=true&stdout=true)
Jan 14 23:10:39.284: INFO: Pod inline-volume-tester-zmmj8 has the following logs:
Jan 14 23:10:39.568: INFO: Deleting pod "inline-volume-tester-zmmj8" in namespace "ephemeral-8505"
Jan 14 23:10:39.712: INFO: Wait up to 5m0s for pod "inline-volume-tester-zmmj8" to be fully deleted
Jan 14 23:11:11.997: INFO: Wait up to 5m0s for pod PV pvc-29e08e92-10ba-4674-8959-c052b19e6d49 to be fully deleted
Jan 14 23:11:11.997: INFO: Waiting up to 5m0s for PersistentVolume pvc-29e08e92-10ba-4674-8959-c052b19e6d49 to get deleted
Jan 14 23:11:12.140: INFO: PersistentVolume pvc-29e08e92-10ba-4674-8959-c052b19e6d49 found and phase=Released (142.775489ms)
Jan 14 23:11:17.284: INFO: PersistentVolume pvc-29e08e92-10ba-4674-8959-c052b19e6d49 found and phase=Released (5.286624551s)
Jan 14 23:11:22.429: INFO: PersistentVolume pvc-29e08e92-10ba-4674-8959-c052b19e6d49 found and phase=Released (10.432083864s)
Jan 14 23:11:27.572: INFO: PersistentVolume pvc-29e08e92-10ba-4674-8959-c052b19e6d49 was removed
STEP: Deleting sc 01/14/23 23:11:27.715
[AfterEach] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral
    test/e2e/framework/framework.go:187
Jan 14 23:11:27.879: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:11:28.174: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:30.319: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:32.318: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:34.318: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:36.318: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:38.317: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:40.318: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:42.318: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:44.340: INFO: Condition Ready of node i-05871d1e8f8f620dd is false instead of true. Reason: KubeletNotReady, message: node is shutting down
Jan 14 23:11:44.340: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:46.326: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
STEP: Destroying namespace "ephemeral-8505" for this suite. 01/14/23 23:11:46.326
STEP: Collecting events from namespace "ephemeral-8505". 01/14/23 23:11:46.478
Jan 14 23:11:46.631: INFO: Unexpected error: failed to list events in namespace "ephemeral-8505":
    <*url.Error | 0xc00259faa0>: {
        Op: "Get",
        URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-8505/events",
        Err: <*net.OpError | 0xc001ebcd20>{
            Op: "dial",
            Net: "tcp",
            Source: nil,
            Addr: <*net.TCPAddr | 0xc002193a70>{IP: "4C\x8b<", Port: 443, Zone: ""},
            Err: <*os.SyscallError | 0xc000e40c40>{
                Syscall: "connect",
                Err: <syscall.Errno>0x6f,
            },
        },
    }
Jan 14 23:11:46.631: FAIL: failed to list events in namespace "ephemeral-8505": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-8505/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc002c76278, {0xc0004b6070, 0xe})
    test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc00352a900}, {0xc0004b6070, 0xe})
    test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1()
    test/e2e/framework/framework.go:402 +0x81d
panic({0x6ea2520, 0xc002421040})
    /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
    test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc000655180})
    /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc00083e380, 0xd5}, {0xc002c775a8?, 0x735bfcc?, 0xc002c775d0?})
    test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc00352a900?}, {0xc002c77890?, 0x736a4c9?, 0x9?})
    test/e2e/framework/log.go:51 +0x12c
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc00152f340)
    test/e2e/framework/framework.go:483 +0xb8a
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\sExternal\sStorage\s\[Driver\:\sebs\.csi\.aws\.com\]\s\[Testpattern\:\sGeneric\sEphemeral\-volume\s\(default\sfs\)\s\(late\-binding\)\]\sephemeral\sshould\ssupport\stwo\spods\swhich\shave\sthe\ssame\svolume\sdefinition$'
test/e2e/storage/testsuites/ephemeral.go:299

k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func6.1(0xc001932400)
    test/e2e/storage/testsuites/ephemeral.go:299 +0x20a
k8s.io/kubernetes/test/e2e/storage/testsuites.EphemeralTest.TestEphemeral({{0x7ca2818, 0xc000c6fb00}, 0xc00166e900, {0xc000ed1e80, 0xe}, {0x0, 0x0}, 0xc0017480f0, {{0x0, 0x0}, ...}, ...})
    test/e2e/storage/testsuites/ephemeral.go:424 +0x709
k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func6()
    test/e2e/storage/testsuites/ephemeral.go:316 +0x205

from junit_01.xml
{"msg":"FAILED External Storage [Driver: ebs.csi.aws.com] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral should support two pods which have the same volume definition","completed":0,"skipped":24,"failed":1,"failures":["External Storage [Driver: ebs.csi.aws.com] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral should support two pods which have the same volume definition"]}

[BeforeEach] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral
    test/e2e/storage/framework/testsuite.go:51
[BeforeEach] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral
    test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:07.602
Jan 14 23:10:07.602: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename ephemeral 01/14/23 23:10:07.604
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:08.035
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:08.318
[It] should support two pods which have the same volume definition
    test/e2e/storage/testsuites/ephemeral.go:281
Jan 14 23:10:09.051: INFO: Pod inline-volume-xlblm has the following logs:
Jan 14 23:10:09.204: INFO: Deleting pod "inline-volume-xlblm" in namespace "ephemeral-7881"
Jan 14 23:10:09.384: INFO: Wait up to 5m0s for pod "inline-volume-xlblm" to be fully deleted
Jan 14 23:10:11.776: INFO: Creating resource for dynamic PV
Jan 14 23:10:11.776: INFO: Using claimSize:1Gi, test suite supported size:{ }, driver(ebs.csi.aws.com) supported size:{ }
STEP: creating a StorageClass ephemeral-7881-e2e-scpt5x7 01/14/23 23:10:11.776
STEP: checking the requested inline volume exists in the pod running on node {Name: Selector:map[] Affinity:nil} 01/14/23 23:10:11.92
Jan 14 23:10:12.066: INFO: Waiting up to 15m0s for pod "inline-volume-tester-vxrvt" in namespace "ephemeral-7881" to be "running"
Jan 14 23:10:12.209: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 142.912964ms
Jan 14 23:10:14.352: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 2.286450606s
Jan 14 23:10:16.352: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 4.286074496s
Jan 14 23:10:18.352: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 6.286419314s
Jan 14 23:10:20.352: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 8.286052559s
Jan 14 23:10:22.353: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 10.287011346s
Jan 14 23:10:24.353: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 12.286791044s
Jan 14 23:10:26.352: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 14.285971179s
Jan 14 23:10:28.355: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 16.289105319s
Jan 14 23:10:30.352: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 18.286405628s
Jan 14 23:10:32.351: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 20.285669526s
Jan 14 23:10:34.355: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 22.288831118s
Jan 14 23:10:36.352: INFO: Pod "inline-volume-tester-vxrvt": Phase="Pending", Reason="", readiness=false. Elapsed: 24.286389146s
Jan 14 23:10:38.352: INFO: Pod "inline-volume-tester-vxrvt": Phase="Running", Reason="", readiness=true. Elapsed: 26.28641228s
Jan 14 23:10:38.352: INFO: Pod "inline-volume-tester-vxrvt" satisfied condition "running"
Jan 14 23:10:38.649: INFO: Waiting up to 15m0s for pod "inline-volume-tester2-2k6sw" in namespace "ephemeral-7881" to be "running"
Jan 14 23:10:38.817: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 168.423262ms
Jan 14 23:10:40.962: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 2.313450574s
Jan 14 23:10:42.965: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 4.315588001s
Jan 14 23:10:44.966: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 6.317297746s
Jan 14 23:10:46.970: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 8.321037272s
Jan 14 23:10:48.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 10.312069082s
Jan 14 23:10:50.963: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 12.313595556s
Jan 14 23:10:52.965: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 14.315598573s
Jan 14 23:10:54.963: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 16.314070676s
Jan 14 23:10:56.964: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 18.314565724s
Jan 14 23:10:58.965: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 20.31647092s
Jan 14 23:11:00.962: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 22.312740247s
Jan 14 23:11:02.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false.
Elapsed: 24.312269185s Jan 14 23:11:04.962: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 26.312648407s Jan 14 23:11:06.962: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 28.312625918s Jan 14 23:11:08.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 30.312369586s Jan 14 23:11:10.964: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 32.314723988s Jan 14 23:11:12.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 34.312366675s Jan 14 23:11:14.965: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 36.316006229s Jan 14 23:11:16.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 38.312324038s Jan 14 23:11:18.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 40.312031976s Jan 14 23:11:20.964: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 42.314849395s Jan 14 23:11:22.962: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 44.313069729s Jan 14 23:11:24.964: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 46.315107367s Jan 14 23:11:26.963: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 48.313793247s Jan 14 23:11:28.991: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 50.342007029s Jan 14 23:11:30.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 52.31240894s Jan 14 23:11:32.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. 
Elapsed: 54.311827746s Jan 14 23:11:34.960: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 56.311350971s Jan 14 23:11:36.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 58.312527552s Jan 14 23:11:38.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 1m0.312015502s Jan 14 23:11:40.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 1m2.312040656s Jan 14 23:11:42.961: INFO: Pod "inline-volume-tester2-2k6sw": Phase="Pending", Reason="", readiness=false. Elapsed: 1m4.312515424s Jan 14 23:11:44.972: INFO: Encountered non-retryable error while getting pod ephemeral-7881/inline-volume-tester2-2k6sw: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881/pods/inline-volume-tester2-2k6sw": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:44.972: INFO: Unexpected error: waiting for second pod with inline volume: <*fmt.wrapError | 0xc0004b5480>: { msg: "error while waiting for pod ephemeral-7881/inline-volume-tester2-2k6sw to be running: Get \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881/pods/inline-volume-tester2-2k6sw\": dial tcp 52.67.139.60:443: connect: connection refused", err: <*url.Error | 0xc00207a2a0>{ Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881/pods/inline-volume-tester2-2k6sw", Err: <*net.OpError | 0xc001e08cd0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc001fd06c0>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc0004b5420>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }, } Jan 14 23:11:44.972: FAIL: waiting for second pod with inline volume: error while waiting for pod 
ephemeral-7881/inline-volume-tester2-2k6sw to be running: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881/pods/inline-volume-tester2-2k6sw": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func6.1(0xc001932400) test/e2e/storage/testsuites/ephemeral.go:299 +0x20a k8s.io/kubernetes/test/e2e/storage/testsuites.EphemeralTest.TestEphemeral({{0x7ca2818, 0xc000c6fb00}, 0xc00166e900, {0xc000ed1e80, 0xe}, {0x0, 0x0}, 0xc0017480f0, {{0x0, 0x0}, ...}, ...}) test/e2e/storage/testsuites/ephemeral.go:424 +0x709 k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func6() test/e2e/storage/testsuites/ephemeral.go:316 +0x205 Jan 14 23:11:45.124: INFO: Error getting logs for pod inline-volume-tester-vxrvt: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881/pods/inline-volume-tester-vxrvt/log": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:45.278: INFO: Unexpected error: list PVs: <*url.Error | 0xc001ddfbf0>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/persistentvolumes", Err: <*net.OpError | 0xc001da9130>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc00207adb0>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc0016334e0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Jan 14 23:11:45.278: FAIL: list PVs: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/persistentvolumes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/storage/testsuites.StopPodAndDependents({0x7ca2818, 0xc000c6fb00}, 0xc00166e900, 0xc001932400) test/e2e/storage/testsuites/provisioning.go:997 +0x3cd 
k8s.io/kubernetes/test/e2e/storage/testsuites.EphemeralTest.TestEphemeral.func1()
	test/e2e/storage/testsuites/ephemeral.go:414 +0x30
panic({0x6ea2520, 0xc000aa2580})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
	test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc0002c55e0})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc0033c0c00, 0x165}, {0xc00234d8c8?, 0x735bfcc?, 0xc00234d8e8?})
	test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc0033e4580, 0x150}, {0xc00234d960?, 0xc0033c5540?, 0xc00234d988?})
	test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c2aa60, 0xc0004b5480}, {0xc000d81ea0?, 0xc00193ad00?, 0x0?})
	test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
	test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func6.1(0xc001932400)
	test/e2e/storage/testsuites/ephemeral.go:299 +0x20a
k8s.io/kubernetes/test/e2e/storage/testsuites.EphemeralTest.TestEphemeral({{0x7ca2818, 0xc000c6fb00}, 0xc00166e900, {0xc000ed1e80, 0xe}, {0x0, 0x0}, 0xc0017480f0, {{0x0, 0x0}, ...}, ...})
	test/e2e/storage/testsuites/ephemeral.go:424 +0x709
k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func6()
	test/e2e/storage/testsuites/ephemeral.go:316 +0x205
STEP: Deleting sc 01/14/23 23:11:45.278
Jan 14 23:11:45.432: INFO: Unexpected error: while cleaning up: <errors.aggregate | len:1, cap:1>: [ <errors.aggregate | len:1, cap:1>[ <*fmt.wrapError | 0xc000dcac60>{ msg: "failed to delete StorageClass ephemeral-7881-e2e-scpt5x7: Delete \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/storage.k8s.io/v1/storageclasses/ephemeral-7881-e2e-scpt5x7\": dial tcp
52.67.139.60:443: connect: connection refused", err: <*url.Error | 0xc001fd1170>{ Op: "Delete", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/storage.k8s.io/v1/storageclasses/ephemeral-7881-e2e-scpt5x7", Err: <*net.OpError | 0xc001d4cd70>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc001fd1140>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc000dcac20>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }, }, ], ] Jan 14 23:11:45.432: FAIL: while cleaning up: failed to delete StorageClass ephemeral-7881-e2e-scpt5x7: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/storage.k8s.io/v1/storageclasses/ephemeral-7881-e2e-scpt5x7": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func2() test/e2e/storage/testsuites/ephemeral.go:176 +0x1ba panic({0x6ea2520, 0xc00101ec40}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d panic({0x6ea4740, 0xc0002cca10}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc000f98dd0, 0xc5}, {0xc00234cae8?, 0x735bfcc?, 0xc00234cb08?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197 k8s.io/kubernetes/test/e2e/framework.Fail({0xc000f9d600, 0xb0}, {0xc00234cb80?, 0xc000f9d550?, 0xc00234cba8?}) test/e2e/framework/log.go:63 +0x145 k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c34da0, 0xc001ddfbf0}, {0xc000d9f4d0?, 0x0?, 0x0?}) test/e2e/framework/expect.go:76 +0x267 k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) 
test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/storage/testsuites.StopPodAndDependents({0x7ca2818, 0xc000c6fb00}, 0xc00166e900, 0xc001932400)
	test/e2e/storage/testsuites/provisioning.go:997 +0x3cd
k8s.io/kubernetes/test/e2e/storage/testsuites.EphemeralTest.TestEphemeral.func1()
	test/e2e/storage/testsuites/ephemeral.go:414 +0x30
panic({0x6ea2520, 0xc000aa2580})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
	test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc0002c55e0})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc0033c0c00, 0x165}, {0xc00234d8c8?, 0x735bfcc?, 0xc00234d8e8?})
	test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc0033e4580, 0x150}, {0xc00234d960?, 0xc0033c5540?, 0xc00234d988?})
	test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c2aa60, 0xc0004b5480}, {0xc000d81ea0?, 0xc00193ad00?, 0x0?})
	test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
	test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func6.1(0xc001932400)
	test/e2e/storage/testsuites/ephemeral.go:299 +0x20a
k8s.io/kubernetes/test/e2e/storage/testsuites.EphemeralTest.TestEphemeral({{0x7ca2818, 0xc000c6fb00}, 0xc00166e900, {0xc000ed1e80, 0xe}, {0x0, 0x0}, 0xc0017480f0, {{0x0, 0x0}, ...}, ...})
	test/e2e/storage/testsuites/ephemeral.go:424 +0x709
k8s.io/kubernetes/test/e2e/storage/testsuites.(*ephemeralTestSuite).DefineTests.func6()
	test/e2e/storage/testsuites/ephemeral.go:316 +0x205
[AfterEach] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral
test/e2e/framework/framework.go:187
STEP: Collecting events from namespace "ephemeral-7881".
01/14/23 23:11:45.434
Jan 14 23:11:45.587: INFO: Unexpected error: failed to list events in namespace "ephemeral-7881": <*url.Error | 0xc00207b680>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881/events", Err: <*net.OpError | 0xc001e093b0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc00207b650>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc0004b5dc0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }
Jan 14 23:11:45.587: FAIL: failed to list events in namespace "ephemeral-7881": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc00359f590, {0xc000ed1e80, 0xe})
	test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc000c6fb00}, {0xc000ed1e80, 0xe})
	test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc001687600, 0x2?)
	test/e2e/framework/framework.go:181 +0x8b
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc001687600)
	test/e2e/framework/framework.go:435 +0x21d
STEP: Destroying namespace "ephemeral-7881" for this suite.
01/14/23 23:11:45.587
Jan 14 23:11:45.741: FAIL: Couldn't delete ns: "ephemeral-7881": Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881": dial tcp 52.67.139.60:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/ephemeral-7881", Err:(*net.OpError)(0xc001da9630)})
Full Stack Trace
panic({0x6ea2520, 0xc0011a8940})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
	test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc0000481c0})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc00010dd00, 0xff}, {0xc00359f048?, 0x735bfcc?, 0xc00359f068?})
	test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc0033c3a40, 0xea}, {0xc00359f0e0?, 0xc002145140?, 0xc00359f108?})
	test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c34da0, 0xc00207b680}, {0xc0004b5f20?, 0x0?, 0x0?})
	test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
	test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc00359f590, {0xc000ed1e80, 0xe})
	test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc000c6fb00}, {0xc000ed1e80, 0xe})
	test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc001687600, 0x2?)
	test/e2e/framework/framework.go:181 +0x8b
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc001687600)
	test/e2e/framework/framework.go:435 +0x21d
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-api\-machinery\]\sAdmissionWebhook\s\[Privileged\:ClusterAdmin\]\sshould\smutate\scustom\sresource\swith\spruning\s\[Conformance\]$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605
(from junit_01.xml)
{"msg":"FAILED [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] should mutate custom resource with pruning [Conformance]","completed":3,"skipped":16,"failed":1,"failures":["[sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] should mutate custom resource with pruning [Conformance]"]}
[BeforeEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin]
test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:40.779
Jan 14 23:10:40.779: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename webhook 01/14/23 23:10:40.78
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:41.206
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:41.487
[BeforeEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin]
test/e2e/apimachinery/webhook.go:89
STEP: Setting up server cert 01/14/23 23:10:42.06
STEP: Create role binding to let webhook read extension-apiserver-authentication 01/14/23 23:10:42.304
STEP: Deploying the webhook pod 01/14/23 23:10:42.449
STEP: Wait for the deployment to be ready 01/14/23 23:10:42.744
Jan 14 23:10:43.169: INFO: deployment status: v1.DeploymentStatus{ObservedGeneration:1, Replicas:1, UpdatedReplicas:1, ReadyReplicas:0, AvailableReplicas:0, UnavailableReplicas:1, Conditions:[]v1.DeploymentCondition{v1.DeploymentCondition{Type:"Available", Status:"False", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), Reason:"MinimumReplicasUnavailable", Message:"Deployment does not have minimum availability."}, v1.DeploymentCondition{Type:"Progressing", Status:"True", LastUpdateTime:time.Date(2023, time.January,
14, 23, 10, 42, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), Reason:"ReplicaSetUpdated", Message:"ReplicaSet \"sample-webhook-deployment-5d85dd8cdb\" is progressing."}}, CollisionCount:(*int32)(nil)} Jan 14 23:10:45.311: INFO: deployment status: v1.DeploymentStatus{ObservedGeneration:1, Replicas:1, UpdatedReplicas:1, ReadyReplicas:0, AvailableReplicas:0, UnavailableReplicas:1, Conditions:[]v1.DeploymentCondition{v1.DeploymentCondition{Type:"Available", Status:"False", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), Reason:"MinimumReplicasUnavailable", Message:"Deployment does not have minimum availability."}, v1.DeploymentCondition{Type:"Progressing", Status:"True", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), Reason:"ReplicaSetUpdated", Message:"ReplicaSet \"sample-webhook-deployment-5d85dd8cdb\" is progressing."}}, CollisionCount:(*int32)(nil)} Jan 14 23:10:47.311: INFO: deployment status: v1.DeploymentStatus{ObservedGeneration:1, Replicas:1, UpdatedReplicas:1, ReadyReplicas:0, AvailableReplicas:0, UnavailableReplicas:1, Conditions:[]v1.DeploymentCondition{v1.DeploymentCondition{Type:"Available", Status:"False", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), Reason:"MinimumReplicasUnavailable", Message:"Deployment does not have minimum availability."}, v1.DeploymentCondition{Type:"Progressing", Status:"True", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 42, 0, time.Local), Reason:"ReplicaSetUpdated", Message:"ReplicaSet \"sample-webhook-deployment-5d85dd8cdb\" is progressing."}}, 
CollisionCount:(*int32)(nil)}
STEP: Deploying the webhook service 01/14/23 23:10:49.312
STEP: Verifying the service has paired with the endpoint 01/14/23 23:10:49.483
Jan 14 23:10:50.484: INFO: Waiting for amount of service:e2e-test-webhook endpoints to be 1
[It] should mutate custom resource with pruning [Conformance]
test/e2e/apimachinery/webhook.go:340
Jan 14 23:10:50.627: INFO: >>> kubeConfig: /root/.kube/config
STEP: Registering the mutating webhook for custom resource e2e-test-webhook-8795-crds.webhook.example.com via the AdmissionRegistration API 01/14/23 23:10:51.056
STEP: Creating a custom resource that should be mutated by the webhook 01/14/23 23:10:51.36
[AfterEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin]
test/e2e/framework/framework.go:187
Jan 14 23:10:54.082: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:10:54.224: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:10:56.374: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:10:58.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}].
Failure Jan 14 23:11:00.369: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:02.369: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:04.370: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:06.469: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:08.369: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:10.377: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:12.369: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:14.373: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:16.387: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:18.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:20.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:22.369: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:24.369: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:26.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:28.369: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:30.370: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:32.369: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:34.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:36.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:38.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:40.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure
Jan 14 23:11:42.368: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:44.442: INFO: Condition Ready of node i-05871d1e8f8f620dd is false instead of true. Reason: KubeletNotReady, message: node is shutting down
Jan 14 23:11:44.442: INFO: Condition Ready of node i-0dcc151940a349dcc is true, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
Jan 14 23:11:46.377: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
STEP: Destroying namespace "webhook-8831" for this suite. 01/14/23 23:11:46.377
STEP: Collecting events from namespace "webhook-8831".
01/14/23 23:11:46.53
Jan 14 23:11:46.681: INFO: Unexpected error: failed to list events in namespace "webhook-8831": <*url.Error | 0xc001daea50>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/webhook-8831/events", Err: <*net.OpError | 0xc0022eecd0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc001daea20>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc00119ba60>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }
Jan 14 23:11:46.681: FAIL: failed to list events in namespace "webhook-8831": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/webhook-8831/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc001b96278, {0xc00213ad80, 0xc})
	test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc00286a780}, {0xc00213ad80, 0xc})
	test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1()
	test/e2e/framework/framework.go:402 +0x81d
panic({0x6ea2520, 0xc001eda200})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
	test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc00081dab0})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc0001962a0, 0xd5}, {0xc001b975a8?, 0x735bfcc?, 0xc001b975d0?})
	test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc00286a780?}, {0xc001b97890?, 0x736560f?, 0x7?})
	test/e2e/framework/log.go:51 +0x12c
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000bcdce0)
	test/e2e/framework/framework.go:483 +0xb8a
[AfterEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin]
test/e2e/apimachinery/webhook.go:104
Filter through log files | View test history on testgrid
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-api\-machinery\]\sAdmissionWebhook\s\[Privileged\:ClusterAdmin\]\sshould\smutate\spod\sand\sapply\sdefaults\safter\smutation\s\[Conformance\]$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000d851e0) test/e2e/framework/framework.go:244 +0x7bf
from junit_01.xml
{"msg":"FAILED [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] should mutate pod and apply defaults after mutation [Conformance]","completed":0,"skipped":29,"failed":2,"failures":["[sig-network] DNS should provide DNS for services [Conformance]","[sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] should mutate pod and apply defaults after mutation [Conformance]"]} [BeforeEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:45.481 Jan 14 23:11:45.481: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename webhook 01/14/23 23:11:45.482 Jan 14 23:11:45.635: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.789: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.786: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.789: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:53.788: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:55.785: INFO: Unexpected error while creating namespace: Post
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:57.787: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:59.788: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:01.789: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:03.789: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:05.791: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:07.790: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:09.790: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:11.787: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection 
refused Jan 14 23:12:13.791: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.226: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.379: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.379: INFO: Unexpected error: <*errors.errorString | 0xc0001eb900>: { s: "timed out waiting for the condition", } Jan 14 23:12:31.379: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000d851e0) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] test/e2e/framework/framework.go:187 Jan 14 23:12:31.380: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:31.551: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace [AfterEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] test/e2e/apimachinery/webhook.go:104
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-api\-machinery\]\sAdmissionWebhook\s\[Privileged\:ClusterAdmin\]\sshould\sunconditionally\sreject\soperations\son\sfail\sclosed\swebhook\s\[Conformance\]$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0009db1e0) test/e2e/framework/framework.go:244 +0x7bf
from junit_01.xml
{"msg":"FAILED [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] should unconditionally reject operations on fail closed webhook [Conformance]","completed":1,"skipped":40,"failed":2,"failures":["[sig-apps] Deployment RollingUpdateDeployment should delete old pods and create new ones [Conformance]","[sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] should unconditionally reject operations on fail closed webhook [Conformance]"]} [BeforeEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:45.463 Jan 14 23:11:45.463: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename webhook 01/14/23 23:11:45.464 Jan 14 23:11:45.620: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.775: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.775: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.774: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:53.777: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:55.773: INFO:
Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:57.775: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:59.773: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:01.772: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:03.775: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:05.775: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:07.782: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:09.773: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:11.775: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": 
dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:13.773: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.226: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.378: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.379: INFO: Unexpected error: <*errors.errorString | 0xc0001eb900>: { s: "timed out waiting for the condition", } Jan 14 23:12:31.379: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0009db1e0) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] test/e2e/framework/framework.go:187 Jan 14 23:12:31.379: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:31.534: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace [AfterEach] [sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] test/e2e/apimachinery/webhook.go:104
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-api\-machinery\]\sCustomResourcePublishOpenAPI\s\[Privileged\:ClusterAdmin\]\sworks\sfor\smultiple\sCRDs\sof\ssame\sgroup\sand\sversion\sbut\sdifferent\skinds\s\[Conformance\]$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605
from junit_01.xml
{"msg":"FAILED [sig-api-machinery] CustomResourcePublishOpenAPI [Privileged:ClusterAdmin] works for multiple CRDs of same group and version but different kinds [Conformance]","completed":0,"skipped":24,"failed":1,"failures":["[sig-api-machinery] CustomResourcePublishOpenAPI [Privileged:ClusterAdmin] works for multiple CRDs of same group and version but different kinds [Conformance]"]} [BeforeEach] [sig-api-machinery] CustomResourcePublishOpenAPI [Privileged:ClusterAdmin] test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:10:07.604 Jan 14 23:10:07.604: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename crd-publish-openapi 01/14/23 23:10:07.606 STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:08.034 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:08.315 [It] works for multiple CRDs of same group and version but different kinds [Conformance] test/e2e/apimachinery/crd_publish_openapi.go:356 STEP: CRs in the same group and version but different kinds (two CRDs) show up in OpenAPI documentation 01/14/23 23:10:08.598 Jan 14 23:10:08.599: INFO: >>> kubeConfig: /root/.kube/config Jan 14 23:10:16.073: INFO: >>> kubeConfig: /root/.kube/config [AfterEach] [sig-api-machinery] CustomResourcePublishOpenAPI [Privileged:ClusterAdmin] test/e2e/framework/framework.go:187 Jan 14 23:10:57.707: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:10:58.292: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}].
Failure Jan 14 23:11:00.437: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:02.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:04.445: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:06.477: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:08.440: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:10.437: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:12.437: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:14.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:16.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:18.438: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:20.437: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:22.440: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:24.437: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:26.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:28.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:30.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:32.435: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:34.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:36.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:38.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:40.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:42.436: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:44.446: INFO: Condition Ready of node i-05871d1e8f8f620dd is false instead of true. Reason: KubeletNotReady, message: node is shutting down Jan 14 23:11:44.446: INFO: Condition Ready of node i-0dcc151940a349dcc is true, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:46.449: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace STEP: Destroying namespace "crd-publish-openapi-4659" for this suite. 01/14/23 23:11:46.449 STEP: Collecting events from namespace "crd-publish-openapi-4659".
01/14/23 23:11:46.602 Jan 14 23:11:46.754: INFO: Unexpected error: failed to list events in namespace "crd-publish-openapi-4659": <*url.Error | 0xc002d03020>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/crd-publish-openapi-4659/events", Err: <*net.OpError | 0xc0028d1900>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc00263ade0>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc0026b8020>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Jan 14 23:11:46.754: FAIL: failed to list events in namespace "crd-publish-openapi-4659": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/crd-publish-openapi-4659/events": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc00741e278, {0xc00060f6e0, 0x18}) test/e2e/framework/util.go:901 +0x191 k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc000800600}, {0xc00060f6e0, 0x18}) test/e2e/framework/util.go:919 +0x8d k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:402 +0x81d panic({0x6ea2520, 0xc002574f80}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d panic({0x6ea4740, 0xc000c71110}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc0025400e0, 0xd5}, {0xc00741f5a8?, 0x735bfcc?, 0xc00741f5d0?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197 k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc000800600?}, {0xc00741f890?, 0x739abb7?, 0x13?}) test/e2e/framework/log.go:51 +0x12c k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0009e3b80) test/e2e/framework/framework.go:483 +0xb8a
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-api\-machinery\]\sCustomResourceValidationRules\s\[Privileged\:ClusterAdmin\]\sMUST\sfail\svalidation\sfor\screate\sof\sa\scustom\sresource\sthat\sdoes\snot\ssatisfy\sthe\sx\-kubernetes\-validations\srules$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc00080fb80) test/e2e/framework/framework.go:244 +0x7bf
from junit_01.xml
{"msg":"FAILED [sig-api-machinery] CustomResourceValidationRules [Privileged:ClusterAdmin] MUST fail validation for create of a custom resource that does not satisfy the x-kubernetes-validations rules","completed":3,"skipped":18,"failed":2,"failures":["[sig-api-machinery] AdmissionWebhook [Privileged:ClusterAdmin] should mutate custom resource with pruning [Conformance]","[sig-api-machinery] CustomResourceValidationRules [Privileged:ClusterAdmin] MUST fail validation for create of a custom resource that does not satisfy the x-kubernetes-validations rules"]} [BeforeEach] [sig-api-machinery] CustomResourceValidationRules [Privileged:ClusterAdmin] test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:47.297 Jan 14 23:11:47.297: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename crd-validation-expressions 01/14/23 23:11:47.298 Jan 14 23:11:47.455: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.605: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.606: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:53.608: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:55.607: INFO: Unexpected error while creating namespace: Post
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:57.610: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:59.608: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:01.609: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:03.612: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:05.607: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:07.610: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:09.607: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:11.609: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection 
refused Jan 14 23:12:13.610: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:30.970: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.139: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.139: INFO: Unexpected error: <*errors.errorString | 0xc0001c7850>: { s: "timed out waiting for the condition", } Jan 14 23:12:31.139: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc00080fb80) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-api-machinery] CustomResourceValidationRules [Privileged:ClusterAdmin] test/e2e/framework/framework.go:187 Jan 14 23:12:31.139: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:31.295: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-api\-machinery\]\sResourceQuota\sshould\screate\sa\sResourceQuota\sand\scapture\sthe\slife\sof\sa\spod\.\s\[Conformance\]$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000a8d1e0) test/e2e/framework/framework.go:244 +0x7bf
from junit_01.xml
{"msg":"FAILED [sig-api-machinery] ResourceQuota should create a ResourceQuota and capture the life of a pod. [Conformance]","completed":1,"skipped":23,"failed":2,"failures":["[sig-node] Probing container should mark readiness on pods to false while pod is in progress of terminating when a pod has a readiness probe","[sig-api-machinery] ResourceQuota should create a ResourceQuota and capture the life of a pod. [Conformance]"]} [BeforeEach] [sig-api-machinery] ResourceQuota test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:46.716 Jan 14 23:11:46.717: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename resourcequota 01/14/23 23:11:46.717 Jan 14 23:11:46.870: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.022: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.024: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:53.024: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:55.024: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:57.028: INFO: Unexpected error while creating namespace: Post
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:59.022: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:01.027: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:03.025: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:05.022: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:07.021: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:09.022: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:11.021: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:13.026: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection 
refused Jan 14 23:12:15.023: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.505: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.660: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.660: INFO: Unexpected error: <*errors.errorString | 0xc000111b80>: { s: "timed out waiting for the condition", } Jan 14 23:12:32.660: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000a8d1e0) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-api-machinery] ResourceQuota test/e2e/framework/framework.go:187 Jan 14 23:12:32.661: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:32.813: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-apps\]\sCronJob\sshould\sbe\sable\sto\sschedule\safter\smore\sthan\s100\smissed\sschedule$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0008ea420) test/e2e/framework/framework.go:244 +0x7bf
from junit_01.xml
{"msg":"FAILED [sig-apps] CronJob should be able to schedule after more than 100 missed schedule","completed":2,"skipped":18,"failed":2,"failures":["[sig-storage] PersistentVolumes-local [Volume type: dir-bindmounted] Two pods mounting a local volume one after the other should be able to write from pod1 and read from pod2","[sig-apps] CronJob should be able to schedule after more than 100 missed schedule"]} [BeforeEach] [sig-apps] CronJob test/e2e/framework/framework.go:186 �[1mSTEP:�[0m Creating a kubernetes client �[38;5;243m01/14/23 23:11:46.238�[0m Jan 14 23:11:46.238: INFO: >>> kubeConfig: /root/.kube/config �[1mSTEP:�[0m Building a namespace api object, basename cronjob �[38;5;243m01/14/23 23:11:46.239�[0m Jan 14 23:11:46.391: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.543: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.545: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:52.546: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:54.544: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:56.548: INFO: Unexpected error while creating namespace: Post 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:58.545: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:00.544: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:02.543: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:04.549: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:06.544: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:08.543: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:10.550: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:12.544: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection 
refused Jan 14 23:12:14.545: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.994: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.150: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.150: INFO: Unexpected error: <*errors.errorString | 0xc00011dc70>: { s: "timed out waiting for the condition", } Jan 14 23:12:32.150: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0008ea420) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-apps] CronJob test/e2e/framework/framework.go:187 Jan 14 23:12:32.151: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:32.303: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-apps\]\sDeployment\sRollingUpdateDeployment\sshould\sdelete\sold\spods\sand\screate\snew\sones\s\[Conformance\]$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605
from junit_01.xml
{"msg":"FAILED [sig-apps] Deployment RollingUpdateDeployment should delete old pods and create new ones [Conformance]","completed":1,"skipped":11,"failed":1,"failures":["[sig-apps] Deployment RollingUpdateDeployment should delete old pods and create new ones [Conformance]"]} [BeforeEach] [sig-apps] Deployment test/e2e/framework/framework.go:186 �[1mSTEP:�[0m Creating a kubernetes client �[38;5;243m01/14/23 23:10:29.823�[0m Jan 14 23:10:29.823: INFO: >>> kubeConfig: /root/.kube/config �[1mSTEP:�[0m Building a namespace api object, basename deployment �[38;5;243m01/14/23 23:10:29.824�[0m �[1mSTEP:�[0m Waiting for a default service account to be provisioned in namespace �[38;5;243m01/14/23 23:10:30.251�[0m �[1mSTEP:�[0m Waiting for kube-root-ca.crt to be provisioned in namespace �[38;5;243m01/14/23 23:10:30.534�[0m [BeforeEach] [sig-apps] Deployment test/e2e/apps/deployment.go:91 [It] RollingUpdateDeployment should delete old pods and create new ones [Conformance] test/e2e/apps/deployment.go:105 Jan 14 23:10:30.817: INFO: Creating replica set "test-rolling-update-controller" (going to be adopted) Jan 14 23:10:31.104: INFO: Pod name sample-pod: Found 1 pods out of 1 �[1mSTEP:�[0m ensuring each pod is running �[38;5;243m01/14/23 23:10:31.104�[0m Jan 14 23:10:31.104: INFO: Waiting up to 5m0s for pod "test-rolling-update-controller-psxzs" in namespace "deployment-3280" to be "running" Jan 14 23:10:31.245: INFO: Pod "test-rolling-update-controller-psxzs": Phase="Pending", Reason="", readiness=false. Elapsed: 141.674967ms Jan 14 23:10:33.388: INFO: Pod "test-rolling-update-controller-psxzs": Phase="Pending", Reason="", readiness=false. Elapsed: 2.284653261s Jan 14 23:10:35.388: INFO: Pod "test-rolling-update-controller-psxzs": Phase="Pending", Reason="", readiness=false. Elapsed: 4.284825294s Jan 14 23:10:37.388: INFO: Pod "test-rolling-update-controller-psxzs": Phase="Pending", Reason="", readiness=false. 
Elapsed: 6.284150498s Jan 14 23:10:39.388: INFO: Pod "test-rolling-update-controller-psxzs": Phase="Pending", Reason="", readiness=false. Elapsed: 8.284062855s Jan 14 23:10:41.394: INFO: Pod "test-rolling-update-controller-psxzs": Phase="Pending", Reason="", readiness=false. Elapsed: 10.290008421s Jan 14 23:10:43.389: INFO: Pod "test-rolling-update-controller-psxzs": Phase="Running", Reason="", readiness=true. Elapsed: 12.285580734s Jan 14 23:10:43.389: INFO: Pod "test-rolling-update-controller-psxzs" satisfied condition "running" Jan 14 23:10:43.389: INFO: Creating deployment "test-rolling-update-deployment" Jan 14 23:10:43.534: INFO: Ensuring deployment "test-rolling-update-deployment" gets the next revision from the one the adopted replica set "test-rolling-update-controller" has Jan 14 23:10:43.831: INFO: Ensuring status for deployment "test-rolling-update-deployment" is the expected Jan 14 23:10:43.982: INFO: deployment status: v1.DeploymentStatus{ObservedGeneration:1, Replicas:2, UpdatedReplicas:1, ReadyReplicas:1, AvailableReplicas:1, UnavailableReplicas:1, Conditions:[]v1.DeploymentCondition{v1.DeploymentCondition{Type:"Available", Status:"True", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), Reason:"MinimumReplicasAvailable", Message:"Deployment has minimum availability."}, v1.DeploymentCondition{Type:"Progressing", Status:"True", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), Reason:"ReplicaSetUpdated", Message:"ReplicaSet \"test-rolling-update-deployment-78f575d8ff\" is progressing."}}, CollisionCount:(*int32)(nil)} Jan 14 23:10:46.128: INFO: deployment status: v1.DeploymentStatus{ObservedGeneration:1, Replicas:2, UpdatedReplicas:1, ReadyReplicas:1, AvailableReplicas:1, UnavailableReplicas:1, 
Conditions:[]v1.DeploymentCondition{v1.DeploymentCondition{Type:"Available", Status:"True", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), Reason:"MinimumReplicasAvailable", Message:"Deployment has minimum availability."}, v1.DeploymentCondition{Type:"Progressing", Status:"True", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), Reason:"ReplicaSetUpdated", Message:"ReplicaSet \"test-rolling-update-deployment-78f575d8ff\" is progressing."}}, CollisionCount:(*int32)(nil)} Jan 14 23:10:48.128: INFO: deployment status: v1.DeploymentStatus{ObservedGeneration:1, Replicas:2, UpdatedReplicas:1, ReadyReplicas:1, AvailableReplicas:1, UnavailableReplicas:1, Conditions:[]v1.DeploymentCondition{v1.DeploymentCondition{Type:"Available", Status:"True", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), Reason:"MinimumReplicasAvailable", Message:"Deployment has minimum availability."}, v1.DeploymentCondition{Type:"Progressing", Status:"True", LastUpdateTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 14, 23, 10, 43, 0, time.Local), Reason:"ReplicaSetUpdated", Message:"ReplicaSet \"test-rolling-update-deployment-78f575d8ff\" is progressing."}}, CollisionCount:(*int32)(nil)} Jan 14 23:10:50.126: INFO: Ensuring deployment "test-rolling-update-deployment" has one old replica set (the one it adopted) [AfterEach] [sig-apps] Deployment test/e2e/apps/deployment.go:84 Jan 14 23:10:50.555: INFO: Deployment "test-rolling-update-deployment": &Deployment{ObjectMeta:{test-rolling-update-deployment deployment-3280 d9f5b48f-8f01-4ff7-86ff-940ef1fad94a 3735 1 2023-01-14 23:10:43 +0000 UTC <nil> <nil> 
map[name:sample-pod] map[deployment.kubernetes.io/revision:3546343826724305833] [] [] [{e2e.test Update apps/v1 2023-01-14 23:10:43 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:progressDeadlineSeconds":{},"f:replicas":{},"f:revisionHistoryLimit":{},"f:selector":{},"f:strategy":{"f:rollingUpdate":{".":{},"f:maxSurge":{},"f:maxUnavailable":{}},"f:type":{}},"f:template":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"agnhost\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{},"f:securityContext":{},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}} } {kube-controller-manager Update apps/v1 2023-01-14 23:10:49 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:deployment.kubernetes.io/revision":{}}},"f:status":{"f:availableReplicas":{},"f:conditions":{".":{},"k:{\"type\":\"Available\"}":{".":{},"f:lastTransitionTime":{},"f:lastUpdateTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}},"k:{\"type\":\"Progressing\"}":{".":{},"f:lastTransitionTime":{},"f:lastUpdateTime":{},"f:message":{},"f:reason":{},"f:status":{},"f:type":{}}},"f:observedGeneration":{},"f:readyReplicas":{},"f:replicas":{},"f:updatedReplicas":{}}} status}]},Spec:DeploymentSpec{Replicas:*1,Selector:&v1.LabelSelector{MatchLabels:map[string]string{name: sample-pod,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{ 0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[name:sample-pod] map[] [] [] []} {[] [] [{agnhost registry.k8s.io/e2e-test-images/agnhost:2.40 [] [] [] [] [] {map[] map[]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent 
SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,} false false false}] [] Always 0xc00194e528 <nil> ClusterFirst map[] <nil> false false false <nil> &PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} [] nil default-scheduler [] [] <nil> nil [] <nil> <nil> <nil> map[] [] <nil> nil <nil>}},Strategy:DeploymentStrategy{Type:RollingUpdate,RollingUpdate:&RollingUpdateDeployment{MaxUnavailable:25%!,(MISSING)MaxSurge:25%!,(MISSING)},},MinReadySeconds:0,RevisionHistoryLimit:*10,Paused:false,ProgressDeadlineSeconds:*600,},Status:DeploymentStatus{ObservedGeneration:1,Replicas:1,UpdatedReplicas:1,AvailableReplicas:1,UnavailableReplicas:0,Conditions:[]DeploymentCondition{DeploymentCondition{Type:Available,Status:True,Reason:MinimumReplicasAvailable,Message:Deployment has minimum availability.,LastUpdateTime:2023-01-14 23:10:43 +0000 UTC,LastTransitionTime:2023-01-14 23:10:43 +0000 UTC,},DeploymentCondition{Type:Progressing,Status:True,Reason:NewReplicaSetAvailable,Message:ReplicaSet "test-rolling-update-deployment-78f575d8ff" has successfully progressed.,LastUpdateTime:2023-01-14 23:10:49 +0000 UTC,LastTransitionTime:2023-01-14 23:10:43 +0000 UTC,},},ReadyReplicas:1,CollisionCount:nil,},} Jan 14 23:10:50.698: INFO: New ReplicaSet "test-rolling-update-deployment-78f575d8ff" of Deployment "test-rolling-update-deployment": &ReplicaSet{ObjectMeta:{test-rolling-update-deployment-78f575d8ff deployment-3280 cb9dd437-0222-45ae-a474-9f4a063b36f2 3725 1 2023-01-14 23:10:43 +0000 UTC <nil> <nil> map[name:sample-pod pod-template-hash:78f575d8ff] map[deployment.kubernetes.io/desired-replicas:1 deployment.kubernetes.io/max-replicas:2 
deployment.kubernetes.io/revision:3546343826724305833] [{apps/v1 Deployment test-rolling-update-deployment d9f5b48f-8f01-4ff7-86ff-940ef1fad94a 0xc00194e9f7 0xc00194e9f8}] [] [{kube-controller-manager Update apps/v1 2023-01-14 23:10:43 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:deployment.kubernetes.io/desired-replicas":{},"f:deployment.kubernetes.io/max-replicas":{},"f:deployment.kubernetes.io/revision":{}},"f:labels":{".":{},"f:name":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"d9f5b48f-8f01-4ff7-86ff-940ef1fad94a\"}":{}}},"f:spec":{"f:replicas":{},"f:selector":{},"f:template":{"f:metadata":{"f:labels":{".":{},"f:name":{},"f:pod-template-hash":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"agnhost\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{},"f:securityContext":{},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}} } {kube-controller-manager Update apps/v1 2023-01-14 23:10:49 +0000 UTC FieldsV1 {"f:status":{"f:availableReplicas":{},"f:fullyLabeledReplicas":{},"f:observedGeneration":{},"f:readyReplicas":{},"f:replicas":{}}} status}]},Spec:ReplicaSetSpec{Replicas:*1,Selector:&v1.LabelSelector{MatchLabels:map[string]string{name: sample-pod,pod-template-hash: 78f575d8ff,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{ 0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[name:sample-pod pod-template-hash:78f575d8ff] map[] [] [] []} {[] [] [{agnhost registry.k8s.io/e2e-test-images/agnhost:2.40 [] [] [] [] [] {map[] map[]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,} false false false}] [] Always 0xc00194eaa8 <nil> 
ClusterFirst map[] <nil> false false false <nil> &PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} [] nil default-scheduler [] [] <nil> nil [] <nil> <nil> <nil> map[] [] <nil> nil <nil>}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:1,FullyLabeledReplicas:1,ObservedGeneration:1,ReadyReplicas:1,AvailableReplicas:1,Conditions:[]ReplicaSetCondition{},},} Jan 14 23:10:50.698: INFO: All old ReplicaSets of Deployment "test-rolling-update-deployment": Jan 14 23:10:50.698: INFO: &ReplicaSet{ObjectMeta:{test-rolling-update-controller deployment-3280 f3a4fb1f-e1f2-477a-9273-9cca718fe508 3734 2 2023-01-14 23:10:30 +0000 UTC <nil> <nil> map[name:sample-pod pod:httpd] map[deployment.kubernetes.io/desired-replicas:1 deployment.kubernetes.io/max-replicas:2 deployment.kubernetes.io/revision:3546343826724305832] [{apps/v1 Deployment test-rolling-update-deployment d9f5b48f-8f01-4ff7-86ff-940ef1fad94a 0xc00194e8cf 0xc00194e8e0}] [] [{e2e.test Update apps/v1 2023-01-14 23:10:30 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:deployment.kubernetes.io/revision":{}},"f:labels":{".":{},"f:name":{},"f:pod":{}}},"f:spec":{"f:selector":{},"f:template":{"f:metadata":{"f:labels":{".":{},"f:name":{},"f:pod":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"httpd\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}} } {kube-controller-manager Update apps/v1 2023-01-14 23:10:49 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{"f:deployment.kubernetes.io/desired-replicas":{},"f:deployment.kubernetes.io/max-replicas":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"d9f5b48f-8f01-4ff7-86ff-940ef1fad94a\"}":{}}},"f:spec":{"f:replicas":{}}} } {kube-controller-manager Update apps/v1 2023-01-14 23:10:49 +0000 UTC FieldsV1 {"f:status":{"f:observedGeneration":{},"f:replicas":{}}} status}]},Spec:ReplicaSetSpec{Replicas:*0,Selector:&v1.LabelSelector{MatchLabels:map[string]string{name: sample-pod,pod: httpd,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{ 0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[name:sample-pod pod:httpd] map[] [] [] []} {[] [] [{httpd registry.k8s.io/e2e-test-images/httpd:2.4.38-2 [] [] [] [] [] {map[] map[]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc00194e998 <nil> ClusterFirst map[] <nil> false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} [] nil default-scheduler [] [] <nil> nil [] <nil> <nil> <nil> map[] [] <nil> nil <nil>}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:0,FullyLabeledReplicas:0,ObservedGeneration:2,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},} Jan 14 23:10:50.841: INFO: Pod "test-rolling-update-deployment-78f575d8ff-4wnbk" is available: &Pod{ObjectMeta:{test-rolling-update-deployment-78f575d8ff-4wnbk test-rolling-update-deployment-78f575d8ff- deployment-3280 ed16b50d-379a-40f5-8bb6-f662a1473c5f 3724 0 2023-01-14 23:10:43 +0000 UTC <nil> <nil> map[name:sample-pod pod-template-hash:78f575d8ff] map[] [{apps/v1 ReplicaSet test-rolling-update-deployment-78f575d8ff cb9dd437-0222-45ae-a474-9f4a063b36f2 0xc00194eee7 0xc00194eee8}] [] [{kube-controller-manager Update v1 2023-01-14 23:10:43 +0000 UTC FieldsV1 
{"f:metadata":{"f:generateName":{},"f:labels":{".":{},"f:name":{},"f:pod-template-hash":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"cb9dd437-0222-45ae-a474-9f4a063b36f2\"}":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"agnhost\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{},"f:securityContext":{},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}} } {kubelet Update v1 2023-01-14 23:10:49 +0000 UTC FieldsV1 {"f:status":{"f:conditions":{"k:{\"type\":\"ContainersReady\"}":{".":{},"f:lastProbeTime":{},"f:lastTransitionTime":{},"f:status":{},"f:type":{}},"k:{\"type\":\"Initialized\"}":{".":{},"f:lastProbeTime":{},"f:lastTransitionTime":{},"f:status":{},"f:type":{}},"k:{\"type\":\"Ready\"}":{".":{},"f:lastProbeTime":{},"f:lastTransitionTime":{},"f:status":{},"f:type":{}}},"f:containerStatuses":{},"f:hostIP":{},"f:phase":{},"f:podIP":{},"f:podIPs":{".":{},"k:{\"ip\":\"100.96.1.142\"}":{".":{},"f:ip":{}}},"f:startTime":{}}} 
status}]},Spec:PodSpec{Volumes:[]Volume{Volume{Name:kube-api-access-p2njv,VolumeSource:VolumeSource{HostPath:nil,EmptyDir:nil,GCEPersistentDisk:nil,AWSElasticBlockStore:nil,GitRepo:nil,Secret:nil,NFS:nil,ISCSI:nil,Glusterfs:nil,PersistentVolumeClaim:nil,RBD:nil,FlexVolume:nil,Cinder:nil,CephFS:nil,Flocker:nil,DownwardAPI:nil,FC:nil,AzureFile:nil,ConfigMap:nil,VsphereVolume:nil,Quobyte:nil,AzureDisk:nil,PhotonPersistentDisk:nil,PortworxVolume:nil,ScaleIO:nil,Projected:&ProjectedVolumeSource{Sources:[]VolumeProjection{VolumeProjection{Secret:nil,DownwardAPI:nil,ConfigMap:nil,ServiceAccountToken:&ServiceAccountTokenProjection{Audience:,ExpirationSeconds:*3607,Path:token,},},VolumeProjection{Secret:nil,DownwardAPI:nil,ConfigMap:&ConfigMapProjection{LocalObjectReference:LocalObjectReference{Name:kube-root-ca.crt,},Items:[]KeyToPath{KeyToPath{Key:ca.crt,Path:ca.crt,Mode:nil,},},Optional:nil,},ServiceAccountToken:nil,},VolumeProjection{Secret:nil,DownwardAPI:&DownwardAPIProjection{Items:[]DownwardAPIVolumeFile{DownwardAPIVolumeFile{Path:namespace,FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,Mode:nil,},},},ConfigMap:nil,ServiceAccountToken:nil,},},DefaultMode:*420,},StorageOS:nil,CSI:nil,Ephemeral:nil,},},},Containers:[]Container{Container{Name:agnhost,Image:registry.k8s.io/e2e-test-images/agnhost:2.40,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p2njv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsG
roup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,},},RestartPolicy:Always,TerminationGracePeriodSeconds:*0,ActiveDeadlineSeconds:nil,DNSPolicy:ClusterFirst,NodeSelector:map[string]string{},ServiceAccountName:default,DeprecatedServiceAccount:default,NodeName:i-07f0d0bc50c0f4aa8,HostNetwork:false,HostPID:false,HostIPC:false,SecurityContext:&PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,},ImagePullSecrets:[]LocalObjectReference{},Hostname:,Subdomain:,Affinity:nil,SchedulerName:default-scheduler,InitContainers:[]Container{},AutomountServiceAccountToken:nil,Tolerations:[]Toleration{Toleration{Key:node.kubernetes.io/not-ready,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},Toleration{Key:node.kubernetes.io/unreachable,Operator:Exists,Value:,Effect:NoExecute,TolerationSeconds:*300,},},HostAliases:[]HostAlias{},PriorityClassName:,Priority:*0,DNSConfig:nil,ShareProcessNamespace:nil,ReadinessGates:[]PodReadinessGate{},RuntimeClassName:nil,EnableServiceLinks:*true,PreemptionPolicy:*PreemptLowerPriority,Overhead:ResourceList{},TopologySpreadConstraints:[]TopologySpreadConstraint{},EphemeralContainers:[]EphemeralContainer{},SetHostnameAsFQDN:nil,OS:nil,HostUsers:nil,},Status:PodStatus{Phase:Running,Conditions:[]PodCondition{PodCondition{Type:Initialized,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2023-01-14 23:10:43 +0000 UTC,Reason:,Message:,},PodCondition{Type:Ready,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2023-01-14 23:10:45 +0000 UTC,Reason:,Message:,},PodCondition{Type:ContainersReady,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2023-01-14 23:10:45 +0000 
UTC,Reason:,Message:,},PodCondition{Type:PodScheduled,Status:True,LastProbeTime:0001-01-01 00:00:00 +0000 UTC,LastTransitionTime:2023-01-14 23:10:43 +0000 UTC,Reason:,Message:,},},Message:,Reason:,HostIP:172.20.40.234,PodIP:100.96.1.142,StartTime:2023-01-14 23:10:43 +0000 UTC,ContainerStatuses:[]ContainerStatus{ContainerStatus{Name:agnhost,State:ContainerState{Waiting:nil,Running:&ContainerStateRunning{StartedAt:2023-01-14 23:10:44 +0000 UTC,},Terminated:nil,},LastTerminationState:ContainerState{Waiting:nil,Running:nil,Terminated:nil,},Ready:true,RestartCount:0,Image:registry.k8s.io/e2e-test-images/agnhost:2.40,ImageID:registry.k8s.io/e2e-test-images/agnhost@sha256:af7e3857d87770ddb40f5ea4f89b5a2709504ab1ee31f9ea4ab5823c045f2146,ContainerID:containerd://ca81fded211bdad52f177c0beb3cfbf36d01a6d8b547b54ede577f51bb31a41b,Started:*true,},},QOSClass:BestEffort,InitContainerStatuses:[]ContainerStatus{},NominatedNodeName:,PodIPs:[]PodIP{PodIP{IP:100.96.1.142,},},EphemeralContainerStatuses:[]ContainerStatus{},},} [AfterEach] [sig-apps] Deployment test/e2e/framework/framework.go:187 Jan 14 23:10:50.841: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:10:50.984: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:10:53.129: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:10:55.128: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:10:57.127: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}].
(The same "Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController" message repeated at ~2s intervals through Jan 14 23:11:43.130.)
Failure Jan 14 23:11:45.135: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
STEP: Destroying namespace "deployment-3280" for this suite. 01/14/23 23:11:45.136
STEP: Collecting events from namespace "deployment-3280".
01/14/23 23:11:45.287
Jan 14 23:11:45.439: INFO: Unexpected error: failed to list events in namespace "deployment-3280": <*url.Error | 0xc0033fe540>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/deployment-3280/events", Err: <*net.OpError | 0xc00214b720>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc0033fe510>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc00339b740>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }
Jan 14 23:11:45.440: FAIL: failed to list events in namespace "deployment-3280": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/deployment-3280/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc0027a4278, {0xc002b0bfb0, 0xf}) test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc0016ae180}, {0xc002b0bfb0, 0xf}) test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:402 +0x81d
panic({0x6ea2520, 0xc001d462c0}) /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc00026c7e0}) /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc00176c1c0, 0xd5}, {0xc0027a55a8?, 0x735bfcc?, 0xc0027a55d0?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc0016ae180?}, {0xc0027a5890?, 0x736d9c0?, 0xa?}) test/e2e/framework/log.go:51 +0x12c
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000952c60) test/e2e/framework/framework.go:483 +0xb8a
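The failure block above reduces to the framework's post-test check finding the node's Ready condition False while the NodeController has applied node.kubernetes.io/not-ready taints. A minimal sketch of that readiness evaluation, using simplified stand-in types rather than the real corev1 structs:

```go
package main

import "fmt"

// Simplified stand-ins for the corev1 fields the check inspects.
type Condition struct {
	Type   string
	Status string
}

type Taint struct {
	Key    string
	Effect string
}

// nodeReady mirrors the rule behind "Condition Ready of node ... is false":
// a node counts as ready only if its Ready condition has Status "True".
func nodeReady(conds []Condition) bool {
	for _, c := range conds {
		if c.Type == "Ready" {
			return c.Status == "True"
		}
	}
	return false
}

func main() {
	conds := []Condition{{Type: "Ready", Status: "False"}}
	taints := []Taint{
		{Key: "node.kubernetes.io/not-ready", Effect: "NoSchedule"},
		{Key: "node.kubernetes.io/not-ready", Effect: "NoExecute"},
	}
	fmt.Println(nodeReady(conds)) // false, as logged for i-0dcc151940a349dcc
	for _, t := range taints {
		fmt.Println(t.Key, t.Effect) // the two NodeController taints from the log
	}
}
```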
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-apps\]\sJob\sshould\sfail\sto\sexceed\sbackoffLimit$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc00089dce0) test/e2e/framework/framework.go:244 +0x7bf (from junit_01.xml)
{"msg":"FAILED [sig-apps] Job should fail to exceed backoffLimit","completed":0,"skipped":33,"failed":2,"failures":["[sig-node] Probing container should be restarted with a failing exec liveness probe that took longer than the timeout","[sig-apps] Job should fail to exceed backoffLimit"]}
[BeforeEach] [sig-apps] Job test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:11:45.478
Jan 14 23:11:45.479: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename job 01/14/23 23:11:45.48
Jan 14 23:11:45.630: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
(The same "Unexpected error while creating namespace ... connection refused" message repeated at ~2s intervals through Jan 14 23:12:31.379.)
Jan 14 23:12:31.379: INFO: Unexpected error: <*errors.errorString | 0xc00011dc60>: { s: "timed out waiting for the condition", }
Jan 14 23:12:31.379: FAIL: timed out waiting for the condition
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc00089dce0) test/e2e/framework/framework.go:244 +0x7bf
[AfterEach] [sig-apps] Job test/e2e/framework/framework.go:187
Jan 14 23:12:31.379: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:12:31.533: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-apps\]\sStatefulSet\sBasic\sStatefulSet\sfunctionality\s\[StatefulSetBasic\]\sshould\sperform\scanary\supdates\sand\sphased\srolling\supdates\sof\stemplate\smodifications\s\[Conformance\]$'
test/e2e/apps/wait.go:46 k8s.io/kubernetes/test/e2e/apps.waitForPartitionedRollingUpdate({0x7ca2818, 0xc0033d0480}, 0xc0005fcf00) test/e2e/apps/wait.go:46 +0x268 k8s.io/kubernetes/test/e2e/apps.glob..func10.2.8() test/e2e/apps/statefulset.go:470 +0x27e5 (from junit_01.xml)
{"msg":"FAILED [sig-apps] StatefulSet Basic StatefulSet functionality [StatefulSetBasic] should perform canary updates and phased rolling updates of template modifications [Conformance]","completed":1,"skipped":13,"failed":1,"failures":["[sig-apps] StatefulSet Basic StatefulSet functionality [StatefulSetBasic] should perform canary updates and phased rolling updates of template modifications [Conformance]"]}
[BeforeEach] [sig-apps] StatefulSet test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:24.218
Jan 14 23:10:24.218: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename statefulset 01/14/23 23:10:24.219
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:24.648
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:24.935
[BeforeEach] [sig-apps] StatefulSet test/e2e/apps/statefulset.go:96
[BeforeEach] Basic StatefulSet functionality [StatefulSetBasic] test/e2e/apps/statefulset.go:111
STEP: Creating service test in namespace statefulset-6493 01/14/23 23:10:25.219
[It] should perform canary updates and phased rolling updates of template modifications [Conformance] test/e2e/apps/statefulset.go:315
STEP: Creating a new StatefulSet 01/14/23 23:10:25.365
Jan 14 23:10:25.652: INFO: Found 1 stateful pods, waiting for 3
Jan 14 23:10:35.795: INFO: Found 1 stateful pods, waiting for 3
Jan 14 23:10:45.799: INFO: Found 1 stateful pods, waiting for 3
Jan 14 23:10:55.814: INFO: Found 2 stateful pods, waiting for 3
Jan 14 23:11:05.796: INFO: Waiting for pod ss2-0 to enter Running - Ready=true, currently Running - Ready=true
Jan 14 23:11:05.796: INFO: Waiting for pod ss2-1 to enter Running - Ready=true, currently Running - Ready=true
Jan 14 23:11:05.796: INFO: Waiting for pod ss2-2 to enter Running - Ready=true, currently Pending - Ready=false
Jan 14 23:11:15.796: INFO: Waiting for pod ss2-0 to enter Running - Ready=true, currently Running - Ready=true
Jan 14 23:11:15.796: INFO: Waiting for pod ss2-1 to enter Running - Ready=true, currently Running - Ready=true
Jan 14 23:11:15.796: INFO: Waiting for pod ss2-2 to enter Running - Ready=true, currently Running - Ready=true
STEP: Updating stateful set template: update image from registry.k8s.io/e2e-test-images/httpd:2.4.38-2 to registry.k8s.io/e2e-test-images/httpd:2.4.39-2 01/14/23 23:11:16.225
Jan 14 23:11:16.532: INFO: Updating stateful set ss2
STEP: Creating a new revision 01/14/23 23:11:16.532
STEP: Not applying an update when the partition is greater than the number of replicas 01/14/23 23:11:16.818
STEP: Performing a canary update 01/14/23 23:11:16.818
Jan 14 23:11:17.117: INFO: Updating stateful set ss2
Jan 14 23:11:17.404: INFO: Waiting for Pod statefulset-6493/ss2-2 to have revision ss2-5d8c6ff87d update revision ss2-6557876d87
STEP: Restoring Pods to the correct revision when they are deleted 01/14/23 23:11:27.691
Jan 14 23:11:28.166: INFO: Found 2 stateful pods, waiting for 3
Jan 14 23:11:38.309: INFO: Waiting for pod ss2-0 to enter Running - Ready=true, currently Running - Ready=true
Jan 14 23:11:38.309: INFO: Waiting for pod ss2-1 to enter Running - Ready=true, currently Running - Ready=true
Jan 14 23:11:38.309: INFO: Waiting for pod ss2-2 to enter Running - Ready=true, currently Running - Ready=true
STEP: Performing a phased rolling update 01/14/23 23:11:38.594
Jan 14 23:11:38.904: INFO: Updating stateful set ss2
Jan 14 23:11:39.187: INFO: Waiting for Pod statefulset-6493/ss2-1 to have revision ss2-5d8c6ff87d update revision ss2-6557876d87
Jan 14 23:11:49.343: FAIL: Failed waiting for state update: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/apps/v1/namespaces/statefulset-6493/statefulsets/ss2": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/apps.waitForPartitionedRollingUpdate({0x7ca2818, 0xc0033d0480}, 0xc0005fcf00) test/e2e/apps/wait.go:46 +0x268
k8s.io/kubernetes/test/e2e/apps.glob..func10.2.8() test/e2e/apps/statefulset.go:470 +0x27e5
[AfterEach] Basic StatefulSet functionality [StatefulSetBasic] test/e2e/apps/statefulset.go:122
Jan 14 23:11:49.499: INFO: Deleting all statefulset in ns statefulset-6493
Jan 14 23:11:49.650: INFO: Unexpected error: <*url.Error | 0xc0016a51a0>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/apps/v1/namespaces/statefulset-6493/statefulsets", Err: <*net.OpError | 0xc0014dd180>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc0017f41e0>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc000e7c160>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }
Jan 14 23:11:49.650: FAIL: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/apps/v1/namespaces/statefulset-6493/statefulsets": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework/statefulset.DeleteAllStatefulSets({0x7ca2818, 0xc0033d0480}, {0xc000e23c60, 0x10}) test/e2e/framework/statefulset/rest.go:75 +0x133
k8s.io/kubernetes/test/e2e/apps.glob..func10.2.2() test/e2e/apps/statefulset.go:127 +0x1b2
[AfterEach] [sig-apps] StatefulSet test/e2e/framework/framework.go:187
STEP: Collecting events from namespace "statefulset-6493". 01/14/23 23:11:49.651
Jan 14 23:11:49.803: INFO: Unexpected error: failed to list events in namespace "statefulset-6493": <*url.Error | 0xc0016a5a40>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/statefulset-6493/events", Err: <*net.OpError | 0xc0014dd4f0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc001898c90>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc000e7c780>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }
Jan 14 23:11:49.803: FAIL: failed to list events in namespace "statefulset-6493": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/statefulset-6493/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc003bbf590, {0xc000e23c60, 0x10}) test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc0033d0480}, {0xc000e23c60, 0x10}) test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc000d8fb80, 0x2?) test/e2e/framework/framework.go:181 +0x8b
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000d8fb80) test/e2e/framework/framework.go:435 +0x21d
STEP: Destroying namespace "statefulset-6493" for this suite. 01/14/23 23:11:49.803
Jan 14 23:11:49.955: FAIL: Couldn't delete ns: "statefulset-6493": Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/statefulset-6493": dial tcp 52.67.139.60:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/statefulset-6493", Err:(*net.OpError)(0xc0017011d0)})
Full Stack Trace
panic({0x6ea2520, 0xc000c02f40}) /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc0000440e0}) /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc000932480, 0x103}, {0xc003bbf048?, 0x735bfcc?, 0xc003bbf068?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc000794780, 0xee}, {0xc003bbf0e0?, 0xc0016f7440?, 0xc003bbf108?}) test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c34da0, 0xc0016a5a40}, {0xc000e7c7c0?, 0x0?, 0x0?}) test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc003bbf590, {0xc000e23c60, 0x10}) test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc0033d0480}, {0xc000e23c60, 0x10}) test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc000d8fb80, 0x2?) test/e2e/framework/framework.go:181 +0x8b
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000d8fb80) test/e2e/framework/framework.go:435 +0x21d
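The canary and phased steps of the StatefulSet test above rely on the RollingUpdate partition rule: pods with ordinal >= partition move to the update revision, lower ordinals keep the current one, and a partition greater than the replica count (the "Not applying an update" step) updates nothing. A minimal sketch of that ordinal rule, using a hypothetical helper rather than the controller's code, with the two revision hashes taken from the log:

```go
package main

import "fmt"

// revisionFor reports which controller revision a pod with the given ordinal
// should run under a partitioned RollingUpdate: ordinals >= partition get the
// update revision, the rest keep the current one. Illustrative helper only.
func revisionFor(ordinal, partition int, current, update string) string {
	if ordinal >= partition {
		return update
	}
	return current
}

func main() {
	cur, upd := "ss2-5d8c6ff87d", "ss2-6557876d87"
	// Canary: partition=2 on 3 replicas updates only ss2-2.
	for ordinal := 0; ordinal < 3; ordinal++ {
		fmt.Printf("ss2-%d -> %s\n", ordinal, revisionFor(ordinal, 2, cur, upd))
	}
	// Partition greater than the replica count: no pod is updated.
	fmt.Println(revisionFor(2, 4, cur, upd)) // ss2-5d8c6ff87d
}
```

Lowering the partition step by step is what the "phased rolling update" portion of the test exercises.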
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-apps\]\sStatefulSet\sMinReadySeconds\sshould\sbe\shonored\swhen\senabled$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000d52580) test/e2e/framework/framework.go:244 +0x7bf (from junit_01.xml)
{"msg":"FAILED [sig-apps] StatefulSet MinReadySeconds should be honored when enabled","completed":0,"skipped":19,"failed":2,"failures":["External Storage [Driver: ebs.csi.aws.com] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral should create read/write inline ephemeral volume","[sig-apps] StatefulSet MinReadySeconds should be honored when enabled"]}
[BeforeEach] [sig-apps] StatefulSet test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:11:46.642
Jan 14 23:11:46.642: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename statefulset 01/14/23 23:11:46.643
Jan 14 23:11:46.795: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
(The same "Unexpected error while creating namespace ... connection refused" message repeated at ~2s intervals through Jan 14 23:12:32.405.)
Jan 14 23:12:32.405: INFO: Unexpected error: <*errors.errorString | 0xc000113bc0>: { s: "timed out waiting for the condition", }
Jan 14 23:12:32.405: FAIL: timed out waiting for the condition
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000d52580) test/e2e/framework/framework.go:244 +0x7bf
[AfterEach] [sig-apps] StatefulSet test/e2e/framework/framework.go:187
Jan 14 23:12:32.405: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:12:32.558: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-cli\]\sKubectl\sclient\sSimple\spod\sshould\sreturn\scommand\sexit\scodes\srunning\sa\ssuccessful\scommand$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605 (from junit_01.xml)
{"msg":"FAILED [sig-cli] Kubectl client Simple pod should return command exit codes running a successful command","completed":0,"skipped":5,"failed":1,"failures":["[sig-cli] Kubectl client Simple pod should return command exit codes running a successful command"]}
[BeforeEach] [sig-cli] Kubectl client test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:07.59
Jan 14 23:10:07.590: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename kubectl 01/14/23 23:10:07.591
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:08.023
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:08.305
[BeforeEach] [sig-cli] Kubectl client test/e2e/kubectl/kubectl.go:272
[BeforeEach] Simple pod test/e2e/kubectl/kubectl.go:409
STEP: creating the pod from 01/14/23 23:10:08.59
Jan 14 23:10:08.590: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=kubectl-5439 create -f -'
Jan 14 23:10:11.083: INFO: stderr: ""
Jan 14 23:10:11.083: INFO: stdout: "pod/httpd created\n"
Jan 14 23:10:11.083: INFO: Waiting up to 5m0s for 1 pods to be running and ready: [httpd]
Jan 14 23:10:11.084: INFO: Waiting up to 5m0s for pod "httpd" in namespace "kubectl-5439" to be "running and ready"
Jan 14 23:10:11.228: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 144.057785ms
Jan 14 23:10:11.228: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending'
Jan 14 23:10:13.371: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false.
Elapsed: 2.287254843s Jan 14 23:10:13.371: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending' Jan 14 23:10:15.370: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 4.286924684s Jan 14 23:10:15.370: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending' Jan 14 23:10:17.370: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 6.286877817s Jan 14 23:10:17.370: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending' Jan 14 23:10:19.400: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 8.316643859s Jan 14 23:10:19.400: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending' Jan 14 23:10:21.370: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 10.286784432s Jan 14 23:10:21.370: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending' Jan 14 23:10:23.370: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 12.286733031s Jan 14 23:10:23.370: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending' Jan 14 23:10:25.374: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. Elapsed: 14.290369444s Jan 14 23:10:25.374: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending' Jan 14 23:10:27.370: INFO: Pod "httpd": Phase="Pending", Reason="", readiness=false. 
Elapsed: 16.28694801s Jan 14 23:10:27.371: INFO: Error evaluating pod condition running and ready: want pod 'httpd' on 'i-07f0d0bc50c0f4aa8' to be 'Running' but was 'Pending' Jan 14 23:10:29.372: INFO: Pod "httpd": Phase="Running", Reason="", readiness=false. Elapsed: 18.288547686s Jan 14 23:10:29.372: INFO: Error evaluating pod condition running and ready: pod 'httpd' on 'i-07f0d0bc50c0f4aa8' didn't have condition {Ready True}; conditions: [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:11 +0000 UTC }] Jan 14 23:10:31.381: INFO: Pod "httpd": Phase="Running", Reason="", readiness=false. Elapsed: 20.297118172s Jan 14 23:10:31.381: INFO: Error evaluating pod condition running and ready: pod 'httpd' on 'i-07f0d0bc50c0f4aa8' didn't have condition {Ready True}; conditions: [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:11 +0000 UTC }] Jan 14 23:10:33.371: INFO: Pod "httpd": Phase="Running", Reason="", readiness=false. 
Elapsed: 22.287356812s Jan 14 23:10:33.371: INFO: Error evaluating pod condition running and ready: pod 'httpd' on 'i-07f0d0bc50c0f4aa8' didn't have condition {Ready True}; conditions: [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:11 +0000 UTC }] Jan 14 23:10:35.370: INFO: Pod "httpd": Phase="Running", Reason="", readiness=false. Elapsed: 24.286827218s Jan 14 23:10:35.370: INFO: Error evaluating pod condition running and ready: pod 'httpd' on 'i-07f0d0bc50c0f4aa8' didn't have condition {Ready True}; conditions: [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:11 +0000 UTC }] Jan 14 23:10:37.371: INFO: Pod "httpd": Phase="Running", Reason="", readiness=false. 
Elapsed: 26.287224682s Jan 14 23:10:37.371: INFO: Error evaluating pod condition running and ready: pod 'httpd' on 'i-07f0d0bc50c0f4aa8' didn't have condition {Ready True}; conditions: [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:11 +0000 UTC }] Jan 14 23:10:39.371: INFO: Pod "httpd": Phase="Running", Reason="", readiness=false. Elapsed: 28.287656058s Jan 14 23:10:39.371: INFO: Error evaluating pod condition running and ready: pod 'httpd' on 'i-07f0d0bc50c0f4aa8' didn't have condition {Ready True}; conditions: [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:11 +0000 UTC }] Jan 14 23:10:41.373: INFO: Pod "httpd": Phase="Running", Reason="", readiness=false. 
Elapsed: 30.289949381s Jan 14 23:10:41.374: INFO: Error evaluating pod condition running and ready: pod 'httpd' on 'i-07f0d0bc50c0f4aa8' didn't have condition {Ready True}; conditions: [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:10 +0000 UTC ContainersNotReady containers with unready status: [httpd]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:11 +0000 UTC }] Jan 14 23:10:43.370: INFO: Pod "httpd": Phase="Running", Reason="", readiness=true. Elapsed: 32.286816243s Jan 14 23:10:43.370: INFO: Pod "httpd" satisfied condition "running and ready" Jan 14 23:10:43.370: INFO: Wanted all 1 pods to be running and ready. Result: true. Pods: [httpd] [It] running a successful command test/e2e/kubectl/kubectl.go:542 Jan 14 23:10:43.370: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=kubectl-5439 run -i --image=registry.k8s.io/e2e-test-images/busybox:1.29-2 --restart=Never --pod-running-timeout=2m0s success -- /bin/sh -c exit 0' Jan 14 23:10:50.835: INFO: stderr: "" Jan 14 23:10:50.835: INFO: stdout: "" [AfterEach] Simple pod test/e2e/kubectl/kubectl.go:415 STEP: using delete to clean up resources 01/14/23 23:10:50.835 Jan 14 23:10:50.835: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=kubectl-5439 delete --grace-period=0 --force -f -' Jan 14 23:10:51.503: INFO: stderr: "Warning: Immediate deletion does not wait for confirmation that the
running resource has been terminated. The resource may continue to run on the cluster indefinitely.\n" Jan 14 23:10:51.503: INFO: stdout: "pod \"httpd\" force deleted\n" Jan 14 23:10:51.503: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=kubectl-5439 get rc,svc -l name=httpd --no-headers' Jan 14 23:10:52.181: INFO: stderr: "No resources found in kubectl-5439 namespace.\n" Jan 14 23:10:52.181: INFO: stdout: "" Jan 14 23:10:52.181: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=kubectl-5439 get pods -l name=httpd -o go-template={{ range .items }}{{ if not .metadata.deletionTimestamp }}{{ .metadata.name }}{{ "\n" }}{{ end }}{{ end }}' Jan 14 23:10:52.751: INFO: stderr: "" Jan 14 23:10:52.751: INFO: stdout: "" [AfterEach] [sig-cli] Kubectl client test/e2e/framework/framework.go:187 Jan 14 23:10:52.751: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:10:53.036: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:10:55.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:10:57.182: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:10:59.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:01.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:03.184: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:05.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:07.181: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:09.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:11.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:13.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:15.184: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:17.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:19.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:21.181: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:23.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:25.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:27.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:29.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:31.180: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:33.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:35.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:37.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:39.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:41.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:43.179: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:45.191: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace STEP: Destroying namespace "kubectl-5439" for this suite. 01/14/23 23:11:45.191 STEP: Collecting events from namespace "kubectl-5439".
01/14/23 23:11:45.345 Jan 14 23:11:45.497: INFO: Unexpected error: failed to list events in namespace "kubectl-5439": <*url.Error | 0xc002963e30>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/kubectl-5439/events", Err: <*net.OpError | 0xc0035d36d0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc003f72180>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc0040a26e0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Jan 14 23:11:45.497: FAIL: failed to list events in namespace "kubectl-5439": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/kubectl-5439/events": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc002f12278, {0xc0001f6350, 0xc}) test/e2e/framework/util.go:901 +0x191 k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc000c53b00}, {0xc0001f6350, 0xc}) test/e2e/framework/util.go:919 +0x8d k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:402 +0x81d panic({0x6ea2520, 0xc003ffed40}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d panic({0x6ea4740, 0xc00035e310}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc001d11500, 0xd5}, {0xc002f135a8?, 0x735bfcc?, 0xc002f135d0?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197 k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc000c53b00?}, {0xc002f13890?, 0x73647b4?, 0x7?}) test/e2e/framework/log.go:51 +0x12c k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000d39340) test/e2e/framework/framework.go:483 +0xb8a
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-cli\]\sKubectl\sclient\sSimple\spod\sshould\ssupport\sinline\sexecution\sand\sattach$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000c0b340) test/e2e/framework/framework.go:244 +0x7bf from junit_01.xml
{"msg":"FAILED [sig-cli] Kubectl client Simple pod should support inline execution and attach","completed":2,"skipped":42,"failed":2,"failures":["[sig-storage] PVC Protection Verify that PVC in active use by a pod is not removed immediately","[sig-cli] Kubectl client Simple pod should support inline execution and attach"]} [BeforeEach] [sig-cli] Kubectl client test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:45.946 Jan 14 23:11:45.946: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename kubectl 01/14/23 23:11:45.947 Jan 14 23:11:46.098: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.251: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.251: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:52.255: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:54.253: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:56.252: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443:
connect: connection refused Jan 14 23:11:58.258: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:00.254: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:02.252: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:04.251: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:06.254: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:08.252: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:10.251: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:12.250: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:14.251: INFO: Unexpected error while creating namespace: Post 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.739: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.891: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.891: INFO: Unexpected error: <*errors.errorString | 0xc000205c00>: { s: "timed out waiting for the condition", } Jan 14 23:12:31.891: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000c0b340) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-cli] Kubectl client test/e2e/framework/framework.go:187 Jan 14 23:12:31.892: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:32.047: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sConntrack\sshould\sbe\sable\sto\spreserve\sUDP\straffic\swhen\sserver\spod\scycles\sfor\sa\sClusterIP\sservice$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000f2cdc0) test/e2e/framework/framework.go:244 +0x7bf from junit_01.xml
{"msg":"FAILED [sig-network] Conntrack should be able to preserve UDP traffic when server pod cycles for a ClusterIP service","completed":2,"skipped":27,"failed":2,"failures":["[sig-node] Security Context should support pod.Spec.SecurityContext.RunAsUser And pod.Spec.SecurityContext.RunAsGroup [LinuxOnly] [Conformance]","[sig-network] Conntrack should be able to preserve UDP traffic when server pod cycles for a ClusterIP service"]} [BeforeEach] [sig-network] Conntrack test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:46.241 Jan 14 23:11:46.241: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename conntrack 01/14/23 23:11:46.243 Jan 14 23:11:46.394: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.546: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.545: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:52.573: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:54.554: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:56.548: INFO: Unexpected error while creating namespace: Post
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:58.547: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:00.548: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:02.547: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:04.548: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:06.551: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:08.544: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:10.546: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:12.545: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection 
refused Jan 14 23:12:14.546: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.994: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.146: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.146: INFO: Unexpected error: <*errors.errorString | 0xc0001f1820>: { s: "timed out waiting for the condition", } Jan 14 23:12:32.146: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000f2cdc0) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-network] Conntrack test/e2e/framework/framework.go:187 Jan 14 23:12:32.146: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:32.298: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sDNS\sshould\sprovide\sDNS\sfor\sservices\s\s\[Conformance\]$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605 from junit_01.xml
{"msg":"FAILED [sig-network] DNS should provide DNS for services [Conformance]","completed":0,"skipped":22,"failed":1,"failures":["[sig-network] DNS should provide DNS for services [Conformance]"]} [BeforeEach] [sig-network] DNS test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:10:07.613 Jan 14 23:10:07.613: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename dns 01/14/23 23:10:07.614 STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:08.043 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:08.327 [It] should provide DNS for services [Conformance] test/e2e/network/dns.go:137 STEP: Creating a test headless service 01/14/23 23:10:08.61 STEP: Running these commands on wheezy: for i in `seq 1 600`; do check="$$(dig +notcp +noall +answer +search dns-test-service.dns-1584.svc.cluster.local A)" && test -n "$$check" && echo OK > /results/wheezy_udp@dns-test-service.dns-1584.svc.cluster.local;check="$$(dig +tcp +noall +answer +search dns-test-service.dns-1584.svc.cluster.local A)" && test -n "$$check" && echo OK > /results/wheezy_tcp@dns-test-service.dns-1584.svc.cluster.local;check="$$(dig +notcp +noall +answer +search _http._tcp.dns-test-service.dns-1584.svc.cluster.local SRV)" && test -n "$$check" && echo OK > /results/wheezy_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local;check="$$(dig +tcp +noall +answer +search _http._tcp.dns-test-service.dns-1584.svc.cluster.local SRV)" && test -n "$$check" && echo OK > /results/wheezy_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local;check="$$(dig +notcp +noall +answer +search _http._tcp.test-service-2.dns-1584.svc.cluster.local SRV)" && test -n "$$check" && echo OK > /results/wheezy_udp@_http._tcp.test-service-2.dns-1584.svc.cluster.local;check="$$(dig
+tcp +noall +answer +search _http._tcp.test-service-2.dns-1584.svc.cluster.local SRV)" && test -n "$$check" && echo OK > /results/wheezy_tcp@_http._tcp.test-service-2.dns-1584.svc.cluster.local;check="$$(dig +notcp +noall +answer +search 203.185.70.100.in-addr.arpa. PTR)" && test -n "$$check" && echo OK > /results/100.70.185.203_udp@PTR;check="$$(dig +tcp +noall +answer +search 203.185.70.100.in-addr.arpa. PTR)" && test -n "$$check" && echo OK > /results/100.70.185.203_tcp@PTR;sleep 1; done 01/14/23 23:10:08.985 STEP: Running these commands on jessie: for i in `seq 1 600`; do check="$$(dig +notcp +noall +answer +search dns-test-service.dns-1584.svc.cluster.local A)" && test -n "$$check" && echo OK > /results/jessie_udp@dns-test-service.dns-1584.svc.cluster.local;check="$$(dig +tcp +noall +answer +search dns-test-service.dns-1584.svc.cluster.local A)" && test -n "$$check" && echo OK > /results/jessie_tcp@dns-test-service.dns-1584.svc.cluster.local;check="$$(dig +notcp +noall +answer +search _http._tcp.dns-test-service.dns-1584.svc.cluster.local SRV)" && test -n "$$check" && echo OK > /results/jessie_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local;check="$$(dig +tcp +noall +answer +search _http._tcp.dns-test-service.dns-1584.svc.cluster.local SRV)" && test -n "$$check" && echo OK > /results/jessie_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local;check="$$(dig +notcp +noall +answer +search _http._tcp.test-service-2.dns-1584.svc.cluster.local SRV)" && test -n "$$check" && echo OK > /results/jessie_udp@_http._tcp.test-service-2.dns-1584.svc.cluster.local;check="$$(dig +tcp +noall +answer +search _http._tcp.test-service-2.dns-1584.svc.cluster.local SRV)" && test -n "$$check" && echo OK > /results/jessie_tcp@_http._tcp.test-service-2.dns-1584.svc.cluster.local;check="$$(dig +notcp +noall +answer +search 203.185.70.100.in-addr.arpa.
PTR)" && test -n "$$check" && echo OK > /results/100.70.185.203_udp@PTR;check="$$(dig +tcp +noall +answer +search 203.185.70.100.in-addr.arpa. PTR)" && test -n "$$check" && echo OK > /results/100.70.185.203_tcp@PTR;sleep 1; done 01/14/23 23:10:08.985 STEP: creating a pod to probe DNS 01/14/23 23:10:08.985 STEP: submitting the pod to kubernetes 01/14/23 23:10:08.985 Jan 14 23:10:09.141: INFO: Waiting up to 15m0s for pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7" in namespace "dns-1584" to be "running" Jan 14 23:10:09.302: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 160.748525ms Jan 14 23:10:11.446: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 2.304276848s Jan 14 23:10:13.445: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 4.303354123s Jan 14 23:10:15.446: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 6.304378195s Jan 14 23:10:17.448: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 8.30638736s Jan 14 23:10:19.449: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 10.307398929s Jan 14 23:10:21.448: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 12.30704171s Jan 14 23:10:23.446: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 14.304805261s Jan 14 23:10:25.447: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false.
Elapsed: 16.305576882s Jan 14 23:10:27.450: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 18.308376171s Jan 14 23:10:29.446: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 20.304783656s Jan 14 23:10:31.454: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Pending", Reason="", readiness=false. Elapsed: 22.312954705s Jan 14 23:10:33.446: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7": Phase="Running", Reason="", readiness=true. Elapsed: 24.304439152s Jan 14 23:10:33.446: INFO: Pod "dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7" satisfied condition "running" STEP: retrieving the pod 01/14/23 23:10:33.446 STEP: looking for the results for each expected name from probers 01/14/23 23:10:33.589 Jan 14 23:10:33.732: INFO: Unable to read wheezy_udp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:33.876: INFO: Unable to read wheezy_tcp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:34.022: INFO: Unable to read wheezy_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:34.165: INFO: Unable to read wheezy_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7)
Jan 14 23:10:34.883: INFO: Unable to read jessie_udp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:35.026: INFO: Unable to read jessie_tcp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:35.169: INFO: Unable to read jessie_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:35.312: INFO: Unable to read jessie_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:35.885: INFO: Lookups using dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7 failed for: [wheezy_udp@dns-test-service.dns-1584.svc.cluster.local wheezy_tcp@dns-test-service.dns-1584.svc.cluster.local wheezy_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local wheezy_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local jessie_udp@dns-test-service.dns-1584.svc.cluster.local jessie_tcp@dns-test-service.dns-1584.svc.cluster.local jessie_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local jessie_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local] Jan 14 23:10:41.033: INFO: Unable to read wheezy_udp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:41.176: INFO: Unable to read 
wheezy_tcp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:41.320: INFO: Unable to read wheezy_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:41.464: INFO: Unable to read wheezy_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:42.182: INFO: Unable to read jessie_udp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:42.325: INFO: Unable to read jessie_tcp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:42.470: INFO: Unable to read jessie_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:42.614: INFO: Unable to read jessie_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:43.192: INFO: Lookups using dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7 failed for: 
[wheezy_udp@dns-test-service.dns-1584.svc.cluster.local wheezy_tcp@dns-test-service.dns-1584.svc.cluster.local wheezy_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local wheezy_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local jessie_udp@dns-test-service.dns-1584.svc.cluster.local jessie_tcp@dns-test-service.dns-1584.svc.cluster.local jessie_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local jessie_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local] Jan 14 23:10:46.030: INFO: Unable to read wheezy_udp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:46.182: INFO: Unable to read wheezy_tcp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:46.417: INFO: Unable to read wheezy_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:46.563: INFO: Unable to read wheezy_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:47.284: INFO: Unable to read jessie_udp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:47.431: INFO: Unable to read jessie_tcp@dns-test-service.dns-1584.svc.cluster.local from pod 
dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:47.576: INFO: Unable to read jessie_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:47.739: INFO: Unable to read jessie_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:48.365: INFO: Lookups using dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7 failed for: [wheezy_udp@dns-test-service.dns-1584.svc.cluster.local wheezy_tcp@dns-test-service.dns-1584.svc.cluster.local wheezy_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local wheezy_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local jessie_udp@dns-test-service.dns-1584.svc.cluster.local jessie_tcp@dns-test-service.dns-1584.svc.cluster.local jessie_udp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local jessie_tcp@_http._tcp.dns-test-service.dns-1584.svc.cluster.local] Jan 14 23:10:51.030: INFO: Unable to read wheezy_udp@dns-test-service.dns-1584.svc.cluster.local from pod dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7: the server could not find the requested resource (get pods dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7) Jan 14 23:10:53.372: INFO: Lookups using dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7 failed for: [wheezy_udp@dns-test-service.dns-1584.svc.cluster.local] Jan 14 23:10:58.234: INFO: DNS probes using dns-1584/dns-test-4b95a7f9-9c22-43e5-98c8-deb5cc282da7 succeeded �[1mSTEP:�[0m deleting the pod �[38;5;243m01/14/23 23:10:58.234�[0m �[1mSTEP:�[0m deleting the test service �[38;5;243m01/14/23 
23:10:58.394
STEP: deleting the test headless service 01/14/23 23:10:58.565
[AfterEach] [sig-network] DNS test/e2e/framework/framework.go:187
Jan 14 23:10:58.726: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:10:59.011: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
    (the same "Condition Ready ... is false" message repeated every ~2s from 23:11:01.157 through 23:11:43.158)
Jan 14 23:11:45.164: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
STEP: Destroying namespace "dns-1584" for this suite. 01/14/23 23:11:45.165
STEP: Collecting events from namespace "dns-1584". 01/14/23 23:11:45.318
Jan 14 23:11:45.471: INFO: Unexpected error: failed to list events in namespace "dns-1584": <*url.Error | 0xc002b58bd0>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/dns-1584/events", Err: <*net.OpError | 0xc0025a2910>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc002b58ba0>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc002ad7320>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }
Jan 14 23:11:45.472: FAIL: failed to list events in namespace "dns-1584": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/dns-1584/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc000fe0278, {0xc002f536f0, 0x8}) test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc0001b3380}, {0xc002f536f0, 0x8}) test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:402 +0x81d
panic({0x6ea2520, 0xc002a6f380}) /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc000837960}) /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc00008c540, 0xd5}, {0xc000fe15a8?, 0x735bfcc?, 0xc000fe15d0?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc0001b3380?}, {0xc000fe1890?, 0x735d369?, 0x3?}) test/e2e/framework/log.go:51 +0x12c
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000e08420) test/e2e/framework/framework.go:483 +0xb8a
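The probe pod's loop at the top of this test record keeps re-running `dig` queries and drops an `OK` marker file under `/results` for each name that resolves. A minimal standalone sketch of that marker-file pattern, with a stubbed `lookup` function standing in for the real `dig +noall +answer +search NAME TYPE` call (the stub and its hard-coded answer are illustrative, not from the framework):

```shell
# Standalone sketch of the DNS probe loop seen in this log.
# `lookup` is a stand-in for the real dig invocation; the marker-file
# logic mirrors the probe pod: write OK only for non-empty answers.
RESULTS_DIR="$(mktemp -d)"   # stand-in for the pod's /results volume

lookup() {
  # Stand-in resolver: pretend only cluster-local service names resolve.
  case "$1" in
    *.svc.cluster.local) echo "100.69.250.237" ;;
    *) : ;;                  # empty answer, like a failed dig lookup
  esac
}

probe_once() {
  name="$1"; proto="$2"
  check="$(lookup "$name")"
  test -n "$check" && echo OK > "$RESULTS_DIR/${name}@${proto}"
}

probe_once dns-test-service.dns-1584.svc.cluster.local udp
probe_once does-not-exist.example. udp || true

ls "$RESULTS_DIR"            # only the resolvable name leaves a marker
```

The test's "looking for the results" phase then just checks which marker files exist; a missing file means that lookup has not yet succeeded, which is why the probe retries until all names report OK.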
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sProxy\sversion\sv1\sA\sset\sof\svalid\sresponses\sare\sreturned\sfor\sboth\spod\sand\sservice\sProxyWithPath\s\[Conformance\]$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000e7be40) test/e2e/framework/framework.go:244 +0x7bf (from junit_01.xml)
{"msg":"FAILED [sig-network] Proxy version v1 A set of valid responses are returned for both pod and service ProxyWithPath [Conformance]","completed":0,"skipped":29,"failed":2,"failures":["External Storage [Driver: ebs.csi.aws.com] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral should support two pods which have the same volume definition","[sig-network] Proxy version v1 A set of valid responses are returned for both pod and service ProxyWithPath [Conformance]"]}
[BeforeEach] version v1 test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:11:45.749
Jan 14 23:11:45.749: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename proxy 01/14/23 23:11:45.75
Jan 14 23:11:45.905: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
    (the same "Unexpected error while creating namespace ... connection refused" message repeated every ~2s through 23:12:31.633)
Jan 14 23:12:31.633: INFO: Unexpected error: <*errors.errorString | 0xc0002378f0>: { s: "timed out waiting for the condition", }
Jan 14 23:12:31.633: FAIL: timed out waiting for the condition
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000e7be40) test/e2e/framework/framework.go:244 +0x7bf
[AfterEach] version v1 test/e2e/framework/framework.go:187
Jan 14 23:12:31.633: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:12:31.788: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
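The namespace-creation failure above follows a retry-until-timeout pattern: the framework retries the request roughly every 2s (23:11:45 through 23:12:31 in this log, ~46s) and then fails with "timed out waiting for the condition". A hedged sketch of that control flow, with `create_ns` as a stand-in for the real POST to /api/v1/namespaces (here it always fails, as it did in this run, because the API server was unreachable):

```shell
# Sketch of the retry-until-timeout loop behind the BeforeEach failure.
# create_ns, max_attempts, and the sleep interval are illustrative
# stand-ins, not the framework's actual names or values.
attempts=0
max_attempts=5

create_ns() { return 1; }   # stand-in: dial tcp ... connection refused

while ! create_ns; do
  attempts=$((attempts + 1))
  if [ "$attempts" -ge "$max_attempts" ]; then
    echo "FAIL: timed out waiting for the condition" >&2
    break
  fi
  sleep 0.1                 # the real loop waited ~2s between attempts
done
echo "attempts=$attempts"
```

Because the AfterEach node-readiness check hits the same unreachable API server, the test then fails a second time, which is why this record counts two failures.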
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-network\]\sServices\sshould\sbe\sable\sto\sup\sand\sdown\sservices$'
test/e2e/network/service.go:1148 k8s.io/kubernetes/test/e2e/network.glob..func25.9() test/e2e/network/service.go:1148 +0x410 (from junit_01.xml)
{"msg":"FAILED [sig-network] Services should be able to up and down services","completed":0,"skipped":19,"failed":1,"failures":["[sig-network] Services should be able to up and down services"]}
[BeforeEach] [sig-network] Services test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:07.111
Jan 14 23:10:07.111: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename services 01/14/23 23:10:07.112
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:07.547
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:07.83
[BeforeEach] [sig-network] Services test/e2e/network/service.go:758
[It] should be able to up and down services test/e2e/network/service.go:1132
STEP: creating up-down-1 in namespace services-1503 01/14/23 23:10:08.113
STEP: creating service up-down-1 in namespace services-1503 01/14/23 23:10:08.113
STEP: creating replication controller up-down-1 in namespace services-1503 01/14/23 23:10:08.262
I0114 23:10:08.406867 6622 runners.go:193] Created replication controller with name: up-down-1, namespace: services-1503, replica count: 3
I0114 23:10:11.558645 6622 runners.go:193] up-down-1 Pods: 3 out of 3 created, 0 running, 3 pending, 0 waiting, 0 inactive, 0 terminating, 0 unknown, 0 runningButNotReady
    (readiness polled every ~3s; all 3 up-down-1 pods running at 23:10:20.560547)
STEP: creating up-down-2 in namespace services-1503 01/14/23 23:10:20.702
STEP: creating service up-down-2 in namespace services-1503 01/14/23 23:10:20.702
STEP: creating replication controller up-down-2 in namespace services-1503 01/14/23 23:10:20.857
I0114 23:10:21.002350 6622 runners.go:193] Created replication controller with name: up-down-2, namespace: services-1503, replica count: 3
    (readiness polled every ~3s; all 3 up-down-2 pods running at 23:10:27.153595)
STEP: verifying service up-down-1 is up 01/14/23 23:10:27.295
Jan 14 23:10:27.296: INFO: Creating new host exec pod
Jan 14 23:10:27.442: INFO: Waiting up to 5m0s for pod "verify-service-up-host-exec-pod" in namespace "services-1503" to be "running and ready"
Jan 14 23:10:27.585: INFO: Pod "verify-service-up-host-exec-pod": Phase="Pending", Reason="", readiness=false. Elapsed: 142.662076ms
    (the same Pending poll repeated every ~2s through 23:10:39.731, Elapsed: 12.288781053s)
Jan 14 23:10:41.728: INFO: Pod "verify-service-up-host-exec-pod": Phase="Running", Reason="", readiness=true. Elapsed: 14.286110533s
Jan 14 23:10:41.728: INFO: Pod "verify-service-up-host-exec-pod" satisfied condition "running and ready"
Jan 14 23:10:41.728: INFO: Creating new exec pod
Jan 14 23:10:41.877: INFO: Waiting up to 5m0s for pod "verify-service-up-exec-pod-czpsd" in namespace "services-1503" to be "running"
    (Pending polls repeated every ~2s; Running after 6.301350773s)
Jan 14 23:10:48.178: INFO: Pod "verify-service-up-exec-pod-czpsd" satisfied condition "running"
STEP: verifying service has 3 reachable backends 01/14/23 23:10:48.178
Jan 14 23:10:48.178: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod
Jan 14 23:10:48.178: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done'
Jan 14 23:10:50.032: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n... (the "+ wget" / "+ echo" trace pair repeats for all 150 iterations; log truncated here)
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:10:50.033: INFO: stdout: 
"up-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-
86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\n" Jan 14 23:10:50.033: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:10:55.033: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:10:55.033: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:10:56.760: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:10:56.760: INFO: stdout: 
"up-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-
fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\n" Jan 14 23:10:56.760: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:11:01.760: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:11:01.760: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:11:03.577: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:11:03.577: INFO: stdout: 
"up-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-
86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\n" Jan 14 23:11:03.578: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:11:08.578: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:11:08.578: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:11:10.676: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:11:10.676: INFO: stdout: 
"up-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-
sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\n" Jan 14 23:11:10.676: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:11:15.677: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:11:15.677: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:11:17.551: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:11:17.551: INFO: stdout: 
"up-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-
fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\n" Jan 14 23:11:17.551: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:11:22.552: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:11:22.552: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:11:24.303: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:11:24.303: INFO: stdout: 
"up-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-
sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\n" Jan 14 23:11:24.303: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:11:29.303: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:11:29.303: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:11:31.175: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:11:31.176: INFO: stdout: 
"up-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-
fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\n" Jan 14 23:11:31.176: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:11:36.177: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:11:36.177: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:11:38.083: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:11:38.083: INFO: stdout: 
"up-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-
86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\n" Jan 14 23:11:38.083: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:11:43.083: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:11:43.084: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:11:45.018: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:11:45.019: INFO: stdout: 
"up-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-
86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\n" Jan 14 23:11:45.019: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:11:50.019: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:11:50.019: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:11:50.233: INFO: rc: 1 Jan 14 23:11:50.233: INFO: error while kubectl execing "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod: error running /home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done: Command stdout: stderr: The connection to the server 
api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io was refused - did you specify the right host or port? error: exit status 1 Output: Jan 14 23:11:50.233: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] [… the same probe is retried at 23:11:55, 23:12:00, 23:12:05, 23:12:10, 23:12:16, and 23:12:36; each kubectl exec fails identically with rc 1 and stderr: The connection to the server api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io was refused - did you specify the right host or port? …]
error: exit status 1 Output: Jan 14 23:12:36.976: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:12:41.976: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:12:41.976: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:12:44.039: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n [… the wget/echo pair repeats for all 150 iterations …]\n" Jan 14 23:12:44.039: INFO: stdout: "up-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\n [… 150 responses, all from these three pods; up-down-1-rbj4h never answers …]\n" Jan 14 23:12:44.040: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:12:49.040: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:12:49.041: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl 
--server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:12:50.905: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n [… the wget/echo pair repeats for all 150 iterations …]\n" Jan 14 23:12:50.905: INFO: stdout: "up-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\n [… responses continue to alternate among these three pods only; up-down-1-rbj4h is still absent …]\nup-down-1-fcdxx\n
up-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\n" Jan 14 23:12:50.905: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:12:55.906: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:12:55.906: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:12:57.838: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q 
-O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q 
-O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q 
-O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q 
-O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:12:57.839: INFO: stdout: 
"up-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-
fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\nup-down-1-fcdxx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-fcdxx\nup-down-1-fcdxx\nup-down-1-86crx\n" Jan 14 23:12:57.839: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:13:02.840: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:13:02.840: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:13:28.603: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ 
echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ true\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:13:28.603: INFO: stdout: "wget: download timed out\n\nup-down-1-86crx\nup-down-1-86crx\nwget: download timed out\n\nwget: download timed out\n\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nwget: download timed out\n\nup-down-1-sf2rt\nup-down-1-86crx\nwget: download timed out\n\nup-down-1-sf2rt\nwget: download 
timed out\n\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nwget: download timed out\n\nwget: download timed out\n\nup-down-1-sf2rt\nup-down-1-sf2rt\nwget: download timed out\n\nwget: download timed out\n\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nwget: download timed out\n\nwget: download timed out\n\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nwget: download timed out\n\nwget: download timed out\n\nwget: download timed out\n\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nwget: download timed out\n\nwget: download timed out\n\nwget: download timed out\n\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nwget: download timed out\n\nwget: download timed out\n\nwget: download timed out\n\nup-down-1-86crx\nwget: download timed out\n\nwget: download timed out\n\nup-down-1-sf2rt\nwget: download timed out\n\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-d
own-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\n" Jan 14 23:13:28.603: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:13:33.603: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:13:33.603: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:13:35.316: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 
1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n  [… "+ wget -q -O - -T 1 http://100.69.250.237:80" / "+ echo" repeated for the remaining of the 150 iterations]"
Jan 14 23:13:35.316: INFO: stdout: "up-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\n  [… 150 responses in total, all from up-down-1-s45cs, up-down-1-86crx, and up-down-1-sf2rt]"
Jan 14 23:13:35.316: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}]
Jan 14 23:13:40.317: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod
Jan 14 23:13:40.317: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done'
Jan 14 23:13:42.755: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n  [… repeated for all 150 iterations]"
Jan 14 23:13:42.755: INFO: stdout: "up-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\n  [… 150 responses, again only from up-down-1-s45cs, up-down-1-86crx, and up-down-1-sf2rt]"
Jan 14 23:13:42.755: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}]
Jan 14 23:13:47.755: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod
Jan 14 23:13:47.755: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done'
Jan 14 23:13:49.792: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n  [… repeated for all 150 iterations]"
Jan 14 23:13:49.792: INFO: stdout: "up-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\n  [… 150 responses, again only from up-down-1-s45cs, up-down-1-86crx, and up-down-1-sf2rt]"
Jan 14 23:13:49.792: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}]
Jan 14 23:13:54.793: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod
Jan 14 23:13:54.793: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done'
Jan 14 23:13:56.639: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n  [… repeated for all 150 iterations]"
Jan 14 23:13:56.639: INFO: stdout: 
"up-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\n[... further responses, all from up-down-1-86crx, up-down-1-s45cs, and up-down-1-sf2rt ...]\n" Jan 14 23:13:56.639: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:01.640: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:01.640: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:03.361: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[... the wget/echo pair repeats for the remaining probes ...]\n" Jan 14 23:14:03.362: INFO: stdout: 
"up-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\n[... further responses, all from up-down-1-86crx, up-down-1-s45cs, and up-down-1-sf2rt ...]\n" Jan 14 23:14:03.362: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:08.362: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:08.362: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:10.283: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[... the wget/echo pair repeats for the remaining probes ...]\n" Jan 14 23:14:10.283: INFO: stdout: 
"up-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\n[... further responses, all from up-down-1-86crx, up-down-1-s45cs, and up-down-1-sf2rt ...]\n" Jan 14 23:14:10.283: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:15.283: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:15.283: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:17.518: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[... the wget/echo pair repeats; log truncated mid-output ...]
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:14:17.518: INFO: stdout: 
"up-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-
sf2rt\n... (responses from up-down-1-s45cs, up-down-1-sf2rt, and up-down-1-86crx only)" Jan 14 23:14:17.518: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:22.518: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:22.518: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:24.425: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n... (the wget/echo pair repeats for all 150 iterations)" Jan 14 23:14:24.425: INFO: stdout: 
"up-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\n... (responses from up-down-1-s45cs, up-down-1-sf2rt, and up-down-1-86crx only)" Jan 14 23:14:24.425: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:29.425: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:29.425: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:31.279: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n... (the wget/echo pair repeats for all 150 iterations)" Jan 14 23:14:31.280: INFO: stdout: 
"up-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-
s45cs\nup-down-1-s45cs\nup-down-1-86crx\n[...remaining responses elided; all 150 came from up-down-1-86crx, up-down-1-s45cs, and up-down-1-sf2rt...]\n" Jan 14 23:14:31.280: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:36.280: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:36.280: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:37.987: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[...the wget/echo pair repeats identically for all 150 iterations; remainder elided...]\n" Jan 14 23:14:37.987: INFO: stdout: 
"up-down-1-sf2rt\nup-down-1-s45cs\n[...remaining responses elided; all 150 came from up-down-1-86crx, up-down-1-s45cs, and up-down-1-sf2rt...]\n" Jan 14 23:14:37.987: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:42.988: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:42.988: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:45.219: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[...the wget/echo pair repeats identically for all 150 iterations; remainder elided...]\n" Jan 14 23:14:45.220: INFO: stdout: 
"up-down-1-sf2rt\nup-down-1-86crx\n[...remaining responses elided; all 150 came from up-down-1-86crx, up-down-1-s45cs, and up-down-1-sf2rt...]\n" Jan 14 23:14:45.220: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:50.220: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:50.221: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:52.287: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[...the wget/echo pair repeats identically for all 150 iterations; remainder elided...]\n" Jan 14 23:14:52.287: INFO: stdout: 
"up-down-1-sf2rt\nup-down-1-sf2rt\n[...remaining responses elided; all 150 came from up-down-1-86crx, up-down-1-s45cs, and up-down-1-sf2rt...]\n" Jan 14 23:14:52.287: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:14:57.287: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:14:57.287: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:14:59.726: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[...the wget/echo pair repeats identically; trace continues...]\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:14:59.726: INFO: stdout: 
"up-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-
sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\n" Jan 14 23:14:59.726: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:15:04.726: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:15:04.726: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:15:06.803: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:15:06.803: INFO: stdout: 
"up-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-
sf2rt\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-86crx\nup-down-1-sf2rt\nup-down-1-86crx\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\n" Jan 14 23:15:06.803: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:15:11.803: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:15:11.803: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:15:13.535: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 
http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n" Jan 14 23:15:13.535: INFO: stdout: 
"up-down-1-s45cs\nup-down-1-s45cs\nup-down-1-sf2rt\nup-down-1-86crx\n[... responses from up-down-1-s45cs, up-down-1-sf2rt and up-down-1-86crx repeated through the rest of the 150 requests ...]\n" Jan 14 23:15:13.535: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:15:18.536: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:15:18.536: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:15:20.635: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n
[... "+ wget" / "+ echo" lines repeated for the remaining iterations ...]\n" Jan 14 23:15:20.635: INFO: stdout: 
"up-down-1-s45cs\nup-down-1-86crx\nup-down-1-s45cs\nup-down-1-sf2rt\n[... responses from up-down-1-s45cs, up-down-1-86crx and up-down-1-sf2rt repeated through the rest of the 150 requests ...]\n" Jan 14 23:15:20.635: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:15:25.635: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:15:25.635: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:15:27.383: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n
[... "+ wget" / "+ echo" lines repeated for the remaining iterations ...]\n" Jan 14 23:15:27.383: INFO: stdout: 
"up-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\nup-down-1-s45cs\n[... responses from up-down-1-86crx, up-down-1-sf2rt and up-down-1-s45cs repeated through the rest of the 150 requests ...]\n" Jan 14 23:15:27.383: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:15:32.384: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:15:32.384: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:15:34.117: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n
[... "+ wget" / "+ echo" lines repeated for the remaining iterations ...]\n" Jan 14 23:15:34.117: INFO: stdout: 
"up-down-1-s45cs\nup-down-1-s45cs\nup-down-1-86crx\n[… remainder of the 150-probe output; every reply came from up-down-1-s45cs, up-down-1-86crx, or up-down-1-sf2rt …]\n" Jan 14 23:15:34.117: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:15:39.117: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:15:39.118: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:15:40.920: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[… "+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo" repeated for the remaining 149 iterations …]\n" Jan 14 23:15:40.920: INFO: stdout: 
"up-down-1-86crx\nup-down-1-s45cs\nup-down-1-s45cs\n[… remainder of the 150-probe output; every reply came from up-down-1-86crx, up-down-1-s45cs, or up-down-1-sf2rt …]\n" Jan 14 23:15:40.920: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}] Jan 14 23:15:45.921: INFO: Executing cmd "for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done" in pod services-1503/verify-service-up-host-exec-pod Jan 14 23:15:45.921: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=services-1503 exec verify-service-up-host-exec-pod -- /bin/sh -x -c for i in $(seq 1 150); do wget -q -O - -T 1 http://100.69.250.237:80 2>&1 || true; echo; done' Jan 14 23:15:47.647: INFO: stderr: "+ seq 1 150\n+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo\n[… "+ wget -q -O - -T 1 http://100.69.250.237:80\n+ echo" repeated for the remaining 149 iterations …]\n" Jan 14 23:15:47.647: INFO: stdout: 
"up-down-1-86crx\nup-down-1-sf2rt\nup-down-1-sf2rt\n[… remainder of the 150-probe output; every reply came from up-down-1-86crx, up-down-1-sf2rt, or up-down-1-s45cs …]\n" Jan 14 23:15:47.647: INFO: Unable to reach the following endpoints of service 100.69.250.237: map[up-down-1-rbj4h:{}]
STEP: Deleting pod verify-service-up-host-exec-pod in namespace services-1503 01/14/23 23:15:52.647
STEP: Deleting pod verify-service-up-exec-pod-czpsd in namespace services-1503 01/14/23 23:15:53.113
Jan 14 23:15:53.266: INFO: Unexpected error: <*errors.errorString | 0xc0013e02f0>: { s: "service verification failed for: 100.69.250.237\nexpected [up-down-1-86crx up-down-1-fcdxx up-down-1-rbj4h]\nreceived [up-down-1-86crx up-down-1-fcdxx up-down-1-s45cs up-down-1-sf2rt wget: download timed out]", }
Jan 14 23:15:53.266: FAIL: service verification failed for: 100.69.250.237
expected [up-down-1-86crx up-down-1-fcdxx up-down-1-rbj4h]
received [up-down-1-86crx up-down-1-fcdxx up-down-1-s45cs up-down-1-sf2rt wget: download timed out]
Full Stack Trace
k8s.io/kubernetes/test/e2e/network.glob..func25.9() test/e2e/network/service.go:1148 +0x410
[AfterEach] [sig-network] Services test/e2e/framework/framework.go:187
STEP: Collecting events from namespace "services-1503". 01/14/23 23:15:53.267
STEP: Found 74 events. 
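The FAIL above is a set mismatch: the verification loop expects replies only from the backend pods the test originally recorded for the service, but here up-down-1-rbj4h never answered while two other pods (plus a wget timeout) did. A minimal, illustrative sketch of that expected-vs-received comparison, using the pod names copied from this log (the `comm`-based helper is not the framework's actual Go code at test/e2e/network/service.go, just a shell restatement of the check):

```shell
# Expected vs received backend pod names, as reported in the failure above.
printf '%s\n' up-down-1-86crx up-down-1-fcdxx up-down-1-rbj4h | sort > expected.txt
printf '%s\n' up-down-1-86crx up-down-1-fcdxx up-down-1-s45cs up-down-1-sf2rt | sort > received.txt

# comm requires sorted input: suppressing columns leaves
#   -23 -> lines only in expected.txt (expected but silent)
#   -13 -> lines only in received.txt (unexpected responders)
echo "missing:";    comm -23 expected.txt received.txt
echo "unexpected:"; comm -13 expected.txt received.txt
```

Against this log's values it reports up-down-1-rbj4h as missing and up-down-1-s45cs / up-down-1-sf2rt as unexpected, which lines up with the Killing event for up-down-1-rbj4h in the event list below: the backend set changed after the test captured its expected endpoints.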
01/14/23 23:15:53.552
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:08 +0000 UTC - event for up-down-1: {replication-controller } SuccessfulCreate: Created pod: up-down-1-86crx
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:08 +0000 UTC - event for up-down-1: {replication-controller } SuccessfulCreate: Created pod: up-down-1-rbj4h
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:08 +0000 UTC - event for up-down-1: {replication-controller } SuccessfulCreate: Created pod: up-down-1-fcdxx
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:08 +0000 UTC - event for up-down-1-86crx: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-1-86crx to i-066162bc3bb041a75
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:08 +0000 UTC - event for up-down-1-fcdxx: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-1-fcdxx to i-0fcd1e4b56ac2b41b
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:08 +0000 UTC - event for up-down-1-fcdxx: {kubelet i-0fcd1e4b56ac2b41b} Pulling: Pulling image "registry.k8s.io/e2e-test-images/agnhost:2.40"
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:08 +0000 UTC - event for up-down-1-rbj4h: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-1-rbj4h to i-0dcc151940a349dcc
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:09 +0000 UTC - event for up-down-1-86crx: {kubelet i-066162bc3bb041a75} FailedMount: MountVolume.SetUp failed for volume "kube-api-access-f4llv" : failed to sync configmap cache: timed out waiting for the condition
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:09 +0000 UTC - event for up-down-1-rbj4h: {kubelet i-0dcc151940a349dcc} Pulling: Pulling image "registry.k8s.io/e2e-test-images/agnhost:2.40"
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:11 +0000 UTC - event for up-down-1-86crx: {kubelet i-066162bc3bb041a75} Pulling: Pulling image "registry.k8s.io/e2e-test-images/agnhost:2.40"
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:15 +0000 UTC - event for up-down-1-rbj4h: {kubelet i-0dcc151940a349dcc} Started: Started container up-down-1
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:15 +0000 UTC - event for up-down-1-rbj4h: {kubelet i-0dcc151940a349dcc} Created: Created container up-down-1
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:15 +0000 UTC - event for up-down-1-rbj4h: {kubelet i-0dcc151940a349dcc} Pulled: Successfully pulled image "registry.k8s.io/e2e-test-images/agnhost:2.40" in 6.402909091s
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:16 +0000 UTC - event for up-down-1-fcdxx: {kubelet i-0fcd1e4b56ac2b41b} Started: Started container up-down-1
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:16 +0000 UTC - event for up-down-1-fcdxx: {kubelet i-0fcd1e4b56ac2b41b} Created: Created container up-down-1
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:16 +0000 UTC - event for up-down-1-fcdxx: {kubelet i-0fcd1e4b56ac2b41b} Pulled: Successfully pulled image "registry.k8s.io/e2e-test-images/agnhost:2.40" in 7.751599217s
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:19 +0000 UTC - event for up-down-1-86crx: {kubelet i-066162bc3bb041a75} Pulled: Successfully pulled image "registry.k8s.io/e2e-test-images/agnhost:2.40" in 8.626672627s
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:19 +0000 UTC - event for up-down-1-86crx: {kubelet i-066162bc3bb041a75} Created: Created container up-down-1
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:19 +0000 UTC - event for up-down-1-86crx: {kubelet i-066162bc3bb041a75} Started: Started container up-down-1
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:20 +0000 UTC - event for up-down-2: {replication-controller } SuccessfulCreate: Created pod: up-down-2-h9qb6
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:20 +0000 UTC - event for up-down-2: {replication-controller } SuccessfulCreate: Created pod: up-down-2-npbns
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:20 +0000 UTC - event for up-down-2: {replication-controller } SuccessfulCreate: Created pod: up-down-2-kb2nf
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:20 +0000 UTC - event for up-down-2-h9qb6: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-2-h9qb6 to i-066162bc3bb041a75
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:20 +0000 UTC - event for up-down-2-kb2nf: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-2-kb2nf to i-0dcc151940a349dcc
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:20 +0000 UTC - event for up-down-2-npbns: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-2-npbns to i-07f0d0bc50c0f4aa8
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:21 +0000 UTC - event for up-down-2-kb2nf: {kubelet i-0dcc151940a349dcc} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:21 +0000 UTC - event for up-down-2-kb2nf: {kubelet i-0dcc151940a349dcc} Created: Created container up-down-2
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:22 +0000 UTC - event for up-down-2-h9qb6: {kubelet i-066162bc3bb041a75} Started: Started container up-down-2
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:22 +0000 UTC - event for up-down-2-h9qb6: {kubelet i-066162bc3bb041a75} Created: Created container up-down-2
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:22 +0000 UTC - event for up-down-2-h9qb6: {kubelet i-066162bc3bb041a75} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:22 +0000 UTC - event for up-down-2-kb2nf: {kubelet i-0dcc151940a349dcc} Started: Started container up-down-2
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:23 +0000 UTC - event for up-down-2-npbns: {kubelet i-07f0d0bc50c0f4aa8} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:23 +0000 UTC - event for up-down-2-npbns: {kubelet i-07f0d0bc50c0f4aa8} Created: Created container up-down-2
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:23 +0000 UTC - event for up-down-2-npbns: {kubelet i-07f0d0bc50c0f4aa8} Started: Started container up-down-2
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:27 +0000 UTC - event for verify-service-up-host-exec-pod: {default-scheduler } Scheduled: Successfully assigned services-1503/verify-service-up-host-exec-pod to i-066162bc3bb041a75
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:29 +0000 UTC - event for verify-service-up-host-exec-pod: {kubelet i-066162bc3bb041a75} Created: Created container agnhost-container
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:29 +0000 UTC - event for verify-service-up-host-exec-pod: {kubelet i-066162bc3bb041a75} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:29 +0000 UTC - event for verify-service-up-host-exec-pod: {kubelet i-066162bc3bb041a75} Started: Started container agnhost-container
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:41 +0000 UTC - event for verify-service-up-exec-pod-czpsd: {default-scheduler } Scheduled: Successfully assigned services-1503/verify-service-up-exec-pod-czpsd to i-066162bc3bb041a75
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:43 +0000 UTC - event for verify-service-up-exec-pod-czpsd: {kubelet i-066162bc3bb041a75} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:43 +0000 UTC - event for verify-service-up-exec-pod-czpsd: {kubelet i-066162bc3bb041a75} Created: Created container agnhost-container
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:44 +0000 UTC - event for up-down-1-rbj4h: {kubelet i-0dcc151940a349dcc} Killing: Stopping container up-down-1
Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:44 +0000 UTC - event for up-down-2-kb2nf: {kubelet i-0dcc151940a349dcc} Killing: Stopping container up-down-2
Jan 14 23:15:53.552: INFO: At 2023-01-14 
23:10:44 +0000 UTC - event for verify-service-up-exec-pod-czpsd: {kubelet i-066162bc3bb041a75} Started: Started container agnhost-container Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:46 +0000 UTC - event for up-down-2: {replication-controller } SuccessfulCreate: Created pod: up-down-2-zhpqb Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:46 +0000 UTC - event for up-down-2-zhpqb: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-2-zhpqb to i-0fcd1e4b56ac2b41b Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:47 +0000 UTC - event for up-down-2-zhpqb: {kubelet i-0fcd1e4b56ac2b41b} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:47 +0000 UTC - event for up-down-2-zhpqb: {kubelet i-0fcd1e4b56ac2b41b} Created: Created container up-down-2 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:47 +0000 UTC - event for up-down-2-zhpqb: {kubelet i-0fcd1e4b56ac2b41b} Started: Started container up-down-2 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:48 +0000 UTC - event for up-down-1: {replication-controller } SuccessfulCreate: Created pod: up-down-1-sf2rt Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:48 +0000 UTC - event for up-down-1-sf2rt: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-1-sf2rt to i-07f0d0bc50c0f4aa8 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:51 +0000 UTC - event for up-down-1-sf2rt: {kubelet i-07f0d0bc50c0f4aa8} Started: Started container up-down-1 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:51 +0000 UTC - event for up-down-1-sf2rt: {kubelet i-07f0d0bc50c0f4aa8} Created: Created container up-down-1 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:10:51 +0000 UTC - event for up-down-1-sf2rt: {kubelet i-07f0d0bc50c0f4aa8} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine Jan 14 23:15:53.552: INFO: At 2023-01-14 23:12:56 +0000 UTC - event for up-down-1-fcdxx: {kubelet 
i-0fcd1e4b56ac2b41b} Killing: Stopping container up-down-1 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:12:56 +0000 UTC - event for up-down-2-zhpqb: {kubelet i-0fcd1e4b56ac2b41b} Killing: Stopping container up-down-2 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-1: {replication-controller } SuccessfulCreate: Created pod: up-down-1-s45cs Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-1-rbj4h: {taint-controller } TaintManagerEviction: Cancelling deletion of Pod services-1503/up-down-1-rbj4h Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-1-s45cs: {kubelet i-066162bc3bb041a75} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-1-s45cs: {kubelet i-066162bc3bb041a75} Started: Started container up-down-1 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-1-s45cs: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-1-s45cs to i-066162bc3bb041a75 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-1-s45cs: {kubelet i-066162bc3bb041a75} Created: Created container up-down-1 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-2: {replication-controller } SuccessfulCreate: Created pod: up-down-2-zx72p Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-2-kb2nf: {taint-controller } TaintManagerEviction: Cancelling deletion of Pod services-1503/up-down-2-kb2nf Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-2-zx72p: {default-scheduler } Scheduled: Successfully assigned services-1503/up-down-2-zx72p to i-066162bc3bb041a75 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-2-zx72p: {kubelet i-066162bc3bb041a75} Started: Started container up-down-2 Jan 14 
23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-2-zx72p: {kubelet i-066162bc3bb041a75} Pulled: Container image "registry.k8s.io/e2e-test-images/agnhost:2.40" already present on machine Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:27 +0000 UTC - event for up-down-2-zx72p: {kubelet i-066162bc3bb041a75} Created: Created container up-down-2 Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:37 +0000 UTC - event for up-down-1-fcdxx: {taint-controller } TaintManagerEviction: Cancelling deletion of Pod services-1503/up-down-1-fcdxx Jan 14 23:15:53.552: INFO: At 2023-01-14 23:13:37 +0000 UTC - event for up-down-2-zhpqb: {taint-controller } TaintManagerEviction: Cancelling deletion of Pod services-1503/up-down-2-zhpqb Jan 14 23:15:53.552: INFO: At 2023-01-14 23:14:37 +0000 UTC - event for up-down-1-sf2rt: {taint-controller } TaintManagerEviction: Cancelling deletion of Pod services-1503/up-down-1-sf2rt Jan 14 23:15:53.552: INFO: At 2023-01-14 23:14:37 +0000 UTC - event for up-down-2-npbns: {taint-controller } TaintManagerEviction: Cancelling deletion of Pod services-1503/up-down-2-npbns Jan 14 23:15:53.552: INFO: At 2023-01-14 23:15:53 +0000 UTC - event for verify-service-up-exec-pod-czpsd: {kubelet i-066162bc3bb041a75} Killing: Stopping container agnhost-container Jan 14 23:15:53.552: INFO: At 2023-01-14 23:15:53 +0000 UTC - event for verify-service-up-host-exec-pod: {kubelet i-066162bc3bb041a75} Killing: Stopping container agnhost-container Jan 14 23:15:53.695: INFO: POD NODE PHASE GRACE CONDITIONS Jan 14 23:15:53.695: INFO: up-down-1-86crx i-066162bc3bb041a75 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:08 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:20 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:20 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:08 +0000 UTC }] Jan 14 23:15:53.695: INFO: up-down-1-fcdxx i-0fcd1e4b56ac2b41b 
Failed [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:08 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:12:58 +0000 UTC PodFailed } {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:12:58 +0000 UTC PodFailed } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:08 +0000 UTC }]
Jan 14 23:15:53.695: INFO: up-down-1-rbj4h i-0dcc151940a349dcc Failed [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:08 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:47 +0000 UTC PodFailed } {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:47 +0000 UTC PodFailed } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:08 +0000 UTC }]
Jan 14 23:15:53.695: INFO: up-down-1-s45cs i-066162bc3bb041a75 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:13:27 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:13:28 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:13:28 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:13:27 +0000 UTC }]
Jan 14 23:15:53.695: INFO: up-down-1-sf2rt i-07f0d0bc50c0f4aa8 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:48 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:51 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:51 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:48 +0000 UTC }]
Jan 14 23:15:53.696: INFO: up-down-2-h9qb6 i-066162bc3bb041a75 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:20 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:22 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:22 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:20 +0000 UTC }]
Jan 14 23:15:53.696: INFO: up-down-2-kb2nf i-0dcc151940a349dcc Failed [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:20 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:47 +0000 UTC PodFailed } {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:47 +0000 UTC PodFailed } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:20 +0000 UTC }]
Jan 14 23:15:53.696: INFO: up-down-2-npbns i-07f0d0bc50c0f4aa8 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:20 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:24 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:24 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:20 +0000 UTC }]
Jan 14 23:15:53.696: INFO: up-down-2-zhpqb i-0fcd1e4b56ac2b41b Failed [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:46 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:12:58 +0000 UTC PodFailed } {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:12:58 +0000 UTC PodFailed } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:46 +0000 UTC }]
Jan 14 23:15:53.696: INFO: up-down-2-zx72p i-066162bc3bb041a75 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:13:27 +0000 UTC } {Ready True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:13:28 +0000 UTC } {ContainersReady True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:13:28 +0000 UTC } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:13:27 +0000 UTC }]
Jan 14 23:15:53.696: INFO:
Jan 14 23:15:54.585: INFO: Logging node info for node i-05871d1e8f8f620dd
Jan 14 23:15:54.727: INFO: Node Info: &Node{ObjectMeta:{i-05871d1e8f8f620dd 5e99d328-38f2-45be-9028-14cad2569742 4522 0 2023-01-14 23:01:59 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:c5.large beta.kubernetes.io/os:linux failure-domain.beta.kubernetes.io/region:sa-east-1
failure-domain.beta.kubernetes.io/zone:sa-east-1a kops.k8s.io/kops-controller-pki: kubernetes.io/arch:amd64 kubernetes.io/hostname:i-05871d1e8f8f620dd kubernetes.io/os:linux node-role.kubernetes.io/control-plane: node.kubernetes.io/exclude-from-external-load-balancers: node.kubernetes.io/instance-type:c5.large topology.ebs.csi.aws.com/zone:sa-east-1a topology.kubernetes.io/region:sa-east-1 topology.kubernetes.io/zone:sa-east-1a] map[csi.volume.kubernetes.io/nodeid:{"ebs.csi.aws.com":"i-05871d1e8f8f620dd"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{kubelet Update v1 2023-01-14 23:01:59 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/os":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{}}}} } {protokube Update v1 2023-01-14 23:02:25 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:kops.k8s.io/kops-controller-pki":{},"f:node-role.kubernetes.io/control-plane":{},"f:node.kubernetes.io/exclude-from-external-load-balancers":{}}}} } {aws-cloud-controller-manager Update v1 2023-01-14 23:02:47 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:beta.kubernetes.io/instance-type":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {aws-cloud-controller-manager Update v1 2023-01-14 23:02:47 +0000 UTC FieldsV1 {"f:status":{"f:addresses":{"k:{\"type\":\"ExternalDNS\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"ExternalIP\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"Hostname\"}":{"f:address":{}},"k:{\"type\":\"InternalDNS\"}":{".":{},"f:address":{},"f:type":{}}}}} status} {kubelet Update v1 2023-01-14 23:12:52 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.ebs.csi.aws.com/zone":{}}},"f:status":{"f:allocatable":{"f:memory":{}},"f:capacity":{"f:memory":{}},"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{},"f:nodeInfo":{"f:bootID":{},"f:kernelVersion":{},"f:osImage":{}}}} status} {kube-controller-manager Update v1 2023-01-14 23:13:27 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"100.96.0.0/24\"":{}},"f:taints":{}}} }]},Spec:NodeSpec{PodCIDR:100.96.0.0/24,DoNotUseExternalID:,ProviderID:aws:///sa-east-1a/i-05871d1e8f8f620dd,Unschedulable:false,Taints:[]Taint{Taint{Key:node-role.kubernetes.io/control-plane,Value:,Effect:NoSchedule,TimeAdded:<nil>,},},ConfigSource:nil,PodCIDRs:[100.96.0.0/24],},Status:NodeStatus{Capacity:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{47441653760 0} {<nil>} 46329740Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3866066944 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{42697488314 0} {<nil>} 42697488314 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3761209344 0} {<nil>} BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2023-01-14 23:12:52 +0000 UTC,LastTransitionTime:2023-01-14 23:01:51 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory 
available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2023-01-14 23:12:52 +0000 UTC,LastTransitionTime:2023-01-14 23:01:51 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2023-01-14 23:12:52 +0000 UTC,LastTransitionTime:2023-01-14 23:01:51 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2023-01-14 23:12:52 +0000 UTC,LastTransitionTime:2023-01-14 23:12:52 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:172.20.37.191,},NodeAddress{Type:ExternalIP,Address:52.67.139.60,},NodeAddress{Type:InternalDNS,Address:i-05871d1e8f8f620dd.sa-east-1.compute.internal,},NodeAddress{Type:Hostname,Address:i-05871d1e8f8f620dd.sa-east-1.compute.internal,},NodeAddress{Type:ExternalDNS,Address:ec2-52-67-139-60.sa-east-1.compute.amazonaws.com,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:ec209b16bb0f3c336388aae5fe811bcd,SystemUUID:ec209b16-bb0f-3c33-6388-aae5fe811bcd,BootID:fd5c09c8-74dd-4bcb-a074-8fb9eac7d227,KernelVersion:5.15.86-flatcar,OSImage:Flatcar Container Linux by Kinvolk 3446.1.0 (Oklo),ContainerRuntimeVersion:containerd://1.6.10,KubeletVersion:v1.25.5,KubeProxyVersion:v1.25.5,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[registry.k8s.io/etcdadm/etcd-manager@sha256:66a453db625abb268f4b3bbefc5a34a171d81e6e8796cecca54cfd71775c77c4 registry.k8s.io/etcdadm/etcd-manager:v3.0.20221209],SizeBytes:231502799,},ContainerImage{Names:[quay.io/cilium/cilium@sha256:06ce2b0a0a472e73334a7504ee5c5d8b2e2d7b72ef728ad94e564740dd505be5 
quay.io/cilium/cilium:v1.12.5],SizeBytes:166719855,},ContainerImage{Names:[registry.k8s.io/kube-apiserver-amd64:v1.25.5],SizeBytes:129100243,},ContainerImage{Names:[registry.k8s.io/kube-controller-manager-amd64:v1.25.5],SizeBytes:118446393,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.25.5],SizeBytes:63291081,},ContainerImage{Names:[registry.k8s.io/kube-scheduler-amd64:v1.25.5],SizeBytes:51931448,},ContainerImage{Names:[registry.k8s.io/kops/kops-controller:1.26.0-beta.2],SizeBytes:43010675,},ContainerImage{Names:[registry.k8s.io/kops/dns-controller:1.26.0-beta.2],SizeBytes:42821711,},ContainerImage{Names:[registry.k8s.io/provider-aws/aws-ebs-csi-driver@sha256:f0c5de192d832e7c1daa6580d4a62e8fa6fc8eabc0917ae4cb7ed4d15e95b59e registry.k8s.io/provider-aws/aws-ebs-csi-driver:v1.14.1],SizeBytes:29725845,},ContainerImage{Names:[quay.io/cilium/operator@sha256:a6d24a006a6b92967ac90786b49bc1ac26e5477cf028cd1186efcfc2466484db quay.io/cilium/operator:v1.12.5],SizeBytes:26802430,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:122bfb8c1edabb3c0edd63f06523e6940d958d19b3957dc7b1d6f81e9f1f6119 registry.k8s.io/sig-storage/csi-provisioner:v3.1.0],SizeBytes:23345856,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:9ebbf9f023e7b41ccee3d52afe39a89e3ddacdbb69269d583abfc25847cfd9e4 registry.k8s.io/sig-storage/csi-resizer:v1.4.0],SizeBytes:22381475,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:8b9c313c05f54fb04f8d430896f5f5904b6cb157df261501b29adc04d2b2dc7b registry.k8s.io/sig-storage/csi-attacher:v3.4.0],SizeBytes:22085298,},ContainerImage{Names:[registry.k8s.io/provider-aws/cloud-controller-manager@sha256:85d3f1e9dacc72531445989bb10999e1e70ebc409d11be57e5baa5f031a893b0 registry.k8s.io/provider-aws/cloud-controller-manager:v1.25.1],SizeBytes:18257577,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 
registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:406f59599991916d2942d8d02f076d957ed71b541ee19f09fc01723a6e6f5932 registry.k8s.io/sig-storage/livenessprobe:v2.6.0],SizeBytes:8240918,},ContainerImage{Names:[registry.k8s.io/kops/kube-apiserver-healthcheck:1.26.0-beta.2],SizeBytes:4965797,},ContainerImage{Names:[registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db registry.k8s.io/pause:3.6],SizeBytes:301773,},},VolumesInUse:[],VolumesAttached:[]AttachedVolume{},Config:nil,},}
Jan 14 23:15:54.728: INFO: Logging kubelet events for node i-05871d1e8f8f620dd
Jan 14 23:15:54.874: INFO: Logging pods the kubelet thinks is on node i-05871d1e8f8f620dd
Jan 14 23:15:55.173: INFO: etcd-manager-main-i-05871d1e8f8f620dd started at 2023-01-14 23:12:34 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container etcd-manager ready: true, restart count 1
Jan 14 23:15:55.173: INFO: aws-cloud-controller-manager-7vgbr started at 2023-01-14 23:02:32 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container aws-cloud-controller-manager ready: true, restart count 2
Jan 14 23:15:55.173: INFO: dns-controller-56d4f686f6-6slkg started at 2023-01-14 23:02:32 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container dns-controller ready: true, restart count 1
Jan 14 23:15:55.173: INFO: cilium-q95c2 started at 2023-01-14 23:02:32 +0000 UTC (1+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Init container clean-cilium-state ready: true, restart count 1
Jan 14 23:15:55.173: INFO: Container cilium-agent ready: true, restart count 2
Jan 14 23:15:55.173: INFO: etcd-manager-cilium-i-05871d1e8f8f620dd started at 2023-01-14 23:12:34 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container etcd-manager ready: true, restart count 1
Jan 14 23:15:55.173: INFO: etcd-manager-events-i-05871d1e8f8f620dd started at 2023-01-14 23:12:34 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container etcd-manager ready: true, restart count 1
Jan 14 23:15:55.173: INFO: kube-controller-manager-i-05871d1e8f8f620dd started at 2023-01-14 23:12:34 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container kube-controller-manager ready: true, restart count 4
Jan 14 23:15:55.173: INFO: ebs-csi-node-nn4hc started at 2023-01-14 23:02:32 +0000 UTC (0+3 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container ebs-plugin ready: true, restart count 1
Jan 14 23:15:55.173: INFO: Container liveness-probe ready: true, restart count 1
Jan 14 23:15:55.173: INFO: Container node-driver-registrar ready: true, restart count 1
Jan 14 23:15:55.173: INFO: kops-controller-qqwjj started at 2023-01-14 23:02:32 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container kops-controller ready: true, restart count 2
Jan 14 23:15:55.173: INFO: ebs-csi-controller-5847ff7bc-wrmhm started at 2023-01-14 23:02:32 +0000 UTC (0+5 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container csi-attacher ready: true, restart count 2
Jan 14 23:15:55.173: INFO: Container csi-provisioner ready: true, restart count 2
Jan 14 23:15:55.173: INFO: Container csi-resizer ready: true, restart count 1
Jan 14 23:15:55.173: INFO: Container ebs-plugin ready: true, restart count 1
Jan 14 23:15:55.173: INFO: Container liveness-probe ready: true, restart count 1
Jan 14 23:15:55.173: INFO: cilium-operator-87c56f985-98jnv started at 2023-01-14 23:02:32 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container cilium-operator ready: true, restart count 2
Jan 14 23:15:55.173: INFO: kube-scheduler-i-05871d1e8f8f620dd started at 2023-01-14 23:12:34 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container kube-scheduler ready: true, restart count 1
Jan 14 23:15:55.173: INFO: kube-apiserver-i-05871d1e8f8f620dd started at 2023-01-14 23:01:21 +0000 UTC (0+2 container statuses recorded)
Jan 14 23:15:55.173: INFO: Container healthcheck ready: true, restart count 1
Jan 14 23:15:55.173: INFO: Container kube-apiserver ready: true, restart count 2
Jan 14 23:15:55.701: INFO: Latency metrics for node i-05871d1e8f8f620dd
Jan 14 23:15:55.701: INFO: Logging node info for node i-066162bc3bb041a75
Jan 14 23:15:55.844: INFO: Node Info: &Node{ObjectMeta:{i-066162bc3bb041a75 0526e693-6fb3-418d-96b5-51e71caa90b0 10257 0 2023-01-14 23:03:45 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:t3.medium beta.kubernetes.io/os:linux failure-domain.beta.kubernetes.io/region:sa-east-1 failure-domain.beta.kubernetes.io/zone:sa-east-1a kubernetes.io/arch:amd64 kubernetes.io/hostname:i-066162bc3bb041a75 kubernetes.io/os:linux node-role.kubernetes.io/node: node.kubernetes.io/instance-type:t3.medium topology.ebs.csi.aws.com/zone:sa-east-1a topology.hostpath.csi/node:i-066162bc3bb041a75 topology.kubernetes.io/region:sa-east-1 topology.kubernetes.io/zone:sa-east-1a] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-ephemeral-31":"i-066162bc3bb041a75","ebs.csi.aws.com":"i-066162bc3bb041a75"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{aws-cloud-controller-manager Update v1 2023-01-14 23:03:45 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:beta.kubernetes.io/instance-type":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {aws-cloud-controller-manager Update v1 2023-01-14 23:03:45 +0000 UTC FieldsV1
{"f:status":{"f:addresses":{"k:{\"type\":\"ExternalDNS\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"ExternalIP\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"Hostname\"}":{"f:address":{}},"k:{\"type\":\"InternalDNS\"}":{".":{},"f:address":{},"f:type":{}}}}} status} {kops-controller Update v1 2023-01-14 23:03:45 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:node-role.kubernetes.io/node":{}}}} } {kubelet Update v1 2023-01-14 23:03:45 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/os":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{}}}} } {kube-controller-manager Update v1 2023-01-14 23:07:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"100.96.4.0/24\"":{}}}} } {kube-controller-manager Update v1 2023-01-14 23:15:54 +0000 UTC FieldsV1 {"f:status":{"f:volumesAttached":{}}} status} {kubelet Update v1 2023-01-14 23:15:55 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.ebs.csi.aws.com/zone":{},"f:topology.hostpath.csi/node":{}}},"f:status":{"f:allocatable":{"f:memory":{}},"f:capacity":{"f:memory":{}},"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{},"f:nodeInfo":{"f:bootID":{},"f:kernelVersion":{},"f:osImage":{}},"f:volumesInUse":{}}} 
status}]},Spec:NodeSpec{PodCIDR:100.96.4.0/24,DoNotUseExternalID:,ProviderID:aws:///sa-east-1a/i-066162bc3bb041a75,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[100.96.4.0/24],},Status:NodeStatus{Capacity:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{47441653760 0} {<nil>} 46329740Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{4054806528 0} {<nil>} 3959772Ki BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{42697488314 0} {<nil>} 42697488314 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3949948928 0} {<nil>} 3857372Ki BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:55 +0000 UTC,LastTransitionTime:2023-01-14 23:03:45 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:55 +0000 UTC,LastTransitionTime:2023-01-14 23:03:45 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:55 +0000 UTC,LastTransitionTime:2023-01-14 23:03:45 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2023-01-14 23:15:55 +0000 UTC,LastTransitionTime:2023-01-14 23:07:33 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready 
status,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:172.20.51.127,},NodeAddress{Type:ExternalIP,Address:18.231.122.66,},NodeAddress{Type:InternalDNS,Address:i-066162bc3bb041a75.sa-east-1.compute.internal,},NodeAddress{Type:Hostname,Address:i-066162bc3bb041a75.sa-east-1.compute.internal,},NodeAddress{Type:ExternalDNS,Address:ec2-18-231-122-66.sa-east-1.compute.amazonaws.com,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:ec2e86a7c421dab27d8bac20db5aee15,SystemUUID:ec2e86a7-c421-dab2-7d8b-ac20db5aee15,BootID:ab9d24e4-d4dc-42b2-b4eb-6eadf4c9e778,KernelVersion:5.15.86-flatcar,OSImage:Flatcar Container Linux by Kinvolk 3446.1.0 (Oklo),ContainerRuntimeVersion:containerd://1.6.10,KubeletVersion:v1.25.5,KubeProxyVersion:v1.25.5,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[quay.io/cilium/cilium@sha256:06ce2b0a0a472e73334a7504ee5c5d8b2e2d7b72ef728ad94e564740dd505be5 quay.io/cilium/cilium:v1.12.5],SizeBytes:166719855,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.25.5],SizeBytes:63291081,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:af7e3857d87770ddb40f5ea4f89b5a2709504ab1ee31f9ea4ab5823c045f2146 registry.k8s.io/e2e-test-images/agnhost:2.40],SizeBytes:51155161,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nautilus@sha256:99c0d6f1ad24a1aa1905d9c6534d193f268f7b23f9add2ae6bb41f31094bdd5c registry.k8s.io/e2e-test-images/nautilus:1.5],SizeBytes:49642095,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:1b9d1b2f36cb2dbee1960e82a9344aeb11bd4c4c03abf5e1853e0559c23855e3 registry.k8s.io/e2e-test-images/httpd:2.4.38-2],SizeBytes:40764680,},ContainerImage{Names:[registry.k8s.io/provider-aws/aws-ebs-csi-driver@sha256:f0c5de192d832e7c1daa6580d4a62e8fa6fc8eabc0917ae4cb7ed4d15e95b59e 
registry.k8s.io/provider-aws/aws-ebs-csi-driver:v1.14.1],SizeBytes:29725845,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:122bfb8c1edabb3c0edd63f06523e6940d958d19b3957dc7b1d6f81e9f1f6119 registry.k8s.io/sig-storage/csi-provisioner:v3.1.0],SizeBytes:23345856,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:6477988532358148d2e98f7c747db4e9250bbc7ad2664bf666348abf9ee1f5aa registry.k8s.io/sig-storage/csi-provisioner:v3.0.0],SizeBytes:22728994,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:9ebbf9f023e7b41ccee3d52afe39a89e3ddacdbb69269d583abfc25847cfd9e4 registry.k8s.io/sig-storage/csi-resizer:v1.4.0],SizeBytes:22381475,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:89e900a160a986a1a7a4eba7f5259e510398fa87ca9b8a729e7dec59e04c7709 registry.k8s.io/sig-storage/csi-snapshotter:v5.0.1],SizeBytes:22163966,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:8b9c313c05f54fb04f8d430896f5f5904b6cb157df261501b29adc04d2b2dc7b registry.k8s.io/sig-storage/csi-attacher:v3.4.0],SizeBytes:22085298,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:80dec81b679a733fda448be92a2331150d99095947d04003ecff3dbd7f2a476a registry.k8s.io/sig-storage/csi-attacher:v3.3.0],SizeBytes:21444261,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:6029c252dae6178c99b580de72d7776158edbc81be0de15cedc4152a3acfed18 registry.k8s.io/sig-storage/hostpathplugin:v1.7.3],SizeBytes:15224494,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:4fd21f36075b44d1a423dfb262ad79202ce54e95f5cbc4622a6c1c38ab287ad6 
registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.0],SizeBytes:9132637,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:f9bcee63734b7b01555ee8fc8fb01ac2922478b2c8934bf8d468dd2916edc405 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.3.0],SizeBytes:8582494,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:406f59599991916d2942d8d02f076d957ed71b541ee19f09fc01723a6e6f5932 registry.k8s.io/sig-storage/livenessprobe:v2.6.0],SizeBytes:8240918,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nginx@sha256:55d0552eb6538050ea7741e46b35d27eccffeeaed7010f9f2bad0a89c149bc6f registry.k8s.io/e2e-test-images/nginx:1.15-2],SizeBytes:7000509,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nginx@sha256:13616070e3f29de4417eee434a8ef472221c9e51b3d037b5a6b46cef08eb7443 registry.k8s.io/e2e-test-images/nginx:1.14-2],SizeBytes:6979041,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},ContainerImage{Names:[registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db registry.k8s.io/pause:3.6],SizeBytes:301773,},},VolumesInUse:[kubernetes.io/csi/csi-hostpath-ephemeral-31^46cc37a0-9461-11ed-bb3c-12092b7cfbec kubernetes.io/csi/ebs.csi.aws.com^vol-000cfe054c6351b33 kubernetes.io/csi/ebs.csi.aws.com^vol-00de75837a003506d 
kubernetes.io/csi/ebs.csi.aws.com^vol-041f8d115c8ba31e7],VolumesAttached:[]AttachedVolume{AttachedVolume{Name:kubernetes.io/csi/ebs.csi.aws.com^vol-000cfe054c6351b33,DevicePath:,},AttachedVolume{Name:kubernetes.io/csi/ebs.csi.aws.com^vol-00de75837a003506d,DevicePath:,},AttachedVolume{Name:kubernetes.io/csi/csi-hostpath-ephemeral-31^46cc37a0-9461-11ed-bb3c-12092b7cfbec,DevicePath:,},},Config:nil,},}
Jan 14 23:15:55.844: INFO: Logging kubelet events for node i-066162bc3bb041a75
Jan 14 23:15:55.993: INFO: Logging pods the kubelet thinks is on node i-066162bc3bb041a75
Jan 14 23:15:56.143: INFO: ebs-csi-node-275bh started at 2023-01-14 23:03:47 +0000 UTC (0+3 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container ebs-plugin ready: true, restart count 1
Jan 14 23:15:56.143: INFO: Container liveness-probe ready: true, restart count 1
Jan 14 23:15:56.143: INFO: Container node-driver-registrar ready: true, restart count 1
Jan 14 23:15:56.143: INFO: pod-client started at 2023-01-14 23:15:22 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container pod-client ready: true, restart count 0
Jan 14 23:15:56.143: INFO: pod-460711b7-9971-4d0e-924e-2577d4d9e7dc started at 2023-01-14 23:15:45 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container write-pod ready: false, restart count 0
Jan 14 23:15:56.143: INFO: pod-subpath-test-dynamicpv-hfdn started at 2023-01-14 23:15:46 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container test-container-subpath-dynamicpv-hfdn ready: false, restart count 0
Jan 14 23:15:56.143: INFO: deployment-shared-unset-79c9978db8-x42vc started at 2023-01-14 23:15:01 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container nginx ready: false, restart count 0
Jan 14 23:15:56.143: INFO: up-down-2-zx72p started at 2023-01-14 23:13:27 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container up-down-2 ready: true, restart count 0
Jan 14 23:15:56.143: INFO: inline-volume-rvxx9 started at 2023-01-14 23:15:17 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container volume-tester ready: false, restart count 0
Jan 14 23:15:56.143: INFO: sample-webhook-deployment-5d85dd8cdb-7gpv4 started at 2023-01-14 23:10:42 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container sample-webhook ready: true, restart count 0
Jan 14 23:15:56.143: INFO: up-down-2-h9qb6 started at 2023-01-14 23:10:20 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container up-down-2 ready: true, restart count 0
Jan 14 23:15:56.143: INFO: inline-volume-tester-9kvds started at 2023-01-14 23:15:04 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container csi-volume-tester ready: true, restart count 0
Jan 14 23:15:56.143: INFO: cilium-bncbp started at 2023-01-14 23:07:23 +0000 UTC (1+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Init container clean-cilium-state ready: true, restart count 0
Jan 14 23:15:56.143: INFO: Container cilium-agent ready: true, restart count 0
Jan 14 23:15:56.143: INFO: up-down-1-s45cs started at 2023-01-14 23:13:27 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container up-down-1 ready: true, restart count 0
Jan 14 23:15:56.143: INFO: inline-volume-tester-vxrvt started at 2023-01-14 23:10:17 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container csi-volume-tester ready: true, restart count 0
Jan 14 23:15:56.143: INFO: csi-hostpathplugin-0 started at 2023-01-14 23:14:57 +0000 UTC (0+7 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container csi-attacher ready: true, restart count 0
Jan 14 23:15:56.143: INFO: Container csi-provisioner ready: true, restart count 0
Jan 14 23:15:56.143: INFO: Container csi-resizer ready: true, restart count 0
Jan 14 23:15:56.143: INFO: Container csi-snapshotter ready: true, restart count 0
Jan 14 23:15:56.143: INFO: Container hostpath ready: true, restart count 0
Jan 14 23:15:56.143: INFO: Container liveness-probe ready: true, restart count 0
Jan 14 23:15:56.143: INFO: Container node-driver-registrar ready: true, restart count 0
Jan 14 23:15:56.143: INFO: service-headless-toggled-vsvkj started at 2023-01-14 23:14:51 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container service-headless-toggled ready: true, restart count 0
Jan 14 23:15:56.143: INFO: service-headless-2fr9t started at 2023-01-14 23:14:44 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container service-headless ready: true, restart count 0
Jan 14 23:15:56.143: INFO: up-down-1-86crx started at 2023-01-14 23:10:08 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container up-down-1 ready: true, restart count 0
Jan 14 23:15:56.143: INFO: suspend-false-to-true-p769q started at 2023-01-14 23:15:14 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container c ready: true, restart count 0
Jan 14 23:15:56.143: INFO: simpletest.rc-l9vzk started at 2023-01-14 23:15:35 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container nginx ready: true, restart count 0
Jan 14 23:15:56.143: INFO: ss2-0 started at 2023-01-14 23:11:27 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:56.143: INFO: Container webserver ready: true, restart count 0
Jan 14 23:15:56.954: INFO: Latency metrics for node i-066162bc3bb041a75
Jan 14 23:15:56.954: INFO: Logging node info for node i-07f0d0bc50c0f4aa8
Jan 14 23:15:57.096: INFO: Node Info: &Node{ObjectMeta:{i-07f0d0bc50c0f4aa8 3227d16c-9ac6-4050-ac64-ead2defa0858 8193 0 2023-01-14 23:03:29 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:t3.medium beta.kubernetes.io/os:linux failure-domain.beta.kubernetes.io/region:sa-east-1 failure-domain.beta.kubernetes.io/zone:sa-east-1a kubernetes.io/arch:amd64
kubernetes.io/hostname:i-07f0d0bc50c0f4aa8 kubernetes.io/os:linux node-role.kubernetes.io/node: node.kubernetes.io/instance-type:t3.medium topology.ebs.csi.aws.com/zone:sa-east-1a topology.hostpath.csi/node:i-07f0d0bc50c0f4aa8 topology.kubernetes.io/region:sa-east-1 topology.kubernetes.io/zone:sa-east-1a] map[csi.volume.kubernetes.io/nodeid:{"csi-mock-csi-mock-volumes-5501":"i-07f0d0bc50c0f4aa8","ebs.csi.aws.com":"i-07f0d0bc50c0f4aa8"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{aws-cloud-controller-manager Update v1 2023-01-14 23:03:29 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:beta.kubernetes.io/instance-type":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {aws-cloud-controller-manager Update v1 2023-01-14 23:03:29 +0000 UTC FieldsV1 {"f:status":{"f:addresses":{"k:{\"type\":\"ExternalDNS\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"ExternalIP\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"Hostname\"}":{"f:address":{}},"k:{\"type\":\"InternalDNS\"}":{".":{},"f:address":{},"f:type":{}}}}} status} {kops-controller Update v1 2023-01-14 23:03:29 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:node-role.kubernetes.io/node":{}}}} } {kubelet Update v1 2023-01-14 23:03:29 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/os":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{}}}} } {kube-controller-manager Update v1 2023-01-14 23:13:27 +0000 UTC FieldsV1 {"f:status":{"f:volumesAttached":{}}} status} {kube-controller-manager Update v1 2023-01-14 23:14:36 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"100.96.1.0/24\"":{}}}} } {kubelet Update v1 2023-01-14 23:14:56 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.ebs.csi.aws.com/zone":{},"f:topology.hostpath.csi/node":{}}},"f:status":{"f:allocatable":{"f:ephemeral-storage":{},"f:memory":{}},"f:capacity":{"f:memory":{}},"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{},"f:nodeInfo":{"f:bootID":{},"f:kernelVersion":{},"f:osImage":{}},"f:volumesInUse":{}}} status}]},Spec:NodeSpec{PodCIDR:100.96.1.0/24,DoNotUseExternalID:,ProviderID:aws:///sa-east-1a/i-07f0d0bc50c0f4aa8,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[100.96.1.0/24],},Status:NodeStatus{Capacity:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{47441653760 0} {<nil>} 46329740Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{4054806528 0} {<nil>} 3959772Ki BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{42697488314 0} {<nil>} 42697488314 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3949948928 0} {<nil>} 3857372Ki BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2023-01-14 23:14:56 +0000 UTC,LastTransitionTime:2023-01-14 23:03:19 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory 
available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2023-01-14 23:14:56 +0000 UTC,LastTransitionTime:2023-01-14 23:03:19 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2023-01-14 23:14:56 +0000 UTC,LastTransitionTime:2023-01-14 23:03:19 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2023-01-14 23:14:56 +0000 UTC,LastTransitionTime:2023-01-14 23:14:35 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:172.20.40.234,},NodeAddress{Type:ExternalIP,Address:18.231.119.15,},NodeAddress{Type:InternalDNS,Address:i-07f0d0bc50c0f4aa8.sa-east-1.compute.internal,},NodeAddress{Type:Hostname,Address:i-07f0d0bc50c0f4aa8.sa-east-1.compute.internal,},NodeAddress{Type:ExternalDNS,Address:ec2-18-231-119-15.sa-east-1.compute.amazonaws.com,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:ec2e143f65373273e7301648c5c4c865,SystemUUID:ec2e143f-6537-3273-e730-1648c5c4c865,BootID:6ebc5109-de5f-4376-aa99-6e1d047f42c1,KernelVersion:5.15.86-flatcar,OSImage:Flatcar Container Linux by Kinvolk 3446.1.0 (Oklo),ContainerRuntimeVersion:containerd://1.6.10,KubeletVersion:v1.25.5,KubeProxyVersion:v1.25.5,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[quay.io/cilium/cilium@sha256:06ce2b0a0a472e73334a7504ee5c5d8b2e2d7b72ef728ad94e564740dd505be5 quay.io/cilium/cilium:v1.12.5],SizeBytes:166719855,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.25.5],SizeBytes:63291081,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:af7e3857d87770ddb40f5ea4f89b5a2709504ab1ee31f9ea4ab5823c045f2146 
registry.k8s.io/e2e-test-images/agnhost:2.40],SizeBytes:51155161,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:1b9d1b2f36cb2dbee1960e82a9344aeb11bd4c4c03abf5e1853e0559c23855e3 registry.k8s.io/e2e-test-images/httpd:2.4.38-2],SizeBytes:40764680,},ContainerImage{Names:[registry.k8s.io/provider-aws/aws-ebs-csi-driver@sha256:f0c5de192d832e7c1daa6580d4a62e8fa6fc8eabc0917ae4cb7ed4d15e95b59e registry.k8s.io/provider-aws/aws-ebs-csi-driver:v1.14.1],SizeBytes:29725845,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:6477988532358148d2e98f7c747db4e9250bbc7ad2664bf666348abf9ee1f5aa registry.k8s.io/sig-storage/csi-provisioner:v3.0.0],SizeBytes:22728994,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:6e0546563b18872b0aa0cad7255a26bb9a87cb879b7fc3e2383c867ef4f706fb registry.k8s.io/sig-storage/csi-resizer:v1.3.0],SizeBytes:21671340,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:80dec81b679a733fda448be92a2331150d99095947d04003ecff3dbd7f2a476a registry.k8s.io/sig-storage/csi-attacher:v3.3.0],SizeBytes:21444261,},ContainerImage{Names:[registry.k8s.io/cpa/cluster-proportional-autoscaler@sha256:68d396900aeaa072c1f27289485fdac29834045a6f3ffe369bf389d830ef572d registry.k8s.io/cpa/cluster-proportional-autoscaler:1.8.6],SizeBytes:20293261,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nonroot@sha256:b9e2958a3dd879e3cf11142228c6d073d0fc4ea2e857c3be6f4fb0ab5fb2c937 registry.k8s.io/e2e-test-images/nonroot:1.2],SizeBytes:17748301,},ContainerImage{Names:[registry.k8s.io/coredns/coredns@sha256:017727efcfeb7d053af68e51436ce8e65edbc6ca573720afb4f79c8594036955 registry.k8s.io/coredns/coredns:v1.10.0],SizeBytes:15273057,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:6029c252dae6178c99b580de72d7776158edbc81be0de15cedc4152a3acfed18 
registry.k8s.io/sig-storage/hostpathplugin:v1.7.3],SizeBytes:15224494,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:f9bcee63734b7b01555ee8fc8fb01ac2922478b2c8934bf8d468dd2916edc405 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.3.0],SizeBytes:8582494,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:406f59599991916d2942d8d02f076d957ed71b541ee19f09fc01723a6e6f5932 registry.k8s.io/sig-storage/livenessprobe:v2.6.0],SizeBytes:8240918,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},ContainerImage{Names:[registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db registry.k8s.io/pause:3.6],SizeBytes:301773,},},VolumesInUse:[kubernetes.io/csi/csi-mock-csi-mock-volumes-5501^b3d7a8d0-9460-11ed-abd7-5ea27f8adac5 kubernetes.io/csi/ebs.csi.aws.com^vol-04ad01fd346dce2ce kubernetes.io/csi/ebs.csi.aws.com^vol-0d0a697d91e57c4fb],VolumesAttached:[]AttachedVolume{AttachedVolume{Name:kubernetes.io/csi/ebs.csi.aws.com^vol-04ad01fd346dce2ce,DevicePath:,},AttachedVolume{Name:kubernetes.io/csi/csi-mock-csi-mock-volumes-5501^b3d7a8d0-9460-11ed-abd7-5ea27f8adac5,DevicePath:,},AttachedVolume{Name:kubernetes.io/csi/ebs.csi.aws.com^vol-0d0a697d91e57c4fb,DevicePath:,},},Config:nil,},} Jan 14 23:15:57.097: INFO: Logging kubelet events for node i-07f0d0bc50c0f4aa8 Jan 14 23:15:57.259: INFO: Logging pods the kubelet thinks is on node i-07f0d0bc50c0f4aa8 
Jan 14 23:15:57.410: INFO: up-down-1-sf2rt started at 2023-01-14 23:10:48 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container up-down-1 ready: true, restart count 0
Jan 14 23:15:57.410: INFO: hostpath-symlink-prep-provisioning-5720 started at 2023-01-14 23:10:58 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container init-volume-provisioning-5720 ready: false, restart count 0
Jan 14 23:15:57.410: INFO: pod-server-1 started at 2023-01-14 23:15:29 +0000 UTC (1+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Init container init ready: true, restart count 0
Jan 14 23:15:57.410: INFO: Container agnhost-container ready: true, restart count 0
Jan 14 23:15:57.410: INFO: hostpath-symlink-prep-provisioning-6824 started at 2023-01-14 23:15:54 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container init-volume-provisioning-6824 ready: false, restart count 0
Jan 14 23:15:57.410: INFO: cilium-qhbzx started at 2023-01-14 23:04:53 +0000 UTC (1+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Init container clean-cilium-state ready: true, restart count 0
Jan 14 23:15:57.410: INFO: Container cilium-agent ready: true, restart count 2
Jan 14 23:15:57.410: INFO: up-down-2-npbns started at 2023-01-14 23:10:20 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container up-down-2 ready: true, restart count 0
Jan 14 23:15:57.410: INFO: pvc-volume-tester-t99pk started at 2023-01-14 23:10:57 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container volume-tester ready: true, restart count 0
Jan 14 23:15:57.410: INFO: csi-mockplugin-0 started at 2023-01-14 23:10:14 +0000 UTC (0+3 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container csi-provisioner ready: true, restart count 0
Jan 14 23:15:57.410: INFO: Container driver-registrar ready: true, restart count 0
Jan 14 23:15:57.410: INFO: Container mock ready: true, restart count 0
Jan 14 23:15:57.410: INFO: external-injector started at 2023-01-14 23:10:46 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container external-injector ready: true, restart count 0
Jan 14 23:15:57.410: INFO: inline-volume-tester2-2k6sw started at 2023-01-14 23:10:44 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container csi-volume-tester ready: true, restart count 0
Jan 14 23:15:57.410: INFO: ebs-csi-node-tshrx started at 2023-01-14 23:03:30 +0000 UTC (0+3 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container ebs-plugin ready: true, restart count 1
Jan 14 23:15:57.410: INFO: Container liveness-probe ready: true, restart count 0
Jan 14 23:15:57.410: INFO: Container node-driver-registrar ready: true, restart count 0
Jan 14 23:15:57.410: INFO: test-rolling-update-deployment-78f575d8ff-4wnbk started at 2023-01-14 23:10:43 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container agnhost ready: true, restart count 0
Jan 14 23:15:57.410: INFO: hostexec-i-07f0d0bc50c0f4aa8-54wh2 started at 2023-01-14 23:10:09 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container agnhost-container ready: true, restart count 0
Jan 14 23:15:57.410: INFO: csi-mockplugin-attacher-0 started at 2023-01-14 23:10:14 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container csi-attacher ready: true, restart count 0
Jan 14 23:15:57.410: INFO: csi-mockplugin-resizer-0 started at 2023-01-14 23:10:14 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container csi-resizer ready: true, restart count 0
Jan 14 23:15:57.410: INFO: hostexec-i-07f0d0bc50c0f4aa8-dccm4 started at 2023-01-14 23:15:56 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container agnhost-container ready: false, restart count 0
Jan 14 23:15:57.410: INFO: success started at 2023-01-14 23:10:43 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container success ready: false, restart count 0
Jan 14 23:15:57.410: INFO: coredns-85d58b74c8-54zqn started at 2023-01-14 23:03:50 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container coredns ready: true, restart count 0
Jan 14 23:15:57.410: INFO: coredns-autoscaler-5b9dc8bb99-vhqzx started at 2023-01-14 23:03:50 +0000 UTC (0+1 container statuses recorded)
Jan 14 23:15:57.410: INFO: Container autoscaler ready: true, restart count 1
Jan 14 23:15:57.930: INFO: Latency metrics for node i-07f0d0bc50c0f4aa8
Jan 14 23:15:57.930: INFO: Logging node info for node i-0dcc151940a349dcc
Jan 14 23:15:58.073: INFO: Node Info: &Node{ObjectMeta:{i-0dcc151940a349dcc 1e379adb-7a0c-4de3-a05c-6cc44a0ee666 10239 0 2023-01-14 23:03:36 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:t3.medium beta.kubernetes.io/os:linux failure-domain.beta.kubernetes.io/region:sa-east-1 failure-domain.beta.kubernetes.io/zone:sa-east-1a kubernetes.io/arch:amd64 kubernetes.io/hostname:i-0dcc151940a349dcc kubernetes.io/os:linux node-role.kubernetes.io/node: node.kubernetes.io/instance-type:t3.medium topology.ebs.csi.aws.com/zone:sa-east-1a topology.hostpath.csi/node:i-0dcc151940a349dcc topology.kubernetes.io/region:sa-east-1 topology.kubernetes.io/zone:sa-east-1a] map[csi.volume.kubernetes.io/nodeid:{"csi-hostpath-ephemeral-863":"i-0dcc151940a349dcc","csi-hostpath-volume-expand-7436":"i-0dcc151940a349dcc","csi-mock-csi-mock-volumes-2185":"i-0dcc151940a349dcc","csi-mock-csi-mock-volumes-6932":"csi-mock-csi-mock-volumes-6932","ebs.csi.aws.com":"i-0dcc151940a349dcc"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{aws-cloud-controller-manager Update v1 2023-01-14 23:03:36 +0000 UTC FieldsV1
{"f:metadata":{"f:labels":{"f:beta.kubernetes.io/instance-type":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {aws-cloud-controller-manager Update v1 2023-01-14 23:03:36 +0000 UTC FieldsV1 {"f:status":{"f:addresses":{"k:{\"type\":\"ExternalDNS\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"ExternalIP\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"Hostname\"}":{"f:address":{}},"k:{\"type\":\"InternalDNS\"}":{".":{},"f:address":{},"f:type":{}}}}} status} {kops-controller Update v1 2023-01-14 23:03:36 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:node-role.kubernetes.io/node":{}}}} } {kubelet Update v1 2023-01-14 23:03:36 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/os":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{}}}} } {kube-controller-manager Update v1 2023-01-14 23:13:27 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"100.96.3.0/24\"":{}}}} } {kube-controller-manager Update v1 2023-01-14 23:15:55 +0000 UTC FieldsV1 {"f:status":{"f:volumesAttached":{}}} status} {kubelet Update v1 2023-01-14 23:15:55 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.ebs.csi.aws.com/zone":{},"f:topology.hostpath.csi/node":{}}},"f:status":{"f:allocatable":{"f:memory":{}},"f:capacity":{"f:memory":{}},"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{},"f:nodeInfo":{"f:bootID":{},"f:kernelVersion":{},"f:osImage":{}},"f:volumesInUse":{}}} status}]},Spec:NodeSpec{PodCIDR:100.96.3.0/24,DoNotUseExternalID:,ProviderID:aws:///sa-east-1a/i-0dcc151940a349dcc,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[100.96.3.0/24],},Status:NodeStatus{Capacity:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{47441653760 0} {<nil>} 46329740Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{4054806528 0} {<nil>} 3959772Ki BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{42697488314 0} {<nil>} 42697488314 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3949948928 0} {<nil>} 3857372Ki BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:55 +0000 UTC,LastTransitionTime:2023-01-14 23:03:15 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:55 +0000 UTC,LastTransitionTime:2023-01-14 23:03:15 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk 
pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:55 +0000 UTC,LastTransitionTime:2023-01-14 23:03:15 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2023-01-14 23:15:55 +0000 UTC,LastTransitionTime:2023-01-14 23:11:44 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:172.20.49.81,},NodeAddress{Type:ExternalIP,Address:54.232.229.150,},NodeAddress{Type:InternalDNS,Address:i-0dcc151940a349dcc.sa-east-1.compute.internal,},NodeAddress{Type:Hostname,Address:i-0dcc151940a349dcc.sa-east-1.compute.internal,},NodeAddress{Type:ExternalDNS,Address:ec2-54-232-229-150.sa-east-1.compute.amazonaws.com,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:ec212aac290f3baddeca89a333de750f,SystemUUID:ec212aac-290f-3bad-deca-89a333de750f,BootID:5760eb4f-fa21-41ce-9a12-65937b99b4df,KernelVersion:5.15.86-flatcar,OSImage:Flatcar Container Linux by Kinvolk 3446.1.0 (Oklo),ContainerRuntimeVersion:containerd://1.6.10,KubeletVersion:v1.25.5,KubeProxyVersion:v1.25.5,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[quay.io/cilium/cilium@sha256:06ce2b0a0a472e73334a7504ee5c5d8b2e2d7b72ef728ad94e564740dd505be5 quay.io/cilium/cilium:v1.12.5],SizeBytes:166719855,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.25.5],SizeBytes:63291081,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:af7e3857d87770ddb40f5ea4f89b5a2709504ab1ee31f9ea4ab5823c045f2146 registry.k8s.io/e2e-test-images/agnhost:2.40],SizeBytes:51155161,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nautilus@sha256:99c0d6f1ad24a1aa1905d9c6534d193f268f7b23f9add2ae6bb41f31094bdd5c 
registry.k8s.io/e2e-test-images/nautilus:1.5],SizeBytes:49642095,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:20f25f275d46aa728f7615a1ccc19c78b2ed89435bf943a44b339f70f45508e6 registry.k8s.io/e2e-test-images/httpd:2.4.39-2],SizeBytes:41902010,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:1b9d1b2f36cb2dbee1960e82a9344aeb11bd4c4c03abf5e1853e0559c23855e3 registry.k8s.io/e2e-test-images/httpd:2.4.38-2],SizeBytes:40764680,},ContainerImage{Names:[registry.k8s.io/provider-aws/aws-ebs-csi-driver@sha256:f0c5de192d832e7c1daa6580d4a62e8fa6fc8eabc0917ae4cb7ed4d15e95b59e registry.k8s.io/provider-aws/aws-ebs-csi-driver:v1.14.1],SizeBytes:29725845,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:122bfb8c1edabb3c0edd63f06523e6940d958d19b3957dc7b1d6f81e9f1f6119 registry.k8s.io/sig-storage/csi-provisioner:v3.1.0],SizeBytes:23345856,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:6477988532358148d2e98f7c747db4e9250bbc7ad2664bf666348abf9ee1f5aa registry.k8s.io/sig-storage/csi-provisioner:v3.0.0],SizeBytes:22728994,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:9ebbf9f023e7b41ccee3d52afe39a89e3ddacdbb69269d583abfc25847cfd9e4 registry.k8s.io/sig-storage/csi-resizer:v1.4.0],SizeBytes:22381475,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:89e900a160a986a1a7a4eba7f5259e510398fa87ca9b8a729e7dec59e04c7709 registry.k8s.io/sig-storage/csi-snapshotter:v5.0.1],SizeBytes:22163966,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:8b9c313c05f54fb04f8d430896f5f5904b6cb157df261501b29adc04d2b2dc7b registry.k8s.io/sig-storage/csi-attacher:v3.4.0],SizeBytes:22085298,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:6e0546563b18872b0aa0cad7255a26bb9a87cb879b7fc3e2383c867ef4f706fb 
registry.k8s.io/sig-storage/csi-resizer:v1.3.0],SizeBytes:21671340,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:80dec81b679a733fda448be92a2331150d99095947d04003ecff3dbd7f2a476a registry.k8s.io/sig-storage/csi-attacher:v3.3.0],SizeBytes:21444261,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:6029c252dae6178c99b580de72d7776158edbc81be0de15cedc4152a3acfed18 registry.k8s.io/sig-storage/hostpathplugin:v1.7.3],SizeBytes:15224494,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:4fd21f36075b44d1a423dfb262ad79202ce54e95f5cbc4622a6c1c38ab287ad6 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.0],SizeBytes:9132637,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:f9bcee63734b7b01555ee8fc8fb01ac2922478b2c8934bf8d468dd2916edc405 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.3.0],SizeBytes:8582494,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:406f59599991916d2942d8d02f076d957ed71b541ee19f09fc01723a6e6f5932 registry.k8s.io/sig-storage/livenessprobe:v2.6.0],SizeBytes:8240918,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nginx@sha256:55d0552eb6538050ea7741e46b35d27eccffeeaed7010f9f2bad0a89c149bc6f registry.k8s.io/e2e-test-images/nginx:1.15-2],SizeBytes:7000509,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nginx@sha256:13616070e3f29de4417eee434a8ef472221c9e51b3d037b5a6b46cef08eb7443 registry.k8s.io/e2e-test-images/nginx:1.14-2],SizeBytes:6979041,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf 
registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db registry.k8s.io/pause:3.6],SizeBytes:301773,},},VolumesInUse:[kubernetes.io/csi/csi-mock-csi-mock-volumes-2185^54cc7c14-9461-11ed-9327-ca4794bfc09b kubernetes.io/csi/ebs.csi.aws.com^vol-04f1939b81cf27b39],VolumesAttached:[]AttachedVolume{AttachedVolume{Name:kubernetes.io/csi/ebs.csi.aws.com^vol-04f1939b81cf27b39,DevicePath:,},AttachedVolume{Name:kubernetes.io/csi/csi-mock-csi-mock-volumes-2185^54cc7c14-9461-11ed-9327-ca4794bfc09b,DevicePath:,},},Config:nil,},} Jan 14 23:15:58.073: INFO: Logging kubelet events for node i-0dcc151940a349dcc Jan 14 23:15:58.220: INFO: Logging pods the kubelet thinks is on node i-0dcc151940a349dcc Jan 14 23:15:58.375: INFO: csi-mockplugin-resizer-0 started at 2023-01-14 23:14:44 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container csi-resizer ready: true, restart count 0 Jan 14 23:15:58.375: INFO: busybox-1b27205a-bb74-4e2a-a20b-df031f27be8e started at 2023-01-14 23:15:30 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container busybox ready: true, restart count 0 Jan 14 23:15:58.375: INFO: hostexec-i-0dcc151940a349dcc-sgwcl started at 2023-01-14 23:10:23 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container agnhost-container ready: false, restart count 0 Jan 14 23:15:58.375: INFO: pod-34e4ba74-4761-4f44-b63c-a8443f7a5114 started at 2023-01-14 23:10:29 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container write-pod ready: false, restart count 0 Jan 14 23:15:58.375: INFO: simpletest.rc-hsf2j started at 2023-01-14 23:15:35 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container nginx ready: true, restart count 0 Jan 14 23:15:58.375: INFO: hostexec-i-0dcc151940a349dcc-tfplm started at 2023-01-14 23:15:53 +0000 UTC (0+1 container 
statuses recorded) Jan 14 23:15:58.375: INFO: Container agnhost-container ready: true, restart count 0 Jan 14 23:15:58.375: INFO: up-down-1-rbj4h started at 2023-01-14 23:10:08 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container up-down-1 ready: false, restart count 0 Jan 14 23:15:58.375: INFO: csi-mockplugin-0 started at 2023-01-14 23:14:44 +0000 UTC (0+3 container statuses recorded) Jan 14 23:15:58.375: INFO: Container csi-provisioner ready: true, restart count 0 Jan 14 23:15:58.375: INFO: Container driver-registrar ready: true, restart count 0 Jan 14 23:15:58.375: INFO: Container mock ready: true, restart count 0 Jan 14 23:15:58.375: INFO: ss2-1 started at 2023-01-14 23:14:45 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container webserver ready: true, restart count 0 Jan 14 23:15:58.375: INFO: suspend-false-to-true-qdq7r started at 2023-01-14 23:15:14 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container c ready: true, restart count 0 Jan 14 23:15:58.375: INFO: pod-340be748-7fa4-4166-8def-3fb1a6bde81f started at 2023-01-14 23:10:40 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container write-pod ready: false, restart count 0 Jan 14 23:15:58.375: INFO: startup-6760ae78-3237-4dbd-adbc-a9416e59c8ce started at 2023-01-14 23:15:33 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container busybox ready: false, restart count 0 Jan 14 23:15:58.375: INFO: externalsvc-w5fkj started at 2023-01-14 23:15:50 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container externalsvc ready: true, restart count 0 Jan 14 23:15:58.375: INFO: deployment-shared-unset-79c9978db8-wtt6h started at 2023-01-14 23:15:01 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container nginx ready: false, restart count 0 Jan 14 23:15:58.375: INFO: ss-0 started at 2023-01-14 23:15:45 +0000 UTC (0+1 container statuses recorded) 
Jan 14 23:15:58.375: INFO: Container webserver ready: false, restart count 0 Jan 14 23:15:58.375: INFO: up-down-2-kb2nf started at 2023-01-14 23:10:20 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.375: INFO: Container up-down-2 ready: false, restart count 0 Jan 14 23:15:58.375: INFO: csi-hostpathplugin-0 started at 2023-01-14 23:13:46 +0000 UTC (0+7 container statuses recorded) Jan 14 23:15:58.375: INFO: Container csi-attacher ready: true, restart count 0 Jan 14 23:15:58.375: INFO: Container csi-provisioner ready: true, restart count 0 Jan 14 23:15:58.375: INFO: Container csi-resizer ready: true, restart count 0 Jan 14 23:15:58.375: INFO: Container csi-snapshotter ready: true, restart count 0 Jan 14 23:15:58.375: INFO: Container hostpath ready: true, restart count 0 Jan 14 23:15:58.375: INFO: Container liveness-probe ready: true, restart count 0 Jan 14 23:15:58.375: INFO: Container node-driver-registrar ready: true, restart count 0 Jan 14 23:15:58.375: INFO: pvc-volume-tester-kns66 started at 2023-01-14 23:15:27 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.376: INFO: Container volume-tester ready: true, restart count 0 Jan 14 23:15:58.376: INFO: pod-c3cb9f46-d27a-43ff-bdec-b5e216b9f199 started at <nil> (0+0 container statuses recorded) Jan 14 23:15:58.376: INFO: csi-mockplugin-0 started at 2023-01-14 23:15:19 +0000 UTC (0+4 container statuses recorded) Jan 14 23:15:58.376: INFO: Container busybox ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container csi-provisioner ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container driver-registrar ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container mock ready: true, restart count 0 Jan 14 23:15:58.376: INFO: service-headless-toggled-8dwxt started at 2023-01-14 23:14:51 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.376: INFO: Container service-headless-toggled ready: true, restart count 0 Jan 14 23:15:58.376: INFO: 
busybox-ad09cb88-409e-4aa8-8f9e-122718513f03 started at 2023-01-14 23:15:08 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.376: INFO: Container busybox ready: true, restart count 0 Jan 14 23:15:58.376: INFO: busybox-e6afc366-d7ec-4366-9b3a-4ff797626662 started at 2023-01-14 23:10:08 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.376: INFO: Container busybox ready: true, restart count 4 Jan 14 23:15:58.376: INFO: csi-hostpathplugin-0 started at 2023-01-14 23:13:27 +0000 UTC (0+7 container statuses recorded) Jan 14 23:15:58.376: INFO: Container csi-attacher ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container csi-provisioner ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container csi-resizer ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container csi-snapshotter ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container hostpath ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container liveness-probe ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container node-driver-registrar ready: true, restart count 0 Jan 14 23:15:58.376: INFO: cilium-vs4sx started at 2023-01-14 23:11:34 +0000 UTC (1+1 container statuses recorded) Jan 14 23:15:58.376: INFO: Init container clean-cilium-state ready: true, restart count 0 Jan 14 23:15:58.376: INFO: Container cilium-agent ready: true, restart count 0 Jan 14 23:15:58.376: INFO: csi-mockplugin-attacher-0 started at 2023-01-14 23:14:44 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:58.376: INFO: Container csi-attacher ready: true, restart count 0 Jan 14 23:15:58.376: INFO: ebs-csi-node-6prjn started at 2023-01-14 23:03:36 +0000 UTC (0+3 container statuses recorded) Jan 14 23:15:58.376: INFO: Container ebs-plugin ready: true, restart count 3 Jan 14 23:15:58.376: INFO: Container liveness-probe ready: true, restart count 1 Jan 14 23:15:58.376: INFO: Container node-driver-registrar ready: true, restart count 1 Jan 14 23:15:59.389: INFO: Latency 
metrics for node i-0dcc151940a349dcc Jan 14 23:15:59.389: INFO: Logging node info for node i-0fcd1e4b56ac2b41b Jan 14 23:15:59.531: INFO: Node Info: &Node{ObjectMeta:{i-0fcd1e4b56ac2b41b f83a3d46-cc3a-48b6-8938-3be8def9e5c0 10005 0 2023-01-14 23:03:33 +0000 UTC <nil> <nil> map[beta.kubernetes.io/arch:amd64 beta.kubernetes.io/instance-type:t3.medium beta.kubernetes.io/os:linux failure-domain.beta.kubernetes.io/region:sa-east-1 failure-domain.beta.kubernetes.io/zone:sa-east-1a kubernetes.io/arch:amd64 kubernetes.io/hostname:i-0fcd1e4b56ac2b41b kubernetes.io/os:linux node-role.kubernetes.io/node: node.kubernetes.io/instance-type:t3.medium topology.ebs.csi.aws.com/zone:sa-east-1a topology.hostpath.csi/node:i-0fcd1e4b56ac2b41b topology.kubernetes.io/region:sa-east-1 topology.kubernetes.io/zone:sa-east-1a] map[csi.volume.kubernetes.io/nodeid:{"csi-mock-csi-mock-volumes-5364":"i-0fcd1e4b56ac2b41b","ebs.csi.aws.com":"i-0fcd1e4b56ac2b41b"} node.alpha.kubernetes.io/ttl:0 volumes.kubernetes.io/controller-managed-attach-detach:true] [] [] [{aws-cloud-controller-manager Update v1 2023-01-14 23:03:33 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:beta.kubernetes.io/instance-type":{},"f:failure-domain.beta.kubernetes.io/region":{},"f:failure-domain.beta.kubernetes.io/zone":{},"f:node.kubernetes.io/instance-type":{},"f:topology.kubernetes.io/region":{},"f:topology.kubernetes.io/zone":{}}},"f:spec":{"f:providerID":{}}} } {aws-cloud-controller-manager Update v1 2023-01-14 23:03:33 +0000 UTC FieldsV1 {"f:status":{"f:addresses":{"k:{\"type\":\"ExternalDNS\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"ExternalIP\"}":{".":{},"f:address":{},"f:type":{}},"k:{\"type\":\"Hostname\"}":{"f:address":{}},"k:{\"type\":\"InternalDNS\"}":{".":{},"f:address":{},"f:type":{}}}}} status} {kops-controller Update v1 2023-01-14 23:03:33 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{"f:node-role.kubernetes.io/node":{}}}} } {kubelet Update v1 2023-01-14 23:03:33 +0000 UTC FieldsV1 
{"f:metadata":{"f:annotations":{".":{},"f:volumes.kubernetes.io/controller-managed-attach-detach":{}},"f:labels":{".":{},"f:beta.kubernetes.io/arch":{},"f:beta.kubernetes.io/os":{},"f:kubernetes.io/arch":{},"f:kubernetes.io/hostname":{},"f:kubernetes.io/os":{}}}} } {kube-controller-manager Update v1 2023-01-14 23:13:36 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:node.alpha.kubernetes.io/ttl":{}}},"f:spec":{"f:podCIDR":{},"f:podCIDRs":{".":{},"v:\"100.96.2.0/24\"":{}}}} } {kube-controller-manager Update v1 2023-01-14 23:15:44 +0000 UTC FieldsV1 {"f:status":{"f:volumesAttached":{}}} status} {kubelet Update v1 2023-01-14 23:15:49 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:csi.volume.kubernetes.io/nodeid":{}},"f:labels":{"f:topology.ebs.csi.aws.com/zone":{},"f:topology.hostpath.csi/node":{}}},"f:status":{"f:allocatable":{"f:memory":{}},"f:capacity":{"f:memory":{}},"f:conditions":{"k:{\"type\":\"DiskPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"MemoryPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"PIDPressure\"}":{"f:lastHeartbeatTime":{}},"k:{\"type\":\"Ready\"}":{"f:lastHeartbeatTime":{},"f:lastTransitionTime":{},"f:message":{},"f:reason":{},"f:status":{}}},"f:images":{},"f:nodeInfo":{"f:bootID":{},"f:kernelVersion":{},"f:osImage":{}},"f:volumesInUse":{}}} status}]},Spec:NodeSpec{PodCIDR:100.96.2.0/24,DoNotUseExternalID:,ProviderID:aws:///sa-east-1a/i-0fcd1e4b56ac2b41b,Unschedulable:false,Taints:[]Taint{},ConfigSource:nil,PodCIDRs:[100.96.2.0/24],},Status:NodeStatus{Capacity:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{47441653760 0} {<nil>} 46329740Ki BinarySI},hugepages-1Gi: {{0 0} {<nil>} 0 DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{4054806528 0} {<nil>} 3959772Ki BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Allocatable:ResourceList{cpu: {{2 0} {<nil>} 2 DecimalSI},ephemeral-storage: {{42697488314 0} {<nil>} 42697488314 DecimalSI},hugepages-1Gi: {{0 0} {<nil>} 0 
DecimalSI},hugepages-2Mi: {{0 0} {<nil>} 0 DecimalSI},memory: {{3949948928 0} {<nil>} 3857372Ki BinarySI},pods: {{110 0} {<nil>} 110 DecimalSI},},Phase:,Conditions:[]NodeCondition{NodeCondition{Type:MemoryPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:49 +0000 UTC,LastTransitionTime:2023-01-14 23:03:12 +0000 UTC,Reason:KubeletHasSufficientMemory,Message:kubelet has sufficient memory available,},NodeCondition{Type:DiskPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:49 +0000 UTC,LastTransitionTime:2023-01-14 23:03:12 +0000 UTC,Reason:KubeletHasNoDiskPressure,Message:kubelet has no disk pressure,},NodeCondition{Type:PIDPressure,Status:False,LastHeartbeatTime:2023-01-14 23:15:49 +0000 UTC,LastTransitionTime:2023-01-14 23:03:12 +0000 UTC,Reason:KubeletHasSufficientPID,Message:kubelet has sufficient PID available,},NodeCondition{Type:Ready,Status:True,LastHeartbeatTime:2023-01-14 23:15:49 +0000 UTC,LastTransitionTime:2023-01-14 23:13:36 +0000 UTC,Reason:KubeletReady,Message:kubelet is posting ready status,},},Addresses:[]NodeAddress{NodeAddress{Type:InternalIP,Address:172.20.61.172,},NodeAddress{Type:ExternalIP,Address:54.233.202.152,},NodeAddress{Type:InternalDNS,Address:i-0fcd1e4b56ac2b41b.sa-east-1.compute.internal,},NodeAddress{Type:Hostname,Address:i-0fcd1e4b56ac2b41b.sa-east-1.compute.internal,},NodeAddress{Type:ExternalDNS,Address:ec2-54-233-202-152.sa-east-1.compute.amazonaws.com,},},DaemonEndpoints:NodeDaemonEndpoints{KubeletEndpoint:DaemonEndpoint{Port:10250,},},NodeInfo:NodeSystemInfo{MachineID:ec2d1e6683057f12fb2d5610b0c7b116,SystemUUID:ec2d1e66-8305-7f12-fb2d-5610b0c7b116,BootID:7ffdec91-b0f2-402c-a4d5-3a4bdd6c8c78,KernelVersion:5.15.86-flatcar,OSImage:Flatcar Container Linux by Kinvolk 3446.1.0 
(Oklo),ContainerRuntimeVersion:containerd://1.6.10,KubeletVersion:v1.25.5,KubeProxyVersion:v1.25.5,OperatingSystem:linux,Architecture:amd64,},Images:[]ContainerImage{ContainerImage{Names:[quay.io/cilium/cilium@sha256:06ce2b0a0a472e73334a7504ee5c5d8b2e2d7b72ef728ad94e564740dd505be5 quay.io/cilium/cilium:v1.12.5],SizeBytes:166719855,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/jessie-dnsutils@sha256:11e6a66017ba4e4b938c1612b7a54a3befcefd354796c04e1dba76873a13518e registry.k8s.io/e2e-test-images/jessie-dnsutils:1.5],SizeBytes:112030526,},ContainerImage{Names:[registry.k8s.io/kube-proxy-amd64:v1.25.5],SizeBytes:63291081,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/agnhost@sha256:af7e3857d87770ddb40f5ea4f89b5a2709504ab1ee31f9ea4ab5823c045f2146 registry.k8s.io/e2e-test-images/agnhost:2.40],SizeBytes:51155161,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nautilus@sha256:99c0d6f1ad24a1aa1905d9c6534d193f268f7b23f9add2ae6bb41f31094bdd5c registry.k8s.io/e2e-test-images/nautilus:1.5],SizeBytes:49642095,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:20f25f275d46aa728f7615a1ccc19c78b2ed89435bf943a44b339f70f45508e6 registry.k8s.io/e2e-test-images/httpd:2.4.39-2],SizeBytes:41902010,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/httpd@sha256:1b9d1b2f36cb2dbee1960e82a9344aeb11bd4c4c03abf5e1853e0559c23855e3 registry.k8s.io/e2e-test-images/httpd:2.4.38-2],SizeBytes:40764680,},ContainerImage{Names:[registry.k8s.io/provider-aws/aws-ebs-csi-driver@sha256:f0c5de192d832e7c1daa6580d4a62e8fa6fc8eabc0917ae4cb7ed4d15e95b59e registry.k8s.io/provider-aws/aws-ebs-csi-driver:v1.14.1],SizeBytes:29725845,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:122bfb8c1edabb3c0edd63f06523e6940d958d19b3957dc7b1d6f81e9f1f6119 
registry.k8s.io/sig-storage/csi-provisioner:v3.1.0],SizeBytes:23345856,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-provisioner@sha256:6477988532358148d2e98f7c747db4e9250bbc7ad2664bf666348abf9ee1f5aa registry.k8s.io/sig-storage/csi-provisioner:v3.0.0],SizeBytes:22728994,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-resizer@sha256:9ebbf9f023e7b41ccee3d52afe39a89e3ddacdbb69269d583abfc25847cfd9e4 registry.k8s.io/sig-storage/csi-resizer:v1.4.0],SizeBytes:22381475,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-snapshotter@sha256:89e900a160a986a1a7a4eba7f5259e510398fa87ca9b8a729e7dec59e04c7709 registry.k8s.io/sig-storage/csi-snapshotter:v5.0.1],SizeBytes:22163966,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:8b9c313c05f54fb04f8d430896f5f5904b6cb157df261501b29adc04d2b2dc7b registry.k8s.io/sig-storage/csi-attacher:v3.4.0],SizeBytes:22085298,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-attacher@sha256:80dec81b679a733fda448be92a2331150d99095947d04003ecff3dbd7f2a476a registry.k8s.io/sig-storage/csi-attacher:v3.3.0],SizeBytes:21444261,},ContainerImage{Names:[registry.k8s.io/coredns/coredns@sha256:017727efcfeb7d053af68e51436ce8e65edbc6ca573720afb4f79c8594036955 registry.k8s.io/coredns/coredns:v1.10.0],SizeBytes:15273057,},ContainerImage{Names:[registry.k8s.io/sig-storage/hostpathplugin@sha256:6029c252dae6178c99b580de72d7776158edbc81be0de15cedc4152a3acfed18 registry.k8s.io/sig-storage/hostpathplugin:v1.7.3],SizeBytes:15224494,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:0103eee7c35e3e0b5cd8cdca9850dc71c793cdeb6669d8be7a89440da2d06ae4 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.1],SizeBytes:9133109,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:4fd21f36075b44d1a423dfb262ad79202ce54e95f5cbc4622a6c1c38ab287ad6 
registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.5.0],SizeBytes:9132637,},ContainerImage{Names:[registry.k8s.io/sig-storage/csi-node-driver-registrar@sha256:f9bcee63734b7b01555ee8fc8fb01ac2922478b2c8934bf8d468dd2916edc405 registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.3.0],SizeBytes:8582494,},ContainerImage{Names:[registry.k8s.io/sig-storage/livenessprobe@sha256:406f59599991916d2942d8d02f076d957ed71b541ee19f09fc01723a6e6f5932 registry.k8s.io/sig-storage/livenessprobe:v2.6.0],SizeBytes:8240918,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/nginx@sha256:55d0552eb6538050ea7741e46b35d27eccffeeaed7010f9f2bad0a89c149bc6f registry.k8s.io/e2e-test-images/nginx:1.15-2],SizeBytes:7000509,},ContainerImage{Names:[registry.k8s.io/e2e-test-images/busybox@sha256:c318242786b139d18676b1c09a0ad7f15fc17f8f16a5b2e625cd0dc8c9703daf registry.k8s.io/e2e-test-images/busybox:1.29-2],SizeBytes:732424,},ContainerImage{Names:[registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d registry.k8s.io/pause:3.8],SizeBytes:311286,},ContainerImage{Names:[registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db registry.k8s.io/pause:3.6],SizeBytes:301773,},},VolumesInUse:[kubernetes.io/csi/csi-mock-csi-mock-volumes-5364^470cf7d3-9461-11ed-bac9-86b72336a2c4 kubernetes.io/csi/ebs.csi.aws.com^vol-00214871c7b37a25d],VolumesAttached:[]AttachedVolume{AttachedVolume{Name:kubernetes.io/csi/csi-mock-csi-mock-volumes-5364^470cf7d3-9461-11ed-bac9-86b72336a2c4,DevicePath:,},AttachedVolume{Name:kubernetes.io/csi/ebs.csi.aws.com^vol-00214871c7b37a25d,DevicePath:,},},Config:nil,},} Jan 14 23:15:59.532: INFO: Logging kubelet events for node i-0fcd1e4b56ac2b41b Jan 14 23:15:59.679: INFO: Logging pods the kubelet thinks is on node i-0fcd1e4b56ac2b41b Jan 14 23:15:59.828: INFO: coredns-85d58b74c8-8j4pg started at 2023-01-14 23:05:31 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container 
coredns ready: true, restart count 1 Jan 14 23:15:59.828: INFO: pvc-volume-tester-8blf9 started at 2023-01-14 23:15:04 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container volume-tester ready: true, restart count 0 Jan 14 23:15:59.828: INFO: ebs-csi-node-m6rkw started at 2023-01-14 23:03:33 +0000 UTC (0+3 container statuses recorded) Jan 14 23:15:59.828: INFO: Container ebs-plugin ready: true, restart count 1 Jan 14 23:15:59.828: INFO: Container liveness-probe ready: true, restart count 1 Jan 14 23:15:59.828: INFO: Container node-driver-registrar ready: true, restart count 1 Jan 14 23:15:59.828: INFO: hostexec-i-0fcd1e4b56ac2b41b-txd8x started at 2023-01-14 23:10:35 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container agnhost-container ready: false, restart count 0 Jan 14 23:15:59.828: INFO: execpodvs8pf started at 2023-01-14 23:15:54 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container agnhost-container ready: true, restart count 0 Jan 14 23:15:59.828: INFO: pod-804a5874-4d55-4678-a934-e14c5c7d2367 started at 2023-01-14 23:15:42 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container write-pod ready: true, restart count 0 Jan 14 23:15:59.828: INFO: csi-mockplugin-0 started at 2023-01-14 23:14:58 +0000 UTC (0+3 container statuses recorded) Jan 14 23:15:59.828: INFO: Container csi-provisioner ready: true, restart count 0 Jan 14 23:15:59.828: INFO: Container driver-registrar ready: true, restart count 0 Jan 14 23:15:59.828: INFO: Container mock ready: true, restart count 0 Jan 14 23:15:59.828: INFO: deployment-shared-unset-79c9978db8-rbgmh started at 2023-01-14 23:15:01 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container nginx ready: false, restart count 0 Jan 14 23:15:59.828: INFO: up-down-2-zhpqb started at 2023-01-14 23:10:46 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container up-down-2 ready: false, 
restart count 0 Jan 14 23:15:59.828: INFO: externalsvc-28jz2 started at 2023-01-14 23:15:50 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container externalsvc ready: false, restart count 0 Jan 14 23:15:59.828: INFO: hostexec-i-0fcd1e4b56ac2b41b-xvq5p started at 2023-01-14 23:15:56 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container agnhost-container ready: true, restart count 0 Jan 14 23:15:59.828: INFO: csi-mockplugin-attacher-0 started at 2023-01-14 23:14:58 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container csi-attacher ready: true, restart count 0 Jan 14 23:15:59.828: INFO: cilium-9fvd2 started at 2023-01-14 23:13:27 +0000 UTC (1+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Init container clean-cilium-state ready: true, restart count 0 Jan 14 23:15:59.828: INFO: Container cilium-agent ready: true, restart count 0 Jan 14 23:15:59.828: INFO: up-down-1-fcdxx started at 2023-01-14 23:10:08 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container up-down-1 ready: false, restart count 0 Jan 14 23:15:59.828: INFO: ss2-2 started at 2023-01-14 23:15:13 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container webserver ready: true, restart count 0 Jan 14 23:15:59.828: INFO: downwardapi-volume-87082479-5cf8-4dbf-abfd-e7e8f6c314b6 started at 2023-01-14 23:15:57 +0000 UTC (0+1 container statuses recorded) Jan 14 23:15:59.828: INFO: Container client-container ready: false, restart count 0 Jan 14 23:16:00.313: INFO: Latency metrics for node i-0fcd1e4b56ac2b41b Jan 14 23:16:00.313: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready STEP: Destroying namespace "services-1503" for this suite. 01/14/23 23:16:00.46 [AfterEach] [sig-network] Services test/e2e/network/service.go:762
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-node\]\sContainer\sRuntime\sblackbox\stest\swhen\srunning\sa\scontainer\swith\sa\snew\simage\sshould\snot\sbe\sable\sto\spull\sfrom\sprivate\sregistry\swithout\ssecret\s\[NodeConformance\]$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0008102c0) test/e2e/framework/framework.go:244 +0x7bf from junit_01.xml
{"msg":"FAILED [sig-node] Container Runtime blackbox test when running a container with a new image should not be able to pull from private registry without secret [NodeConformance]","completed":0,"skipped":19,"failed":2,"failures":["[sig-storage] PersistentVolumes-local [Volume type: dir] Two pods mounting a local volume at the same time should be able to write from pod1 and read from pod2","[sig-node] Container Runtime blackbox test when running a container with a new image should not be able to pull from private registry without secret [NodeConformance]"]} [BeforeEach] [sig-node] Container Runtime test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:46.165 Jan 14 23:11:46.165: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename container-runtime 01/14/23 23:11:46.166 Jan 14 23:11:46.320: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.473: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.477: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:52.477: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:54.475: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": 
dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:56.473: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:58.475: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:00.477: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:02.475: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:04.475: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:06.476: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:08.478: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:10.474: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:12.475: INFO: Unexpected error while creating namespace: Post 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:14.473: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.739: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.891: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.891: INFO: Unexpected error: <*errors.errorString | 0xc0001bf900>: { s: "timed out waiting for the condition", } Jan 14 23:12:31.891: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0008102c0) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-node] Container Runtime test/e2e/framework/framework.go:187 Jan 14 23:12:31.892: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:32.052: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-node\]\sContainer\sRuntime\sblackbox\stest\swhen\sstarting\sa\scontainer\sthat\sexits\sshould\srun\swith\sthe\sexpected\sstatus\s\[NodeConformance\]\s\[Conformance\]$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc00072f1e0) test/e2e/framework/framework.go:244 +0x7bf from junit_01.xml
{"msg":"FAILED [sig-node] Container Runtime blackbox test when starting a container that exits should run with the expected status [NodeConformance] [Conformance]","completed":1,"skipped":43,"failed":2,"failures":["External Storage [Driver: ebs.csi.aws.com] [Testpattern: Dynamic PV (default fs)] volumes should store data","[sig-node] Container Runtime blackbox test when starting a container that exits should run with the expected status [NodeConformance] [Conformance]"]} [BeforeEach] [sig-node] Container Runtime test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:45.89 Jan 14 23:11:45.890: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename container-runtime 01/14/23 23:11:45.891 Jan 14 23:11:46.043: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.199: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.198: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:52.196: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:54.202: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:56.198: INFO: Unexpected 
error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:58.198: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:00.200: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:02.199: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:04.198: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:06.196: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:08.195: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:10.199: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:12.210: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 
52.67.139.60:443: connect: connection refused Jan 14 23:12:14.194: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.482: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.635: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.635: INFO: Unexpected error: <*errors.errorString | 0xc0001eb900>: { s: "timed out waiting for the condition", } Jan 14 23:12:31.635: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc00072f1e0) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-node] Container Runtime test/e2e/framework/framework.go:187 Jan 14 23:12:31.635: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:31.787: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-node\]\sProbing\scontainer\sshould\sbe\srestarted\swith\sa\sfailing\sexec\sliveness\sprobe\sthat\stook\slonger\sthan\sthe\stimeout$'
test/e2e/common/node/container_probe.go:910
k8s.io/kubernetes/test/e2e/common/node.RunLivenessTest(0xc000441ce0, 0xc001916c00, 0x1, 0x3?)
	test/e2e/common/node/container_probe.go:910 +0x96b
k8s.io/kubernetes/test/e2e/common/node.glob..func2.12()
	test/e2e/common/node/container_probe.go:270 +0x17a
from junit_01.xml
{"msg":"FAILED [sig-node] Probing container should be restarted with a failing exec liveness probe that took longer than the timeout","completed":0,"skipped":27,"failed":1,"failures":["[sig-node] Probing container should be restarted with a failing exec liveness probe that took longer than the timeout"]} [BeforeEach] [sig-node] Probing container test/e2e/framework/framework.go:186 �[1mSTEP:�[0m Creating a kubernetes client �[38;5;243m01/14/23 23:10:07.608�[0m Jan 14 23:10:07.608: INFO: >>> kubeConfig: /root/.kube/config �[1mSTEP:�[0m Building a namespace api object, basename container-probe �[38;5;243m01/14/23 23:10:07.609�[0m �[1mSTEP:�[0m Waiting for a default service account to be provisioned in namespace �[38;5;243m01/14/23 23:10:08.039�[0m �[1mSTEP:�[0m Waiting for kube-root-ca.crt to be provisioned in namespace �[38;5;243m01/14/23 23:10:08.32�[0m [BeforeEach] [sig-node] Probing container test/e2e/common/node/container_probe.go:59 [It] should be restarted with a failing exec liveness probe that took longer than the timeout test/e2e/common/node/container_probe.go:261 �[1mSTEP:�[0m Creating pod busybox-e6afc366-d7ec-4366-9b3a-4ff797626662 in namespace container-probe-989 �[38;5;243m01/14/23 23:10:08.603�[0m Jan 14 23:10:08.896: INFO: Waiting up to 5m0s for pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662" in namespace "container-probe-989" to be "not pending" Jan 14 23:10:09.051: INFO: Pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662": Phase="Pending", Reason="", readiness=false. Elapsed: 154.19479ms Jan 14 23:10:11.194: INFO: Pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662": Phase="Pending", Reason="", readiness=false. Elapsed: 2.297179183s Jan 14 23:10:13.193: INFO: Pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662": Phase="Pending", Reason="", readiness=false. Elapsed: 4.29673856s Jan 14 23:10:15.193: INFO: Pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662": Phase="Pending", Reason="", readiness=false. 
Elapsed: 6.296891261s Jan 14 23:10:17.194: INFO: Pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662": Phase="Pending", Reason="", readiness=false. Elapsed: 8.297234104s Jan 14 23:10:19.205: INFO: Pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662": Phase="Pending", Reason="", readiness=false. Elapsed: 10.308926905s Jan 14 23:10:21.193: INFO: Pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662": Phase="Running", Reason="", readiness=true. Elapsed: 12.296624267s Jan 14 23:10:21.193: INFO: Pod "busybox-e6afc366-d7ec-4366-9b3a-4ff797626662" satisfied condition "not pending" Jan 14 23:10:21.193: INFO: Started pod busybox-e6afc366-d7ec-4366-9b3a-4ff797626662 in namespace container-probe-989 �[1mSTEP:�[0m checking the pod's current state and verifying that restartCount is present �[38;5;243m01/14/23 23:10:21.193�[0m Jan 14 23:10:21.335: INFO: Initial restart count of pod busybox-e6afc366-d7ec-4366-9b3a-4ff797626662 is 0 Jan 14 23:11:45.163: INFO: Unexpected error: getting pod : <*url.Error | 0xc002f97d10>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/container-probe-989/pods/busybox-e6afc366-d7ec-4366-9b3a-4ff797626662", Err: <*net.OpError | 0xc002fd2910>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc003a8be90>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc002d893c0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Jan 14 23:11:45.163: FAIL: getting pod : Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/container-probe-989/pods/busybox-e6afc366-d7ec-4366-9b3a-4ff797626662": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/common/node.RunLivenessTest(0xc000441ce0, 0xc001916c00, 0x1, 0x3?) 
test/e2e/common/node/container_probe.go:910 +0x96b k8s.io/kubernetes/test/e2e/common/node.glob..func2.12() test/e2e/common/node/container_probe.go:270 +0x17a �[1mSTEP:�[0m deleting the pod �[38;5;243m01/14/23 23:11:45.163�[0m [AfterEach] [sig-node] Probing container test/e2e/framework/framework.go:187 �[1mSTEP:�[0m Collecting events from namespace "container-probe-989". �[38;5;243m01/14/23 23:11:45.163�[0m Jan 14 23:11:45.317: INFO: Unexpected error: failed to list events in namespace "container-probe-989": <*url.Error | 0xc003ac42a0>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/container-probe-989/events", Err: <*net.OpError | 0xc003ac0230>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc003b22660>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc002ec2ae0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Jan 14 23:11:45.318: FAIL: failed to list events in namespace "container-probe-989": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/container-probe-989/events": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc003b2f590, {0xc002b40180, 0x13}) test/e2e/framework/util.go:901 +0x191 k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc0005eac00}, {0xc002b40180, 0x13}) test/e2e/framework/util.go:919 +0x8d k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc000441ce0, 0x1?) test/e2e/framework/framework.go:181 +0x8b k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000441ce0) test/e2e/framework/framework.go:435 +0x21d �[1mSTEP:�[0m Destroying namespace "container-probe-989" for this suite. 
�[38;5;243m01/14/23 23:11:45.318�[0m Jan 14 23:11:45.470: FAIL: Couldn't delete ns: "container-probe-989": Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/container-probe-989": dial tcp 52.67.139.60:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/container-probe-989", Err:(*net.OpError)(0xc003a07180)}) Full Stack Trace panic({0x6ea2520, 0xc003aba780}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d panic({0x6ea4740, 0xc0003f4e00}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc003ab8120, 0x109}, {0xc003b2f048?, 0x735bfcc?, 0xc003b2f068?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197 k8s.io/kubernetes/test/e2e/framework.Fail({0xc002a53500, 0xf4}, {0xc003b2f0e0?, 0xc003abf440?, 0xc003b2f108?}) test/e2e/framework/log.go:63 +0x145 k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c34da0, 0xc003ac42a0}, {0xc002ec2b20?, 0x0?, 0x0?}) test/e2e/framework/expect.go:76 +0x267 k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) test/e2e/framework/expect.go:43 k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc003b2f590, {0xc002b40180, 0x13}) test/e2e/framework/util.go:901 +0x191 k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc0005eac00}, {0xc002b40180, 0x13}) test/e2e/framework/util.go:919 +0x8d k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc000441ce0, 0x1?) test/e2e/framework/framework.go:181 +0x8b k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000441ce0) test/e2e/framework/framework.go:435 +0x21d
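For context on what this test exercises before the apiserver outage cut it short: container_probe.go:261 creates a busybox pod whose exec liveness probe deliberately runs longer than its `timeoutSeconds`, then polls `restartCount` waiting for the kubelet to restart the container. A pod of roughly that shape looks like the following (this manifest is an illustrative reconstruction with hypothetical names, not the test's exact spec):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: liveness-timeout-demo   # illustrative name
spec:
  restartPolicy: Always
  containers:
  - name: busybox
    image: busybox
    command: ["sleep", "600"]
    livenessProbe:
      exec:
        # The probe command sleeps past timeoutSeconds, so every
        # probe attempt times out and counts as a failure.
        command: ["/bin/sh", "-c", "sleep 10"]
      initialDelaySeconds: 5
      timeoutSeconds: 1
      failureThreshold: 1
```

With exec probe timeouts enforced (the ExecProbeTimeout feature gate, on by default since v1.20), each probe attempt is killed at the 1s timeout and treated as a failure, so the kubelet restarts the container and `restartCount` climbs above its initial 0.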
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-node\]\sProbing\scontainer\sshould\smark\sreadiness\son\spods\sto\sfalse\swhile\spod\sis\sin\sprogress\sof\sterminating\swhen\sa\spod\shas\sa\sreadiness\sprobe$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605
from junit_01.xml
{"msg":"FAILED [sig-node] Probing container should mark readiness on pods to false while pod is in progress of terminating when a pod has a readiness probe","completed":1,"skipped":2,"failed":1,"failures":["[sig-node] Probing container should mark readiness on pods to false while pod is in progress of terminating when a pod has a readiness probe"]} [BeforeEach] [sig-node] Probing container test/e2e/framework/framework.go:186 �[1mSTEP:�[0m Creating a kubernetes client �[38;5;243m01/14/23 23:10:25.63�[0m Jan 14 23:10:25.630: INFO: >>> kubeConfig: /root/.kube/config �[1mSTEP:�[0m Building a namespace api object, basename container-probe �[38;5;243m01/14/23 23:10:25.631�[0m �[1mSTEP:�[0m Waiting for a default service account to be provisioned in namespace �[38;5;243m01/14/23 23:10:26.066�[0m �[1mSTEP:�[0m Waiting for kube-root-ca.crt to be provisioned in namespace �[38;5;243m01/14/23 23:10:26.349�[0m [BeforeEach] [sig-node] Probing container test/e2e/common/node/container_probe.go:59 [It] should mark readiness on pods to false while pod is in progress of terminating when a pod has a readiness probe test/e2e/common/node/container_probe.go:558 Jan 14 23:10:26.776: INFO: Waiting up to 5m0s for all pods (need at least 1) in namespace 'container-probe-205' to be running and ready Jan 14 23:10:27.202: INFO: The status of Pod probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 is Pending (Ready = false), waiting for it to be either Running (with Ready = true) or Failed Jan 14 23:10:27.202: INFO: 0 / 1 pods in namespace 'container-probe-205' are running and ready (0 seconds elapsed) Jan 14 23:10:27.202: INFO: expected 0 pod replicas in namespace 'container-probe-205', 0 are Running and Ready. 
Jan 14 23:10:27.202: INFO: POD NODE PHASE GRACE CONDITIONS Jan 14 23:10:27.202: INFO: probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 i-066162bc3bb041a75 Pending [{PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC }] Jan 14 23:10:27.202: INFO: Jan 14 23:10:29.629: INFO: The status of Pod probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 is Pending (Ready = false), waiting for it to be either Running (with Ready = true) or Failed Jan 14 23:10:29.629: INFO: 0 / 1 pods in namespace 'container-probe-205' are running and ready (2 seconds elapsed) Jan 14 23:10:29.629: INFO: expected 0 pod replicas in namespace 'container-probe-205', 0 are Running and Ready. Jan 14 23:10:29.629: INFO: POD NODE PHASE GRACE CONDITIONS Jan 14 23:10:29.629: INFO: probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 i-066162bc3bb041a75 Pending [{PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC }] Jan 14 23:10:29.629: INFO: Jan 14 23:10:31.647: INFO: The status of Pod probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 is Pending (Ready = false), waiting for it to be either Running (with Ready = true) or Failed Jan 14 23:10:31.647: INFO: 0 / 1 pods in namespace 'container-probe-205' are running and ready (4 seconds elapsed) Jan 14 23:10:31.647: INFO: expected 0 pod replicas in namespace 'container-probe-205', 0 are Running and Ready. 
Jan 14 23:10:31.647: INFO: POD NODE PHASE GRACE CONDITIONS Jan 14 23:10:31.647: INFO: probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 i-066162bc3bb041a75 Pending [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC }] Jan 14 23:10:31.647: INFO: Jan 14 23:10:33.630: INFO: The status of Pod probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 is Pending (Ready = false), waiting for it to be either Running (with Ready = true) or Failed Jan 14 23:10:33.630: INFO: 0 / 1 pods in namespace 'container-probe-205' are running and ready (6 seconds elapsed) Jan 14 23:10:33.630: INFO: expected 0 pod replicas in namespace 'container-probe-205', 0 are Running and Ready. 
Jan 14 23:10:33.630: INFO: POD NODE PHASE GRACE CONDITIONS Jan 14 23:10:33.630: INFO: probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 i-066162bc3bb041a75 Pending [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC }] Jan 14 23:10:33.630: INFO: Jan 14 23:10:35.629: INFO: The status of Pod probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 is Running (Ready = false), waiting for it to be either Running (with Ready = true) or Failed Jan 14 23:10:35.629: INFO: 0 / 1 pods in namespace 'container-probe-205' are running and ready (8 seconds elapsed) Jan 14 23:10:35.629: INFO: expected 0 pod replicas in namespace 'container-probe-205', 0 are Running and Ready. 
Jan 14 23:10:35.629: INFO: POD NODE PHASE GRACE CONDITIONS Jan 14 23:10:35.629: INFO: probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 i-066162bc3bb041a75 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC }] Jan 14 23:10:35.629: INFO: Jan 14 23:10:37.631: INFO: The status of Pod probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 is Running (Ready = false), waiting for it to be either Running (with Ready = true) or Failed Jan 14 23:10:37.631: INFO: 0 / 1 pods in namespace 'container-probe-205' are running and ready (10 seconds elapsed) Jan 14 23:10:37.631: INFO: expected 0 pod replicas in namespace 'container-probe-205', 0 are Running and Ready. 
Jan 14 23:10:37.631: INFO: POD NODE PHASE GRACE CONDITIONS Jan 14 23:10:37.631: INFO: probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 i-066162bc3bb041a75 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC }] Jan 14 23:10:37.631: INFO: Jan 14 23:10:39.629: INFO: The status of Pod probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 is Running (Ready = false), waiting for it to be either Running (with Ready = true) or Failed Jan 14 23:10:39.629: INFO: 0 / 1 pods in namespace 'container-probe-205' are running and ready (12 seconds elapsed) Jan 14 23:10:39.629: INFO: expected 0 pod replicas in namespace 'container-probe-205', 0 are Running and Ready. 
Jan 14 23:10:39.629: INFO: POD NODE PHASE GRACE CONDITIONS Jan 14 23:10:39.629: INFO: probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6 i-066162bc3bb041a75 Running [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC ContainersNotReady containers with unready status: [probe-test-930aca9f-2c89-4549-acbe-874b1e1412c6]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-01-14 23:10:26 +0000 UTC }] Jan 14 23:10:39.629: INFO: Jan 14 23:10:41.632: INFO: 1 / 1 pods in namespace 'container-probe-205' are running and ready (14 seconds elapsed) Jan 14 23:10:41.632: INFO: expected 0 pod replicas in namespace 'container-probe-205', 0 are Running and Ready. [AfterEach] [sig-node] Probing container test/e2e/framework/framework.go:187 Jan 14 23:10:52.064: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:10:52.239: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:10:54.384: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:10:56.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:10:58.394: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:00.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:02.384: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:04.384: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:06.491: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:08.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:10.384: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:12.384: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:14.394: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:16.389: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:18.389: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:20.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:22.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:24.382: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:26.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:28.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:30.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:32.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:34.382: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:36.382: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:38.383: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:40.382: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:42.382: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:44.446: INFO: Condition Ready of node i-05871d1e8f8f620dd is false instead of true. Reason: KubeletNotReady, message: node is shutting down Jan 14 23:11:44.446: INFO: Condition Ready of node i-0dcc151940a349dcc is true, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:46.395: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace �[1mSTEP:�[0m Destroying namespace "container-probe-205" for this suite. �[38;5;243m01/14/23 23:11:46.395�[0m �[1mSTEP:�[0m Collecting events from namespace "container-probe-205". 
�[38;5;243m01/14/23 23:11:46.549�[0m Jan 14 23:11:46.700: INFO: Unexpected error: failed to list events in namespace "container-probe-205": <*url.Error | 0xc002602a20>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/container-probe-205/events", Err: <*net.OpError | 0xc00326c370>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc0026029f0>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc000112160>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Jan 14 23:11:46.700: FAIL: failed to list events in namespace "container-probe-205": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/container-probe-205/events": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc003b0a278, {0xc00033f7e8, 0x13}) test/e2e/framework/util.go:901 +0x191 k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc002dee600}, {0xc00033f7e8, 0x13}) test/e2e/framework/util.go:919 +0x8d k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:402 +0x81d panic({0x6ea2520, 0xc00326ee40}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d panic({0x6ea4740, 0xc000d56bd0}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc00008c460, 0xd5}, {0xc003b0b5a8?, 0x735bfcc?, 0xc003b0b5d0?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197 k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc002dee600?}, {0xc003b0b890?, 0x7382efc?, 0xf?}) test/e2e/framework/log.go:51 +0x12c k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0009cedc0) test/e2e/framework/framework.go:483 +0xb8a
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-node\]\sSecurity\sContext\sshould\ssupport\spod\.Spec\.SecurityContext\.RunAsUser\sAnd\spod\.Spec\.SecurityContext\.RunAsGroup\s\[LinuxOnly\]\s\[Conformance\]$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605
from junit_01.xml
{"msg":"FAILED [sig-node] Security Context should support pod.Spec.SecurityContext.RunAsUser And pod.Spec.SecurityContext.RunAsGroup [LinuxOnly] [Conformance]","completed":2,"skipped":14,"failed":1,"failures":["[sig-node] Security Context should support pod.Spec.SecurityContext.RunAsUser And pod.Spec.SecurityContext.RunAsGroup [LinuxOnly] [Conformance]"]}
[BeforeEach] [sig-node] Security Context
  test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:31.546
Jan 14 23:10:31.546: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename security-context 01/14/23 23:10:31.547
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:32.009
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:32.293
[It] should support pod.Spec.SecurityContext.RunAsUser And pod.Spec.SecurityContext.RunAsGroup [LinuxOnly] [Conformance]
  test/e2e/node/security_context.go:97
STEP: Creating a pod to test pod.Spec.SecurityContext.RunAsUser 01/14/23 23:10:32.575
Jan 14 23:10:32.721: INFO: Waiting up to 5m0s for pod "security-context-de21dfe7-b543-4035-ad4e-1df6b4f0c466" in namespace "security-context-2936" to be "Succeeded or Failed"
Jan 14 23:10:32.863: INFO: Pod "security-context-de21dfe7-b543-4035-ad4e-1df6b4f0c466": Phase="Pending", Reason="", readiness=false. Elapsed: 141.585088ms
[... the same Phase="Pending" poll repeated every ~2s ...]
Jan 14 23:10:49.006: INFO: Pod "security-context-de21dfe7-b543-4035-ad4e-1df6b4f0c466": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.284577396s
STEP: Saw pod success 01/14/23 23:10:49.006
Jan 14 23:10:49.006: INFO: Pod "security-context-de21dfe7-b543-4035-ad4e-1df6b4f0c466" satisfied condition "Succeeded or Failed"
Jan 14 23:10:49.150: INFO: Trying to get logs from node i-066162bc3bb041a75 pod security-context-de21dfe7-b543-4035-ad4e-1df6b4f0c466 container test-container: <nil>
STEP: delete the pod 01/14/23 23:10:49.295
Jan 14 23:10:49.458: INFO: Waiting for pod security-context-de21dfe7-b543-4035-ad4e-1df6b4f0c466 to disappear
Jan 14 23:10:49.601: INFO: Pod security-context-de21dfe7-b543-4035-ad4e-1df6b4f0c466 no longer exists
[AfterEach] [sig-node] Security Context
  test/e2e/framework/framework.go:187
Jan 14 23:10:49.601: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:10:49.767: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}].
Failure
[... the same "Condition Ready of node i-0dcc151940a349dcc is false" message repeated every ~2s until 23:11:43 ...]
Jan 14 23:11:45.920: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
STEP: Destroying namespace "security-context-2936" for this suite.
01/14/23 23:11:45.92
STEP: Collecting events from namespace "security-context-2936". 01/14/23 23:11:46.076
Jan 14 23:11:46.228: INFO: Unexpected error: failed to list events in namespace "security-context-2936":
    <*url.Error | 0xc00347e5a0>: {
        Op: "Get",
        URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/security-context-2936/events",
        Err: <*net.OpError | 0xc001c90a00>{
            Op: "dial",
            Net: "tcp",
            Source: nil,
            Addr: <*net.TCPAddr | 0xc00280d9e0>{IP: "4C\x8b<", Port: 443, Zone: ""},
            Err: <*os.SyscallError | 0xc002b7ae40>{Syscall: "connect", Err: <syscall.Errno>0x6f},
        },
    }
Jan 14 23:11:46.228: FAIL: failed to list events in namespace "security-context-2936": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/security-context-2936/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc002c02278, {0xc001dbc270, 0x15})
	test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc00191b200}, {0xc001dbc270, 0x15})
	test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1()
	test/e2e/framework/framework.go:402 +0x81d
panic({0x6ea2520, 0xc002d306c0})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
	test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc00032ed20})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc00008c460, 0xd5}, {0xc002c035a8?, 0x735bfcc?, 0xc002c035d0?})
	test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc00191b200?}, {0xc002c03890?, 0x7389dab?, 0x10?})
	test/e2e/framework/log.go:51 +0x12c
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000f84840)
	test/e2e/framework/framework.go:483 +0xb8a
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sCSI\sVolumes\s\[Driver\:\scsi\-hostpath\]\s\[Testpattern\:\sDynamic\sPV\s\(default\sfs\)\(allowExpansion\)\]\svolume\-expand\sshould\sresize\svolume\swhen\sPVC\sis\sedited\swhile\spod\sis\susing\sit$'
test/e2e/storage/testsuites/volume_expand.go:274
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumeExpandTestSuite).DefineTests.func5()
	test/e2e/storage/testsuites/volume_expand.go:274 +0x3b3
from junit_01.xml
{"msg":"FAILED [sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Dynamic PV (default fs)(allowExpansion)] volume-expand should resize volume when PVC is edited while pod is using it","completed":0,"skipped":5,"failed":1,"failures":["[sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Dynamic PV (default fs)(allowExpansion)] volume-expand should resize volume when PVC is edited while pod is using it"]}
[BeforeEach] [Testpattern: Dynamic PV (default fs)(allowExpansion)] volume-expand
  test/e2e/storage/framework/testsuite.go:51
[BeforeEach] [Testpattern: Dynamic PV (default fs)(allowExpansion)] volume-expand
  test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:07.57
Jan 14 23:10:07.570: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename volume-expand 01/14/23 23:10:07.571
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:08.007
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:08.289
[It] should resize volume when PVC is edited while pod is using it
  test/e2e/storage/testsuites/volume_expand.go:252
STEP: Building a driver namespace object, basename volume-expand-7436 01/14/23 23:10:08.573
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:09.079
STEP: deploying csi-hostpath driver 01/14/23 23:10:09.367
Jan 14 23:10:10.184: INFO: creating *v1.ServiceAccount: volume-expand-7436-2724/csi-attacher
Jan 14 23:10:10.336: INFO: creating *v1.ClusterRole: external-attacher-runner-volume-expand-7436
Jan 14 23:10:10.336: INFO: Define cluster role external-attacher-runner-volume-expand-7436
Jan 14 23:10:10.515: INFO: creating *v1.ClusterRoleBinding: csi-attacher-role-volume-expand-7436
Jan 14 23:10:10.670: INFO: creating *v1.Role: volume-expand-7436-2724/external-attacher-cfg-volume-expand-7436
Jan 14 23:10:10.814: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-attacher-role-cfg
Jan 14 23:10:10.957: INFO: creating *v1.ServiceAccount: volume-expand-7436-2724/csi-provisioner
Jan 14 23:10:11.101: INFO: creating *v1.ClusterRole: external-provisioner-runner-volume-expand-7436
Jan 14 23:10:11.101: INFO: Define cluster role external-provisioner-runner-volume-expand-7436
Jan 14 23:10:11.246: INFO: creating *v1.ClusterRoleBinding: csi-provisioner-role-volume-expand-7436
Jan 14 23:10:11.389: INFO: creating *v1.Role: volume-expand-7436-2724/external-provisioner-cfg-volume-expand-7436
Jan 14 23:10:11.533: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-provisioner-role-cfg
Jan 14 23:10:11.677: INFO: creating *v1.ServiceAccount: volume-expand-7436-2724/csi-snapshotter
Jan 14 23:10:11.823: INFO: creating *v1.ClusterRole: external-snapshotter-runner-volume-expand-7436
Jan 14 23:10:11.823: INFO: Define cluster role external-snapshotter-runner-volume-expand-7436
Jan 14 23:10:11.966: INFO: creating *v1.ClusterRoleBinding: csi-snapshotter-role-volume-expand-7436
Jan 14 23:10:12.109: INFO: creating *v1.Role: volume-expand-7436-2724/external-snapshotter-leaderelection-volume-expand-7436
Jan 14 23:10:12.251: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/external-snapshotter-leaderelection
Jan 14 23:10:12.395: INFO: creating *v1.ServiceAccount: volume-expand-7436-2724/csi-external-health-monitor-controller
Jan 14 23:10:12.540: INFO: creating *v1.ClusterRole: external-health-monitor-controller-runner-volume-expand-7436
Jan 14 23:10:12.540: INFO: Define cluster role external-health-monitor-controller-runner-volume-expand-7436
Jan 14 23:10:12.685: INFO: creating *v1.ClusterRoleBinding: csi-external-health-monitor-controller-role-volume-expand-7436
Jan 14 23:10:12.828: INFO: creating *v1.Role: volume-expand-7436-2724/external-health-monitor-controller-cfg-volume-expand-7436
Jan 14 23:10:12.972: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-external-health-monitor-controller-role-cfg
Jan 14 23:10:13.116: INFO: creating *v1.ServiceAccount: volume-expand-7436-2724/csi-resizer
Jan 14 23:10:13.259: INFO: creating *v1.ClusterRole: external-resizer-runner-volume-expand-7436
Jan 14 23:10:13.259: INFO: Define cluster role external-resizer-runner-volume-expand-7436
Jan 14 23:10:13.402: INFO: creating *v1.ClusterRoleBinding: csi-resizer-role-volume-expand-7436
Jan 14 23:10:13.545: INFO: creating *v1.Role: volume-expand-7436-2724/external-resizer-cfg-volume-expand-7436
Jan 14 23:10:13.689: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-resizer-role-cfg
Jan 14 23:10:13.832: INFO: creating *v1.CSIDriver: csi-hostpath-volume-expand-7436
Jan 14 23:10:13.975: INFO: creating *v1.ServiceAccount: volume-expand-7436-2724/csi-hostpathplugin-sa
Jan 14 23:10:14.118: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-attacher-cluster-role-volume-expand-7436
Jan 14 23:10:14.262: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-health-monitor-controller-cluster-role-volume-expand-7436
Jan 14 23:10:14.407: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-provisioner-cluster-role-volume-expand-7436
Jan 14 23:10:14.551: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-resizer-cluster-role-volume-expand-7436
Jan 14 23:10:14.694: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-snapshotter-cluster-role-volume-expand-7436
Jan 14 23:10:14.841: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-attacher-role
Jan 14 23:10:14.985: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-health-monitor-controller-role
Jan 14 23:10:15.127: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-provisioner-role
Jan 14 23:10:15.271: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-resizer-role
Jan 14 23:10:15.414: INFO: creating *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-snapshotter-role
Jan 14 23:10:15.557: INFO: creating *v1.StatefulSet: volume-expand-7436-2724/csi-hostpathplugin
Jan 14 23:10:15.708: INFO: creating *v1.ClusterRoleBinding: psp-csi-hostpath-role-volume-expand-7436
Jan 14 23:10:15.860: INFO: Creating resource for dynamic PV
Jan 14 23:10:15.860: INFO: Using claimSize:1Gi, test suite supported size:{ 1Gi}, driver(csi-hostpath) supported size:{ 1Gi}
STEP: creating a StorageClass volume-expand-7436thb98 01/14/23 23:10:15.86
STEP: creating a claim 01/14/23 23:10:16.009
Jan 14 23:10:16.009: INFO: Warning: Making PVC: VolumeMode specified as invalid empty string, treating as nil
Jan 14 23:10:16.163: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [csi-hostpathrtt9v] to have phase Bound
Jan 14 23:10:16.305: INFO: PersistentVolumeClaim csi-hostpathrtt9v found but phase is Pending instead of Bound.
[... the same "found but phase is Pending" message repeated every ~2s until 23:10:35.593 ...]
Jan 14 23:10:37.736: INFO: PersistentVolumeClaim csi-hostpathrtt9v found but phase is Pending instead of Bound.
Jan 14 23:10:39.878: INFO: PersistentVolumeClaim csi-hostpathrtt9v found and phase=Bound (23.715286142s)
STEP: Creating a pod with dynamically provisioned volume 01/14/23 23:10:40.167
Jan 14 23:10:40.331: INFO: Waiting up to 5m0s for pod "pod-340be748-7fa4-4166-8def-3fb1a6bde81f" in namespace "volume-expand-7436" to be "running"
Jan 14 23:10:40.480: INFO: Pod "pod-340be748-7fa4-4166-8def-3fb1a6bde81f": Phase="Pending", Reason="", readiness=false. Elapsed: 149.282064ms
[... the same Phase="Pending" poll repeated every ~2s through 23:10:50.628 ...]
Jan 14 23:10:52.643: INFO: Pod "pod-340be748-7fa4-4166-8def-3fb1a6bde81f": Phase="Succeeded", Reason="", readiness=false. Elapsed: 12.312056342s
Jan 14 23:10:52.643: INFO: Error evaluating pod condition running: pod ran to completion
[... the same Phase="Succeeded" / "pod ran to completion" pair repeated every ~2s until 23:11:40 ...]
Jan 14 23:11:42.624: INFO: Pod "pod-340be748-7fa4-4166-8def-3fb1a6bde81f": Phase="Succeeded", Reason="", readiness=false.
Elapsed: 1m2.293088823s
Jan 14 23:11:42.624: INFO: Error evaluating pod condition running: pod ran to completion
Jan 14 23:11:44.669: INFO: Encountered non-retryable error while getting pod volume-expand-7436/pod-340be748-7fa4-4166-8def-3fb1a6bde81f: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/pods/pod-340be748-7fa4-4166-8def-3fb1a6bde81f": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:11:44.670: INFO: Unexpected error: While creating pods for resizing:
    <*errors.errorString | 0xc000f9aa80>: {
        s: "pod \"pod-340be748-7fa4-4166-8def-3fb1a6bde81f\" is not Running: error while waiting for pod volume-expand-7436/pod-340be748-7fa4-4166-8def-3fb1a6bde81f to be running: Get \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/pods/pod-340be748-7fa4-4166-8def-3fb1a6bde81f\": dial tcp 52.67.139.60:443: connect: connection refused",
    }
Jan 14 23:11:44.670: FAIL: While creating pods for resizing: pod "pod-340be748-7fa4-4166-8def-3fb1a6bde81f" is not Running: error while waiting for pod volume-expand-7436/pod-340be748-7fa4-4166-8def-3fb1a6bde81f to be running: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/pods/pod-340be748-7fa4-4166-8def-3fb1a6bde81f": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumeExpandTestSuite).DefineTests.func5()
	test/e2e/storage/testsuites/volume_expand.go:274 +0x3b3
Jan 14 23:11:44.670: INFO: Deleting pod "pod-340be748-7fa4-4166-8def-3fb1a6bde81f" in namespace "volume-expand-7436"
Jan 14 23:11:44.842: INFO: Unexpected error: while cleaning up pod already deleted in resize test:
    <*errors.errorString | 0xc000f7ad30>: {
        s: "pod Delete API error: Delete \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/pods/pod-340be748-7fa4-4166-8def-3fb1a6bde81f\": dial tcp 52.67.139.60:443: connect: connection refused",
    }
Jan 14 23:11:44.842: FAIL: while cleaning up pod already deleted in resize test: pod Delete API error: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/pods/pod-340be748-7fa4-4166-8def-3fb1a6bde81f": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumeExpandTestSuite).DefineTests.func5.1()
	test/e2e/storage/testsuites/volume_expand.go:272 +0xae
panic({0x6ea2520, 0xc00383ed00})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
	test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc00384bc00})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc000694380, 0x1bd}, {0xc00398fa88?, 0x735bfcc?, 0xc00398faa8?})
	test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc0006941c0, 0x1a8}, {0xc00398fb20?, 0xc0009b44e0?, 0xc00398fb48?})
	test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c2aa20, 0xc000f9aa80}, {0xc000f9aa90?, 0x0?, 0x0?})
	test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
	test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumeExpandTestSuite).DefineTests.func5()
	test/e2e/storage/testsuites/volume_expand.go:274 +0x3b3
STEP: Deleting pod 01/14/23 23:11:44.842
Jan 14 23:11:44.842: INFO: Deleting pod "pod-340be748-7fa4-4166-8def-3fb1a6bde81f" in namespace "volume-expand-7436"
STEP: Deleting pvc 01/14/23 23:11:44.995
Jan 14 23:11:44.995: INFO: Deleting PersistentVolumeClaim "csi-hostpathrtt9v"
Jan 14 23:11:45.147: INFO: Waiting up to 5m0s for PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e to get deleted
Jan 14 23:11:45.301: INFO: Get persistent volume pvc-50ff77df-1350-4209-9c21-5d841517fd0e in failed, ignoring for 5s: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/persistentvolumes/pvc-50ff77df-1350-4209-9c21-5d841517fd0e": dial tcp 52.67.139.60:443: connect: connection refused
[... the same "Get persistent volume ... ignoring for 5s" message repeated at 23:11:50.458, 23:11:55.610, and 23:12:00.760 ...]
ERROR: get pod list in volume-expand-7436-2724: Get
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:05.913: INFO: Get persistent volume pvc-50ff77df-1350-4209-9c21-5d841517fd0e in failed, ignoring for 5s: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/persistentvolumes/pvc-50ff77df-1350-4209-9c21-5d841517fd0e": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:11.066: INFO: Get persistent volume pvc-50ff77df-1350-4209-9c21-5d841517fd0e in failed, ignoring for 5s: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/persistentvolumes/pvc-50ff77df-1350-4209-9c21-5d841517fd0e": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused ERROR: get pod list in volume-expand-7436-2724: Get 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436-2724/pods": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.482: INFO: Get persistent volume pvc-50ff77df-1350-4209-9c21-5d841517fd0e in failed, ignoring for 5s: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/persistentvolumes/pvc-50ff77df-1350-4209-9c21-5d841517fd0e": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:36.635: INFO: Get persistent volume pvc-50ff77df-1350-4209-9c21-5d841517fd0e in failed, ignoring for 5s: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/persistentvolumes/pvc-50ff77df-1350-4209-9c21-5d841517fd0e": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:42.386: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (57.238463866s) Jan 14 23:12:47.529: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m2.382063543s) Jan 14 23:12:52.675: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m7.527757358s) Jan 14 23:12:57.824: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m12.676287477s) Jan 14 23:13:02.972: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m17.824415964s) Jan 14 23:13:08.115: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m22.967586384s) Jan 14 23:13:13.260: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m28.112317753s) Jan 14 23:13:18.414: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m33.266306657s) Jan 14 23:13:23.557: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m38.409653366s) Jan 14 23:13:28.730: INFO: 
PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m43.582515229s) Jan 14 23:13:33.872: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m48.724409959s) Jan 14 23:13:39.014: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m53.866871249s) Jan 14 23:13:44.157: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (1m59.009433966s) Jan 14 23:13:49.299: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m4.151847519s) Jan 14 23:13:54.446: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m9.298492116s) Jan 14 23:13:59.589: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m14.442060316s) Jan 14 23:14:04.732: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m19.584387936s) Jan 14 23:14:09.875: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m24.727327792s) Jan 14 23:14:15.018: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m29.87028502s) Jan 14 23:14:20.161: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m35.013410869s) Jan 14 23:14:25.304: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m40.156503341s) Jan 14 23:14:30.446: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m45.298715192s) Jan 14 23:14:35.588: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m50.440647439s) Jan 14 23:14:40.733: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (2m55.585234183s) Jan 14 23:14:45.876: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m0.728183099s) Jan 14 23:14:51.017: INFO: 
PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m5.869885012s) Jan 14 23:14:56.160: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m11.012514359s) Jan 14 23:15:01.302: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m16.154712581s) Jan 14 23:15:06.445: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m21.297784908s) Jan 14 23:15:11.588: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m26.440528738s) Jan 14 23:15:16.733: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m31.585425905s) Jan 14 23:15:21.876: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m36.728722779s) Jan 14 23:15:27.019: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m41.871389745s) Jan 14 23:15:32.163: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m47.0153138s) Jan 14 23:15:37.305: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m52.157978995s) Jan 14 23:15:42.448: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (3m57.300690248s) Jan 14 23:15:47.591: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m2.443731165s) Jan 14 23:15:52.734: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m7.586708719s) Jan 14 23:15:57.876: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m12.729175894s) Jan 14 23:16:03.019: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m17.871870763s) Jan 14 23:16:08.162: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m23.014398505s) Jan 14 23:16:13.304: INFO: 
PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m28.156831517s)
Jan 14 23:16:18.446: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m33.299003656s)
Jan 14 23:16:23.589: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m38.441617917s)
Jan 14 23:16:28.731: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m43.584141693s)
Jan 14 23:16:33.873: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m48.725889304s)
Jan 14 23:16:39.016: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m53.869152592s)
Jan 14 23:16:44.160: INFO: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e found and phase=Bound (4m59.012810646s)
STEP: Deleting sc 01/14/23 23:16:49.161
STEP: deleting the test namespace: volume-expand-7436 01/14/23 23:16:49.306
STEP: Waiting for namespaces [volume-expand-7436] to vanish 01/14/23 23:16:49.45
STEP: uninstalling csi csi-hostpath driver 01/14/23 23:17:01.596
Jan 14 23:17:01.596: INFO: deleting *v1.ServiceAccount: volume-expand-7436-2724/csi-attacher
Jan 14 23:17:01.740: INFO: deleting *v1.ClusterRole: external-attacher-runner-volume-expand-7436
Jan 14 23:17:01.885: INFO: deleting *v1.ClusterRoleBinding: csi-attacher-role-volume-expand-7436
Jan 14 23:17:02.040: INFO: deleting *v1.Role: volume-expand-7436-2724/external-attacher-cfg-volume-expand-7436
Jan 14 23:17:02.184: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-attacher-role-cfg
Jan 14 23:17:02.327: INFO: deleting *v1.ServiceAccount: volume-expand-7436-2724/csi-provisioner
Jan 14 23:17:02.471: INFO: deleting *v1.ClusterRole: external-provisioner-runner-volume-expand-7436
Jan 14 23:17:02.619: INFO: deleting *v1.ClusterRoleBinding: csi-provisioner-role-volume-expand-7436
Jan
14 23:17:02.762: INFO: deleting *v1.Role: volume-expand-7436-2724/external-provisioner-cfg-volume-expand-7436 Jan 14 23:17:02.905: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-provisioner-role-cfg Jan 14 23:17:03.055: INFO: deleting *v1.ServiceAccount: volume-expand-7436-2724/csi-snapshotter Jan 14 23:17:03.199: INFO: deleting *v1.ClusterRole: external-snapshotter-runner-volume-expand-7436 Jan 14 23:17:03.346: INFO: deleting *v1.ClusterRoleBinding: csi-snapshotter-role-volume-expand-7436 Jan 14 23:17:03.491: INFO: deleting *v1.Role: volume-expand-7436-2724/external-snapshotter-leaderelection-volume-expand-7436 Jan 14 23:17:03.634: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/external-snapshotter-leaderelection Jan 14 23:17:03.778: INFO: deleting *v1.ServiceAccount: volume-expand-7436-2724/csi-external-health-monitor-controller Jan 14 23:17:03.928: INFO: deleting *v1.ClusterRole: external-health-monitor-controller-runner-volume-expand-7436 Jan 14 23:17:04.072: INFO: deleting *v1.ClusterRoleBinding: csi-external-health-monitor-controller-role-volume-expand-7436 Jan 14 23:17:04.215: INFO: deleting *v1.Role: volume-expand-7436-2724/external-health-monitor-controller-cfg-volume-expand-7436 Jan 14 23:17:04.358: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-external-health-monitor-controller-role-cfg Jan 14 23:17:04.501: INFO: deleting *v1.ServiceAccount: volume-expand-7436-2724/csi-resizer Jan 14 23:17:04.647: INFO: deleting *v1.ClusterRole: external-resizer-runner-volume-expand-7436 Jan 14 23:17:04.790: INFO: deleting *v1.ClusterRoleBinding: csi-resizer-role-volume-expand-7436 Jan 14 23:17:04.934: INFO: deleting *v1.Role: volume-expand-7436-2724/external-resizer-cfg-volume-expand-7436 Jan 14 23:17:05.079: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-resizer-role-cfg Jan 14 23:17:05.222: INFO: deleting *v1.CSIDriver: csi-hostpath-volume-expand-7436 Jan 14 23:17:05.366: INFO: deleting *v1.ServiceAccount: 
volume-expand-7436-2724/csi-hostpathplugin-sa
Jan 14 23:17:05.509: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-attacher-cluster-role-volume-expand-7436
Jan 14 23:17:05.653: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-health-monitor-controller-cluster-role-volume-expand-7436
Jan 14 23:17:05.796: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-provisioner-cluster-role-volume-expand-7436
Jan 14 23:17:05.940: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-resizer-cluster-role-volume-expand-7436
Jan 14 23:17:06.085: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-snapshotter-cluster-role-volume-expand-7436
Jan 14 23:17:06.228: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-attacher-role
Jan 14 23:17:06.377: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-health-monitor-controller-role
Jan 14 23:17:06.522: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-provisioner-role
Jan 14 23:17:06.667: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-resizer-role
Jan 14 23:17:06.810: INFO: deleting *v1.RoleBinding: volume-expand-7436-2724/csi-hostpathplugin-snapshotter-role
Jan 14 23:17:06.953: INFO: deleting *v1.StatefulSet: volume-expand-7436-2724/csi-hostpathplugin
Jan 14 23:17:07.096: INFO: deleting *v1.ClusterRoleBinding: psp-csi-hostpath-role-volume-expand-7436
STEP: deleting the driver namespace: volume-expand-7436-2724 01/14/23 23:17:07.24
STEP: Waiting for namespaces [volume-expand-7436-2724] to vanish 01/14/23 23:17:07.387
Jan 14 23:17:13.532: INFO: Unexpected error: while cleaning up resource: <errors.aggregate | len:2, cap:2>: [ <*errors.errorString | 0xc000f2b010>{ s: "pod Delete API error: Delete
\"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/pods/pod-340be748-7fa4-4166-8def-3fb1a6bde81f\": dial tcp 52.67.139.60:443: connect: connection refused", }, <errors.aggregate | len:2, cap:2>[ <*fmt.wrapError | 0xc0032cd000>{ msg: "failed to delete PVC csi-hostpathrtt9v: PVC Delete API error: Delete \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/persistentvolumeclaims/csi-hostpathrtt9v\": dial tcp 52.67.139.60:443: connect: connection refused", err: <*errors.errorString | 0xc000f2b350>{ s: "PVC Delete API error: Delete \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/persistentvolumeclaims/csi-hostpathrtt9v\": dial tcp 52.67.139.60:443: connect: connection refused", }, }, <*fmt.wrapError | 0xc0012b7180>{ msg: "persistent Volume pvc-50ff77df-1350-4209-9c21-5d841517fd0e not deleted by dynamic provisioner: PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e still exists within 5m0s", err: <*errors.errorString | 0xc0012588d0>{ s: "PersistentVolume pvc-50ff77df-1350-4209-9c21-5d841517fd0e still exists within 5m0s", }, }, ], ] Jan 14 23:17:13.532: FAIL: while cleaning up resource: [pod Delete API error: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/pods/pod-340be748-7fa4-4166-8def-3fb1a6bde81f": dial tcp 52.67.139.60:443: connect: connection refused, failed to delete PVC csi-hostpathrtt9v: PVC Delete API error: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-expand-7436/persistentvolumeclaims/csi-hostpathrtt9v": dial tcp 52.67.139.60:443: connect: connection refused, persistent Volume pvc-50ff77df-1350-4209-9c21-5d841517fd0e not deleted by dynamic provisioner: PersistentVolume 
pvc-50ff77df-1350-4209-9c21-5d841517fd0e still exists within 5m0s]
Full Stack Trace
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumeExpandTestSuite).DefineTests.func2()
	test/e2e/storage/testsuites/volume_expand.go:154 +0x49a
panic({0x6ea2520, 0xc001dab500})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
	test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc000a7b180})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc0000fcdc0, 0x144}, {0xc00398f418?, 0x735bfcc?, 0xc00398f438?})
	test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc001f8e140, 0x12f}, {0xc00398f4b0?, 0xc00002b900?, 0xc00398f4d8?})
	test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c2aa20, 0xc000f7ad30}, {0xc000f7ad40?, 0x2723bec?, 0x0?})
	test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
	test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumeExpandTestSuite).DefineTests.func5.1()
	test/e2e/storage/testsuites/volume_expand.go:272 +0xae
panic({0x6ea2520, 0xc00383ed00})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
	test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc00384bc00})
	/usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc000694380, 0x1bd}, {0xc00398fa88?, 0x735bfcc?, 0xc00398faa8?})
	test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc0006941c0, 0x1a8}, {0xc00398fb20?, 0xc0009b44e0?, 0xc00398fb48?})
	test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c2aa20, 0xc000f9aa80}, {0xc000f9aa90?, 0x0?, 0x0?})
	test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
	test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/storage/testsuites.(*volumeExpandTestSuite).DefineTests.func5()
	test/e2e/storage/testsuites/volume_expand.go:274 +0x3b3
[AfterEach] [Testpattern: Dynamic PV (default fs)(allowExpansion)] volume-expand
	test/e2e/framework/framework.go:187
Jan 14 23:17:13.533: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sCSI\sVolumes\s\[Driver\:\scsi\-hostpath\]\s\[Testpattern\:\sDynamic\sPV\s\(default\sfs\)\]\ssubPath\sshould\ssupport\sreadOnly\sfile\sspecified\sin\sthe\svolumeMount\s\[LinuxOnly\]$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605 from junit_01.xml
{"msg":"FAILED [sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Dynamic PV (default fs)] subPath should support readOnly file specified in the volumeMount [LinuxOnly]","completed":1,"skipped":1,"failed":1,"failures":["[sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Dynamic PV (default fs)] subPath should support readOnly file specified in the volumeMount [LinuxOnly]"]} [BeforeEach] [Testpattern: Dynamic PV (default fs)] subPath test/e2e/storage/framework/testsuite.go:51 [BeforeEach] [Testpattern: Dynamic PV (default fs)] subPath test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:10:11.012 Jan 14 23:10:11.012: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename provisioning 01/14/23 23:10:11.013 STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:11.441 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:11.723 [It] should support readOnly file specified in the volumeMount [LinuxOnly] test/e2e/storage/testsuites/subpath.go:381 STEP: Building a driver namespace object, basename provisioning-9420 01/14/23 23:10:12.004 STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:12.435 STEP: deploying csi-hostpath driver 01/14/23 23:10:12.718 Jan 14 23:10:13.294: INFO: creating *v1.ServiceAccount: provisioning-9420-965/csi-attacher Jan 14 23:10:13.437: INFO: creating *v1.ClusterRole: external-attacher-runner-provisioning-9420 Jan 14 23:10:13.437: INFO: Define cluster role external-attacher-runner-provisioning-9420 Jan 14 23:10:13.580: INFO: creating *v1.ClusterRoleBinding: csi-attacher-role-provisioning-9420 Jan 14 23:10:13.723: INFO: creating *v1.Role: provisioning-9420-965/external-attacher-cfg-provisioning-9420 Jan 14 
23:10:13.866: INFO: creating *v1.RoleBinding: provisioning-9420-965/csi-attacher-role-cfg Jan 14 23:10:14.010: INFO: creating *v1.ServiceAccount: provisioning-9420-965/csi-provisioner Jan 14 23:10:14.155: INFO: creating *v1.ClusterRole: external-provisioner-runner-provisioning-9420 Jan 14 23:10:14.155: INFO: Define cluster role external-provisioner-runner-provisioning-9420 Jan 14 23:10:14.302: INFO: creating *v1.ClusterRoleBinding: csi-provisioner-role-provisioning-9420 Jan 14 23:10:14.445: INFO: creating *v1.Role: provisioning-9420-965/external-provisioner-cfg-provisioning-9420 Jan 14 23:10:14.596: INFO: creating *v1.RoleBinding: provisioning-9420-965/csi-provisioner-role-cfg Jan 14 23:10:14.739: INFO: creating *v1.ServiceAccount: provisioning-9420-965/csi-snapshotter Jan 14 23:10:14.887: INFO: creating *v1.ClusterRole: external-snapshotter-runner-provisioning-9420 Jan 14 23:10:14.887: INFO: Define cluster role external-snapshotter-runner-provisioning-9420 Jan 14 23:10:15.030: INFO: creating *v1.ClusterRoleBinding: csi-snapshotter-role-provisioning-9420 Jan 14 23:10:15.173: INFO: creating *v1.Role: provisioning-9420-965/external-snapshotter-leaderelection-provisioning-9420 Jan 14 23:10:15.317: INFO: creating *v1.RoleBinding: provisioning-9420-965/external-snapshotter-leaderelection Jan 14 23:10:15.464: INFO: creating *v1.ServiceAccount: provisioning-9420-965/csi-external-health-monitor-controller Jan 14 23:10:15.607: INFO: creating *v1.ClusterRole: external-health-monitor-controller-runner-provisioning-9420 Jan 14 23:10:15.607: INFO: Define cluster role external-health-monitor-controller-runner-provisioning-9420 Jan 14 23:10:15.753: INFO: creating *v1.ClusterRoleBinding: csi-external-health-monitor-controller-role-provisioning-9420 Jan 14 23:10:15.896: INFO: creating *v1.Role: provisioning-9420-965/external-health-monitor-controller-cfg-provisioning-9420 Jan 14 23:10:16.044: INFO: creating *v1.RoleBinding: 
provisioning-9420-965/csi-external-health-monitor-controller-role-cfg Jan 14 23:10:16.189: INFO: creating *v1.ServiceAccount: provisioning-9420-965/csi-resizer Jan 14 23:10:16.332: INFO: creating *v1.ClusterRole: external-resizer-runner-provisioning-9420 Jan 14 23:10:16.332: INFO: Define cluster role external-resizer-runner-provisioning-9420 Jan 14 23:10:16.481: INFO: creating *v1.ClusterRoleBinding: csi-resizer-role-provisioning-9420 Jan 14 23:10:16.625: INFO: creating *v1.Role: provisioning-9420-965/external-resizer-cfg-provisioning-9420 Jan 14 23:10:16.768: INFO: creating *v1.RoleBinding: provisioning-9420-965/csi-resizer-role-cfg Jan 14 23:10:16.911: INFO: creating *v1.CSIDriver: csi-hostpath-provisioning-9420 Jan 14 23:10:17.054: INFO: creating *v1.ServiceAccount: provisioning-9420-965/csi-hostpathplugin-sa Jan 14 23:10:17.198: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-attacher-cluster-role-provisioning-9420 Jan 14 23:10:17.342: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-health-monitor-controller-cluster-role-provisioning-9420 Jan 14 23:10:17.485: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-provisioner-cluster-role-provisioning-9420 Jan 14 23:10:17.630: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-resizer-cluster-role-provisioning-9420 Jan 14 23:10:17.773: INFO: creating *v1.ClusterRoleBinding: csi-hostpathplugin-snapshotter-cluster-role-provisioning-9420 Jan 14 23:10:17.916: INFO: creating *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-attacher-role Jan 14 23:10:18.059: INFO: creating *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-health-monitor-controller-role Jan 14 23:10:18.202: INFO: creating *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-provisioner-role Jan 14 23:10:18.348: INFO: creating *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-resizer-role Jan 14 23:10:18.491: INFO: creating *v1.RoleBinding: 
provisioning-9420-965/csi-hostpathplugin-snapshotter-role Jan 14 23:10:18.634: INFO: creating *v1.StatefulSet: provisioning-9420-965/csi-hostpathplugin Jan 14 23:10:18.780: INFO: creating *v1.ClusterRoleBinding: psp-csi-hostpath-role-provisioning-9420 Jan 14 23:10:18.934: INFO: Creating resource for dynamic PV Jan 14 23:10:18.934: INFO: Using claimSize:1Mi, test suite supported size:{ 1Mi}, driver(csi-hostpath) supported size:{ 1Mi} STEP: creating a StorageClass provisioning-9420xknpw 01/14/23 23:10:18.934 STEP: creating a claim 01/14/23 23:10:19.078 Jan 14 23:10:19.078: INFO: Warning: Making PVC: VolumeMode specified as invalid empty string, treating as nil Jan 14 23:10:19.259: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [csi-hostpathn4xnd] to have phase Bound Jan 14 23:10:19.402: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:21.544: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:23.686: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:25.832: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:27.974: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:30.117: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:32.262: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:34.403: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:36.546: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:38.707: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. 
Jan 14 23:10:40.850: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:42.993: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:45.141: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:47.284: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:49.427: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:51.571: INFO: PersistentVolumeClaim csi-hostpathn4xnd found but phase is Pending instead of Bound. Jan 14 23:10:53.713: INFO: PersistentVolumeClaim csi-hostpathn4xnd found and phase=Bound (34.454094837s) STEP: Creating pod pod-subpath-test-dynamicpv-wpmv 01/14/23 23:10:53.996 STEP: Creating a pod to test subpath 01/14/23 23:10:53.996 Jan 14 23:10:54.148: INFO: Waiting up to 5m0s for pod "pod-subpath-test-dynamicpv-wpmv" in namespace "provisioning-9420" to be "Succeeded or Failed" Jan 14 23:10:54.290: INFO: Pod "pod-subpath-test-dynamicpv-wpmv": Phase="Pending", Reason="", readiness=false. Elapsed: 142.232009ms Jan 14 23:10:56.433: INFO: Pod "pod-subpath-test-dynamicpv-wpmv": Phase="Pending", Reason="", readiness=false. Elapsed: 2.285238081s Jan 14 23:10:58.433: INFO: Pod "pod-subpath-test-dynamicpv-wpmv": Phase="Pending", Reason="", readiness=false. Elapsed: 4.284512504s Jan 14 23:11:00.434: INFO: Pod "pod-subpath-test-dynamicpv-wpmv": Phase="Pending", Reason="", readiness=false. Elapsed: 6.285927992s Jan 14 23:11:02.433: INFO: Pod "pod-subpath-test-dynamicpv-wpmv": Phase="Pending", Reason="", readiness=false. Elapsed: 8.285138637s Jan 14 23:11:04.444: INFO: Pod "pod-subpath-test-dynamicpv-wpmv": Phase="Pending", Reason="", readiness=false. 
Elapsed: 10.295747621s Jan 14 23:11:06.477: INFO: Pod "pod-subpath-test-dynamicpv-wpmv": Phase="Pending", Reason="", readiness=false. Elapsed: 12.328636129s Jan 14 23:11:08.434: INFO: Pod "pod-subpath-test-dynamicpv-wpmv": Phase="Succeeded", Reason="", readiness=false. Elapsed: 14.286139478s STEP: Saw pod success 01/14/23 23:11:08.434 Jan 14 23:11:08.434: INFO: Pod "pod-subpath-test-dynamicpv-wpmv" satisfied condition "Succeeded or Failed" Jan 14 23:11:08.577: INFO: Trying to get logs from node i-0fcd1e4b56ac2b41b pod pod-subpath-test-dynamicpv-wpmv container test-container-subpath-dynamicpv-wpmv: <nil> STEP: delete the pod 01/14/23 23:11:08.722 Jan 14 23:11:08.871: INFO: Waiting for pod pod-subpath-test-dynamicpv-wpmv to disappear Jan 14 23:11:09.013: INFO: Pod pod-subpath-test-dynamicpv-wpmv no longer exists STEP: Deleting pod pod-subpath-test-dynamicpv-wpmv 01/14/23 23:11:09.013 Jan 14 23:11:09.013: INFO: Deleting pod "pod-subpath-test-dynamicpv-wpmv" in namespace "provisioning-9420" STEP: Deleting pod 01/14/23 23:11:09.157 Jan 14 23:11:09.157: INFO: Deleting pod "pod-subpath-test-dynamicpv-wpmv" in namespace "provisioning-9420" STEP: Deleting pvc 01/14/23 23:11:09.299 Jan 14 23:11:09.299: INFO: Deleting PersistentVolumeClaim "csi-hostpathn4xnd" Jan 14 23:11:09.447: INFO: Waiting up to 5m0s for PersistentVolume pvc-03631b47-0750-48b7-8384-c5c7e415e1a6 to get deleted Jan 14 23:11:09.589: INFO: PersistentVolume pvc-03631b47-0750-48b7-8384-c5c7e415e1a6 was removed STEP: Deleting sc 01/14/23 23:11:09.589 STEP: deleting the test namespace: provisioning-9420 01/14/23 23:11:09.735 STEP: Waiting for namespaces [provisioning-9420] to vanish 01/14/23 23:11:09.884 STEP: uninstalling csi csi-hostpath driver 01/14/23 23:11:16.029 Jan 14 23:11:16.029: INFO: deleting 
*v1.ServiceAccount: provisioning-9420-965/csi-attacher Jan 14 23:11:16.173: INFO: deleting *v1.ClusterRole: external-attacher-runner-provisioning-9420 Jan 14 23:11:16.319: INFO: deleting *v1.ClusterRoleBinding: csi-attacher-role-provisioning-9420 Jan 14 23:11:16.469: INFO: deleting *v1.Role: provisioning-9420-965/external-attacher-cfg-provisioning-9420 Jan 14 23:11:16.615: INFO: deleting *v1.RoleBinding: provisioning-9420-965/csi-attacher-role-cfg Jan 14 23:11:16.759: INFO: deleting *v1.ServiceAccount: provisioning-9420-965/csi-provisioner Jan 14 23:11:16.905: INFO: deleting *v1.ClusterRole: external-provisioner-runner-provisioning-9420 Jan 14 23:11:17.052: INFO: deleting *v1.ClusterRoleBinding: csi-provisioner-role-provisioning-9420 Jan 14 23:11:17.197: INFO: deleting *v1.Role: provisioning-9420-965/external-provisioner-cfg-provisioning-9420 Jan 14 23:11:17.342: INFO: deleting *v1.RoleBinding: provisioning-9420-965/csi-provisioner-role-cfg Jan 14 23:11:17.487: INFO: deleting *v1.ServiceAccount: provisioning-9420-965/csi-snapshotter Jan 14 23:11:17.631: INFO: deleting *v1.ClusterRole: external-snapshotter-runner-provisioning-9420 Jan 14 23:11:17.780: INFO: deleting *v1.ClusterRoleBinding: csi-snapshotter-role-provisioning-9420 Jan 14 23:11:17.927: INFO: deleting *v1.Role: provisioning-9420-965/external-snapshotter-leaderelection-provisioning-9420 Jan 14 23:11:18.073: INFO: deleting *v1.RoleBinding: provisioning-9420-965/external-snapshotter-leaderelection Jan 14 23:11:18.219: INFO: deleting *v1.ServiceAccount: provisioning-9420-965/csi-external-health-monitor-controller Jan 14 23:11:18.364: INFO: deleting *v1.ClusterRole: external-health-monitor-controller-runner-provisioning-9420 Jan 14 23:11:18.513: INFO: deleting *v1.ClusterRoleBinding: csi-external-health-monitor-controller-role-provisioning-9420 Jan 14 23:11:18.662: INFO: deleting *v1.Role: provisioning-9420-965/external-health-monitor-controller-cfg-provisioning-9420 Jan 14 23:11:18.826: INFO: deleting 
*v1.RoleBinding: provisioning-9420-965/csi-external-health-monitor-controller-role-cfg Jan 14 23:11:18.971: INFO: deleting *v1.ServiceAccount: provisioning-9420-965/csi-resizer Jan 14 23:11:19.117: INFO: deleting *v1.ClusterRole: external-resizer-runner-provisioning-9420 Jan 14 23:11:19.262: INFO: deleting *v1.ClusterRoleBinding: csi-resizer-role-provisioning-9420 Jan 14 23:11:19.410: INFO: deleting *v1.Role: provisioning-9420-965/external-resizer-cfg-provisioning-9420 Jan 14 23:11:19.555: INFO: deleting *v1.RoleBinding: provisioning-9420-965/csi-resizer-role-cfg Jan 14 23:11:19.703: INFO: deleting *v1.CSIDriver: csi-hostpath-provisioning-9420 Jan 14 23:11:19.851: INFO: deleting *v1.ServiceAccount: provisioning-9420-965/csi-hostpathplugin-sa Jan 14 23:11:20.001: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-attacher-cluster-role-provisioning-9420 Jan 14 23:11:20.148: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-health-monitor-controller-cluster-role-provisioning-9420 Jan 14 23:11:20.292: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-provisioner-cluster-role-provisioning-9420 Jan 14 23:11:20.439: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-resizer-cluster-role-provisioning-9420 Jan 14 23:11:20.588: INFO: deleting *v1.ClusterRoleBinding: csi-hostpathplugin-snapshotter-cluster-role-provisioning-9420 Jan 14 23:11:20.733: INFO: deleting *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-attacher-role Jan 14 23:11:20.879: INFO: deleting *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-health-monitor-controller-role Jan 14 23:11:21.025: INFO: deleting *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-provisioner-role Jan 14 23:11:21.172: INFO: deleting *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-resizer-role Jan 14 23:11:21.319: INFO: deleting *v1.RoleBinding: provisioning-9420-965/csi-hostpathplugin-snapshotter-role Jan 14 23:11:21.465: INFO: deleting *v1.StatefulSet: 
provisioning-9420-965/csi-hostpathplugin Jan 14 23:11:21.611: INFO: deleting *v1.ClusterRoleBinding: psp-csi-hostpath-role-provisioning-9420 STEP: deleting the driver namespace: provisioning-9420-965 01/14/23 23:11:21.755 STEP: Waiting for namespaces [provisioning-9420-965] to vanish 01/14/23 23:11:21.941 [AfterEach] [Testpattern: Dynamic PV (default fs)] subPath test/e2e/framework/framework.go:187 Jan 14 23:11:28.085: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:11:28.230: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:30.374: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:32.377: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:34.373: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:36.373: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:38.373: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:40.374: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:42.373: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:44.447: INFO: Condition Ready of node i-05871d1e8f8f620dd is false instead of true. Reason: KubeletNotReady, message: node is shutting down Jan 14 23:11:44.447: INFO: Condition Ready of node i-0dcc151940a349dcc is true, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:46.388: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
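The "Condition Ready of node ... is false, but Node is tainted by NodeController" lines above come from the framework's post-test node-readiness check: a node only counts as ready once its Ready condition is true and the NodeController's not-ready taints are gone. A minimal sketch of that decision (illustrative only, not the actual e2e framework code; the `taint` type and `nodeIsReady` helper are assumptions):

```go
package main

import "fmt"

// taint is a simplified stand-in for a Kubernetes node taint.
type taint struct {
	Key    string
	Effect string
}

// nodeIsReady mirrors the check behind the log lines above: a node is only
// counted as ready when its Ready condition is true AND it carries no
// node.kubernetes.io/not-ready taint set by the NodeController.
func nodeIsReady(readyCondition bool, taints []taint) bool {
	if !readyCondition {
		return false
	}
	for _, t := range taints {
		if t.Key == "node.kubernetes.io/not-ready" {
			return false
		}
	}
	return true
}

func main() {
	taints := []taint{
		{Key: "node.kubernetes.io/not-ready", Effect: "NoSchedule"},
		{Key: "node.kubernetes.io/not-ready", Effect: "NoExecute"},
	}
	// Matches the 23:11:44.447 entry: Ready condition became true, but the
	// NodeController taints were still present, so the node still failed the check.
	fmt.Println(nodeIsReady(true, taints)) // false
	fmt.Println(nodeIsReady(true, nil))    // true
}
```

This explains why the wait loop kept logging "Failure" even after the Ready condition flipped to true at 23:11:44.447: the taints had not yet been removed.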
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sCSI\sVolumes\s\[Driver\:\scsi\-hostpath\]\s\[Testpattern\:\sGeneric\sEphemeral\-volume\s\(default\sfs\)\s\(immediate\-binding\)\]\sephemeral\sshould\ssupport\stwo\spods\swhich\shave\sthe\ssame\svolume\sdefinition$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000ee1340) test/e2e/framework/framework.go:244 +0x7bf from junit_01.xml
{"msg":"FAILED [sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral should support two pods which have the same volume definition","completed":1,"skipped":25,"failed":2,"failures":["[sig-storage] PersistentVolumes-expansion loopback local block volume should support online expansion on node","[sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral should support two pods which have the same volume definition"]} [BeforeEach] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral test/e2e/storage/framework/testsuite.go:51 [BeforeEach] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:45.653 Jan 14 23:11:45.653: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename ephemeral 01/14/23 23:11:45.654 Jan 14 23:11:45.805: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.959: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.958: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.961: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: 
connection refused Jan 14 23:11:53.960: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:55.958: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:57.958: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:59.973: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:01.958: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:03.959: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:05.959: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:07.957: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:09.960: INFO: Unexpected error while creating namespace: Post 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:11.958: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:13.958: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.226: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.380: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.380: INFO: Unexpected error: <*errors.errorString | 0xc0001eb920>: { s: "timed out waiting for the condition", } Jan 14 23:12:31.380: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000ee1340) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [Testpattern: Generic Ephemeral-volume (default fs) (immediate-binding)] ephemeral test/e2e/framework/framework.go:187 Jan 14 23:12:31.380: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:31.534: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sCSI\sVolumes\s\[Driver\:\scsi\-hostpath\]\s\[Testpattern\:\sGeneric\sEphemeral\-volume\s\(default\sfs\)\s\(late\-binding\)\]\sephemeral\sshould\screate\sread\-only\sinline\sephemeral\svolume$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000db1340) test/e2e/framework/framework.go:244 +0x7bf from junit_01.xml
{"msg":"FAILED [sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral should create read-only inline ephemeral volume","completed":1,"skipped":15,"failed":2,"failures":["[sig-storage] In-tree Volumes [Driver: emptydir] [Testpattern: Inline-volume (default fs)] volumes should store data","[sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral should create read-only inline ephemeral volume"]} [BeforeEach] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral test/e2e/storage/framework/testsuite.go:51 [BeforeEach] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:46.286 Jan 14 23:11:46.286: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename ephemeral 01/14/23 23:11:46.287 Jan 14 23:11:46.441: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.592: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.593: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:52.595: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:54.592: INFO: 
Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:56.594: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:58.594: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:00.598: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:02.594: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:04.596: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:06.594: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:08.608: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:10.596: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": 
dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:12.593: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:14.626: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.993: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.145: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:32.145: INFO: Unexpected error: <*errors.errorString | 0xc00016b920>: { s: "timed out waiting for the condition", } Jan 14 23:12:32.145: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000db1340) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [Testpattern: Generic Ephemeral-volume (default fs) (late-binding)] ephemeral test/e2e/framework/framework.go:187 Jan 14 23:12:32.145: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:32.299: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sCSI\smock\svolume\sCSI\sVolume\sexpansion\sshould\sexpand\svolume\sby\srestarting\spod\sif\sattach\=on\,\snodeExpansion\=on$'
test/e2e/storage/csi_mock_volume.go:690 k8s.io/kubernetes/test/e2e/storage.glob..func2.10.1() test/e2e/storage/csi_mock_volume.go:690 +0x32a from junit_01.xml
{"msg":"FAILED [sig-storage] CSI mock volume CSI Volume expansion should expand volume by restarting pod if attach=on, nodeExpansion=on","completed":0,"skipped":1,"failed":1,"failures":["[sig-storage] CSI mock volume CSI Volume expansion should expand volume by restarting pod if attach=on, nodeExpansion=on"]} [BeforeEach] [sig-storage] CSI mock volume test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:10:07.452 Jan 14 23:10:07.453: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename csi-mock-volumes 01/14/23 23:10:07.454 STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:07.884 STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:08.166 [It] should expand volume by restarting pod if attach=on, nodeExpansion=on test/e2e/storage/csi_mock_volume.go:668 STEP: Building a driver namespace object, basename csi-mock-volumes-5501 01/14/23 23:10:08.454 STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:08.91 STEP: deploying csi mock driver 01/14/23 23:10:09.201 Jan 14 23:10:10.048: INFO: creating *v1.ServiceAccount: csi-mock-volumes-5501-6615/csi-attacher Jan 14 23:10:10.192: INFO: creating *v1.ClusterRole: external-attacher-runner-csi-mock-volumes-5501 Jan 14 23:10:10.192: INFO: Define cluster role external-attacher-runner-csi-mock-volumes-5501 Jan 14 23:10:10.337: INFO: creating *v1.ClusterRoleBinding: csi-attacher-role-csi-mock-volumes-5501 Jan 14 23:10:10.510: INFO: creating *v1.Role: csi-mock-volumes-5501-6615/external-attacher-cfg-csi-mock-volumes-5501 Jan 14 23:10:10.671: INFO: creating *v1.RoleBinding: csi-mock-volumes-5501-6615/csi-attacher-role-cfg Jan 14 23:10:10.814: INFO: creating *v1.ServiceAccount: 
csi-mock-volumes-5501-6615/csi-provisioner Jan 14 23:10:10.959: INFO: creating *v1.ClusterRole: external-provisioner-runner-csi-mock-volumes-5501 Jan 14 23:10:10.959: INFO: Define cluster role external-provisioner-runner-csi-mock-volumes-5501 Jan 14 23:10:11.103: INFO: creating *v1.ClusterRoleBinding: csi-provisioner-role-csi-mock-volumes-5501 Jan 14 23:10:11.247: INFO: creating *v1.Role: csi-mock-volumes-5501-6615/external-provisioner-cfg-csi-mock-volumes-5501 Jan 14 23:10:11.390: INFO: creating *v1.RoleBinding: csi-mock-volumes-5501-6615/csi-provisioner-role-cfg Jan 14 23:10:11.533: INFO: creating *v1.ServiceAccount: csi-mock-volumes-5501-6615/csi-resizer Jan 14 23:10:11.676: INFO: creating *v1.ClusterRole: external-resizer-runner-csi-mock-volumes-5501 Jan 14 23:10:11.676: INFO: Define cluster role external-resizer-runner-csi-mock-volumes-5501 Jan 14 23:10:11.819: INFO: creating *v1.ClusterRoleBinding: csi-resizer-role-csi-mock-volumes-5501 Jan 14 23:10:11.962: INFO: creating *v1.Role: csi-mock-volumes-5501-6615/external-resizer-cfg-csi-mock-volumes-5501 Jan 14 23:10:12.105: INFO: creating *v1.RoleBinding: csi-mock-volumes-5501-6615/csi-resizer-role-cfg Jan 14 23:10:12.249: INFO: creating *v1.ServiceAccount: csi-mock-volumes-5501-6615/csi-snapshotter Jan 14 23:10:12.392: INFO: creating *v1.ClusterRole: external-snapshotter-runner-csi-mock-volumes-5501 Jan 14 23:10:12.392: INFO: Define cluster role external-snapshotter-runner-csi-mock-volumes-5501 Jan 14 23:10:12.537: INFO: creating *v1.ClusterRoleBinding: csi-snapshotter-role-csi-mock-volumes-5501 Jan 14 23:10:12.680: INFO: creating *v1.Role: csi-mock-volumes-5501-6615/external-snapshotter-leaderelection-csi-mock-volumes-5501 Jan 14 23:10:12.823: INFO: creating *v1.RoleBinding: csi-mock-volumes-5501-6615/external-snapshotter-leaderelection Jan 14 23:10:12.966: INFO: creating *v1.ServiceAccount: csi-mock-volumes-5501-6615/csi-mock Jan 14 23:10:13.109: INFO: creating *v1.ClusterRoleBinding: 
csi-controller-attacher-role-csi-mock-volumes-5501 Jan 14 23:10:13.252: INFO: creating *v1.ClusterRoleBinding: csi-controller-provisioner-role-csi-mock-volumes-5501 Jan 14 23:10:13.398: INFO: creating *v1.ClusterRoleBinding: csi-controller-cluster-driver-registrar-role-csi-mock-volumes-5501 Jan 14 23:10:13.541: INFO: creating *v1.ClusterRoleBinding: psp-csi-controller-driver-registrar-role-csi-mock-volumes-5501 Jan 14 23:10:13.684: INFO: creating *v1.ClusterRoleBinding: csi-controller-resizer-role-csi-mock-volumes-5501 Jan 14 23:10:13.827: INFO: creating *v1.ClusterRoleBinding: csi-controller-snapshotter-role-csi-mock-volumes-5501 Jan 14 23:10:13.972: INFO: creating *v1.StorageClass: csi-mock-sc-csi-mock-volumes-5501 Jan 14 23:10:14.115: INFO: creating *v1.StatefulSet: csi-mock-volumes-5501-6615/csi-mockplugin Jan 14 23:10:14.261: INFO: creating *v1.StatefulSet: csi-mock-volumes-5501-6615/csi-mockplugin-attacher Jan 14 23:10:14.423: INFO: creating *v1.StatefulSet: csi-mock-volumes-5501-6615/csi-mockplugin-resizer Jan 14 23:10:14.566: INFO: waiting for CSIDriver csi-mock-csi-mock-volumes-5501 to register on node i-07f0d0bc50c0f4aa8 STEP: Creating pod 01/14/23 23:10:57.133 Jan 14 23:10:57.279: INFO: Warning: Making PVC: VolumeMode specified as invalid empty string, treating as nil Jan 14 23:10:57.424: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-2xkhm] to have phase Bound Jan 14 23:10:57.567: INFO: PersistentVolumeClaim pvc-2xkhm found and phase=Bound (142.808518ms) Jan 14 23:10:57.997: INFO: Waiting up to 5m0s for pod "pvc-volume-tester-t99pk" in namespace "csi-mock-volumes-5501" to be "running" Jan 14 23:10:58.139: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 142.04235ms Jan 14 23:11:00.285: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. 
Elapsed: 2.288176688s Jan 14 23:11:02.285: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 4.287827819s Jan 14 23:11:04.283: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 6.286051665s Jan 14 23:11:06.290: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 8.293617409s Jan 14 23:11:08.284: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 10.28718014s Jan 14 23:11:10.293: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 12.295718987s Jan 14 23:11:12.282: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 14.285463465s Jan 14 23:11:14.293: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 16.296203794s Jan 14 23:11:16.282: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 18.285127552s Jan 14 23:11:18.282: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 20.285303793s Jan 14 23:11:20.283: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 22.285982648s Jan 14 23:11:22.284: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 24.287049856s Jan 14 23:11:24.282: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 26.285613806s Jan 14 23:11:26.282: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 28.285122338s Jan 14 23:11:28.283: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 30.286179662s Jan 14 23:11:30.284: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. 
Elapsed: 32.287039759s Jan 14 23:11:32.283: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 34.286541051s Jan 14 23:11:34.281: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 36.284538968s Jan 14 23:11:36.282: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 38.284880902s Jan 14 23:11:38.283: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 40.28576938s Jan 14 23:11:40.283: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 42.285688971s Jan 14 23:11:42.281: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 44.284593093s Jan 14 23:11:44.298: INFO: Pod "pvc-volume-tester-t99pk": Phase="Pending", Reason="", readiness=false. Elapsed: 46.301367553s Jan 14 23:11:46.295: INFO: Encountered non-retryable error while getting pod csi-mock-volumes-5501/pvc-volume-tester-t99pk: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/pods/pvc-volume-tester-t99pk": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:46.295: INFO: Unexpected error: Failed to start pod1: error while waiting for pod csi-mock-volumes-5501/pvc-volume-tester-t99pk to be running: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/pods/pvc-volume-tester-t99pk": dial tcp 52.67.139.60:443: connect: connection refused: <*fmt.wrapError | 0xc002de2da0>: { msg: "error while waiting for pod csi-mock-volumes-5501/pvc-volume-tester-t99pk to be running: Get \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/pods/pvc-volume-tester-t99pk\": dial tcp 52.67.139.60:443: connect: connection refused", err: <*url.Error | 0xc0023f0cc0>{ Op: "Get", 
URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/pods/pvc-volume-tester-t99pk", Err: <*net.OpError | 0xc00237eb90>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc002427590>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc002de2d60>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }, } Jan 14 23:11:46.295: FAIL: Failed to start pod1: error while waiting for pod csi-mock-volumes-5501/pvc-volume-tester-t99pk to be running: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/pods/pvc-volume-tester-t99pk": dial tcp 52.67.139.60:443: connect: connection refused: error while waiting for pod csi-mock-volumes-5501/pvc-volume-tester-t99pk to be running: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/pods/pvc-volume-tester-t99pk": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/storage.glob..func2.10.1() test/e2e/storage/csi_mock_volume.go:690 +0x32a STEP: Deleting pod pvc-volume-tester-t99pk 01/14/23 23:11:46.295 Jan 14 23:11:46.295: INFO: Deleting pod "pvc-volume-tester-t99pk" in namespace "csi-mock-volumes-5501" STEP: Deleting claim pvc-2xkhm 01/14/23 23:11:46.449 STEP: Deleting storageclass csi-mock-volumes-5501-sc9k92l 01/14/23 23:11:46.602 STEP: Cleaning up resources 01/14/23 23:11:46.753 STEP: deleting the test namespace: csi-mock-volumes-5501 01/14/23 23:11:46.753 Jan 14 23:11:46.907: INFO: error deleting namespace csi-mock-volumes-5501: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused STEP: uninstalling 
csi mock driver 01/14/23 23:11:46.907 Jan 14 23:11:46.907: INFO: deleting *v1.ServiceAccount: csi-mock-volumes-5501-6615/csi-attacher Jan 14 23:11:47.059: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501-6615/serviceaccounts/csi-attacher": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.059: INFO: deleting *v1.ClusterRole: external-attacher-runner-csi-mock-volumes-5501 Jan 14 23:11:47.213: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterroles/external-attacher-runner-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.213: INFO: deleting *v1.ClusterRoleBinding: csi-attacher-role-csi-mock-volumes-5501 Jan 14 23:11:47.369: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-attacher-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.369: INFO: deleting *v1.Role: csi-mock-volumes-5501-6615/external-attacher-cfg-csi-mock-volumes-5501 Jan 14 23:11:47.523: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/namespaces/csi-mock-volumes-5501-6615/roles/external-attacher-cfg-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.523: INFO: deleting *v1.RoleBinding: csi-mock-volumes-5501-6615/csi-attacher-role-cfg Jan 14 23:11:47.676: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/namespaces/csi-mock-volumes-5501-6615/rolebindings/csi-attacher-role-cfg": dial tcp 52.67.139.60:443: connect: connection 
refused Jan 14 23:11:47.676: INFO: deleting *v1.ServiceAccount: csi-mock-volumes-5501-6615/csi-provisioner Jan 14 23:11:47.829: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501-6615/serviceaccounts/csi-provisioner": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.829: INFO: deleting *v1.ClusterRole: external-provisioner-runner-csi-mock-volumes-5501 Jan 14 23:11:47.981: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterroles/external-provisioner-runner-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.981: INFO: deleting *v1.ClusterRoleBinding: csi-provisioner-role-csi-mock-volumes-5501 Jan 14 23:11:48.138: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-provisioner-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.138: INFO: deleting *v1.Role: csi-mock-volumes-5501-6615/external-provisioner-cfg-csi-mock-volumes-5501 Jan 14 23:11:48.289: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/namespaces/csi-mock-volumes-5501-6615/roles/external-provisioner-cfg-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.289: INFO: deleting *v1.RoleBinding: csi-mock-volumes-5501-6615/csi-provisioner-role-cfg Jan 14 23:11:48.443: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/namespaces/csi-mock-volumes-5501-6615/rolebindings/csi-provisioner-role-cfg": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 
23:11:48.443: INFO: deleting *v1.ServiceAccount: csi-mock-volumes-5501-6615/csi-resizer Jan 14 23:11:48.594: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501-6615/serviceaccounts/csi-resizer": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.594: INFO: deleting *v1.ClusterRole: external-resizer-runner-csi-mock-volumes-5501 Jan 14 23:11:48.746: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterroles/external-resizer-runner-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.746: INFO: deleting *v1.ClusterRoleBinding: csi-resizer-role-csi-mock-volumes-5501 Jan 14 23:11:48.897: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-resizer-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:48.897: INFO: deleting *v1.Role: csi-mock-volumes-5501-6615/external-resizer-cfg-csi-mock-volumes-5501 Jan 14 23:11:49.051: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/namespaces/csi-mock-volumes-5501-6615/roles/external-resizer-cfg-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.051: INFO: deleting *v1.RoleBinding: csi-mock-volumes-5501-6615/csi-resizer-role-cfg Jan 14 23:11:49.202: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/namespaces/csi-mock-volumes-5501-6615/rolebindings/csi-resizer-role-cfg": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.202: INFO: deleting *v1.ServiceAccount: 
csi-mock-volumes-5501-6615/csi-snapshotter Jan 14 23:11:49.357: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501-6615/serviceaccounts/csi-snapshotter": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.357: INFO: deleting *v1.ClusterRole: external-snapshotter-runner-csi-mock-volumes-5501 Jan 14 23:11:49.508: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterroles/external-snapshotter-runner-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.508: INFO: deleting *v1.ClusterRoleBinding: csi-snapshotter-role-csi-mock-volumes-5501 Jan 14 23:11:49.662: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-snapshotter-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.662: INFO: deleting *v1.Role: csi-mock-volumes-5501-6615/external-snapshotter-leaderelection-csi-mock-volumes-5501 Jan 14 23:11:49.816: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/namespaces/csi-mock-volumes-5501-6615/roles/external-snapshotter-leaderelection-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.816: INFO: deleting *v1.RoleBinding: csi-mock-volumes-5501-6615/external-snapshotter-leaderelection Jan 14 23:11:49.966: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/namespaces/csi-mock-volumes-5501-6615/rolebindings/external-snapshotter-leaderelection": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.966: INFO: 
deleting *v1.ServiceAccount: csi-mock-volumes-5501-6615/csi-mock Jan 14 23:11:50.120: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501-6615/serviceaccounts/csi-mock": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.120: INFO: deleting *v1.ClusterRoleBinding: csi-controller-attacher-role-csi-mock-volumes-5501 Jan 14 23:11:50.274: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-controller-attacher-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.274: INFO: deleting *v1.ClusterRoleBinding: csi-controller-provisioner-role-csi-mock-volumes-5501 Jan 14 23:11:50.428: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-controller-provisioner-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.428: INFO: deleting *v1.ClusterRoleBinding: csi-controller-cluster-driver-registrar-role-csi-mock-volumes-5501 Jan 14 23:11:50.578: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-controller-cluster-driver-registrar-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.578: INFO: deleting *v1.ClusterRoleBinding: psp-csi-controller-driver-registrar-role-csi-mock-volumes-5501 Jan 14 23:11:50.730: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/psp-csi-controller-driver-registrar-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection 
refused Jan 14 23:11:50.730: INFO: deleting *v1.ClusterRoleBinding: csi-controller-resizer-role-csi-mock-volumes-5501 Jan 14 23:11:50.884: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-controller-resizer-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:50.884: INFO: deleting *v1.ClusterRoleBinding: csi-controller-snapshotter-role-csi-mock-volumes-5501 Jan 14 23:11:51.036: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/csi-controller-snapshotter-role-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.036: INFO: deleting *v1.StorageClass: csi-mock-sc-csi-mock-volumes-5501 Jan 14 23:11:51.188: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/storage.k8s.io/v1/storageclasses/csi-mock-sc-csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.188: INFO: deleting *v1.StatefulSet: csi-mock-volumes-5501-6615/csi-mockplugin Jan 14 23:11:51.339: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/apps/v1/namespaces/csi-mock-volumes-5501-6615/statefulsets/csi-mockplugin": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.339: INFO: deleting *v1.StatefulSet: csi-mock-volumes-5501-6615/csi-mockplugin-attacher Jan 14 23:11:51.491: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/apps/v1/namespaces/csi-mock-volumes-5501-6615/statefulsets/csi-mockplugin-attacher": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.491: INFO: deleting *v1.StatefulSet: 
csi-mock-volumes-5501-6615/csi-mockplugin-resizer Jan 14 23:11:51.650: INFO: deleting failed: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/apis/apps/v1/namespaces/csi-mock-volumes-5501-6615/statefulsets/csi-mockplugin-resizer": dial tcp 52.67.139.60:443: connect: connection refused STEP: deleting the driver namespace: csi-mock-volumes-5501-6615 01/14/23 23:11:51.65 Jan 14 23:11:51.809: INFO: error deleting namespace csi-mock-volumes-5501-6615: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501-6615": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.809: INFO: Unexpected error: while cleaning up after test: <errors.aggregate | len:1, cap:1>: [ <*errors.errorString | 0xc000c2a300>{ s: "pod Delete API error: Delete \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/pods/pvc-volume-tester-t99pk\": dial tcp 52.67.139.60:443: connect: connection refused", }, ] Jan 14 23:11:51.810: FAIL: while cleaning up after test: pod Delete API error: Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/pods/pvc-volume-tester-t99pk": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/storage.glob..func2.5() test/e2e/storage/csi_mock_volume.go:328 +0xad1 panic({0x6ea2520, 0xc0023be800}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d panic({0x6ea4740, 0xc0023e2f50}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc00233b340, 0x283}, {0xc0017a1c28?, 0x735bfcc?, 0xc0017a1c48?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197 
k8s.io/kubernetes/test/e2e/framework.Fail({0xc0019eec80, 0x26e}, {0xc0017a1cc0?, 0xc00373b7c0?, 0xc0017a1ce8?}) test/e2e/framework/log.go:63 +0x145 k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c2aa60, 0xc002de2da0}, {0xc002de2dc0?, 0xc0019abba8?, 0x1?}) test/e2e/framework/expect.go:76 +0x267 k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) test/e2e/framework/expect.go:43 k8s.io/kubernetes/test/e2e/storage.glob..func2.10.1() test/e2e/storage/csi_mock_volume.go:690 +0x32a [AfterEach] [sig-storage] CSI mock volume test/e2e/framework/framework.go:187 STEP: Collecting events from namespace "csi-mock-volumes-5501". 01/14/23 23:11:51.81 Jan 14 23:11:51.963: INFO: Unexpected error: failed to list events in namespace "csi-mock-volumes-5501": <*url.Error | 0xc0037b0fc0>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/events", Err: <*net.OpError | 0xc0037ca690>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc0037b0f90>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc003608400>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Jan 14 23:11:51.963: FAIL: failed to list events in namespace "csi-mock-volumes-5501": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501/events": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc0027fd590, {0xc00362e5a0, 0x15}) test/e2e/framework/util.go:901 +0x191 k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc0007c5200}, {0xc00362e5a0, 0x15}) test/e2e/framework/util.go:919 +0x8d k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc000b35760, 0x2?) 
test/e2e/framework/framework.go:181 +0x8b k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000b35760) test/e2e/framework/framework.go:435 +0x21d STEP: Destroying namespace "csi-mock-volumes-5501" for this suite. 01/14/23 23:11:51.963 STEP: Destroying namespace "csi-mock-volumes-5501-6615" for this suite. 01/14/23 23:11:52.116 Jan 14 23:11:52.269: FAIL: Couldn't delete ns: "csi-mock-volumes-5501": Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501": dial tcp 52.67.139.60:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501", Err:(*net.OpError)(0xc002b304b0)}),Couldn't delete ns: "csi-mock-volumes-5501-6615": Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501-6615": dial tcp 52.67.139.60:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/csi-mock-volumes-5501-6615", Err:(*net.OpError)(0xc002ca2690)}) Full Stack Trace panic({0x6ea2520, 0xc0037d4b80}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d panic({0x6ea4740, 0xc002dc79d0}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc0037dc360, 0x10d}, {0xc0027fd048?, 0x735bfcc?, 0xc0027fd068?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197 k8s.io/kubernetes/test/e2e/framework.Fail({0xc0037ad800, 0xf8}, {0xc0027fd0e0?, 0xc0037c5b00?, 0xc0027fd108?}) test/e2e/framework/log.go:63 +0x145 k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c34da0, 0xc0037b0fc0}, {0xc003608440?, 0x0?, 
0x0?}) test/e2e/framework/expect.go:76 +0x267 k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...) test/e2e/framework/expect.go:43 k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc0027fd590, {0xc00362e5a0, 0x15}) test/e2e/framework/util.go:901 +0x191 k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc0007c5200}, {0xc00362e5a0, 0x15}) test/e2e/framework/util.go:919 +0x8d k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc000b35760, 0x2?) test/e2e/framework/framework.go:181 +0x8b k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000b35760) test/e2e/framework/framework.go:435 +0x21d
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sCSI\smock\svolume\sCSI\sonline\svolume\sexpansion\sshould\sexpand\svolume\swithout\srestarting\spod\sif\sattach\=off\,\snodeExpansion\=on$'
test/e2e/framework/framework.go:244 k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000a329a0) test/e2e/framework/framework.go:244 +0x7bf from junit_01.xml
{"msg":"FAILED [sig-storage] CSI mock volume CSI online volume expansion should expand volume without restarting pod if attach=off, nodeExpansion=on","completed":1,"skipped":25,"failed":2,"failures":["[sig-storage] In-tree Volumes [Driver: local][LocalVolumeType: blockfs] [Testpattern: Pre-provisioned PV (default fs)] subPath should support readOnly file specified in the volumeMount [LinuxOnly]","[sig-storage] CSI mock volume CSI online volume expansion should expand volume without restarting pod if attach=off, nodeExpansion=on"]} [BeforeEach] [sig-storage] CSI mock volume test/e2e/framework/framework.go:186 STEP: Creating a kubernetes client 01/14/23 23:11:45.281 Jan 14 23:11:45.281: INFO: >>> kubeConfig: /root/.kube/config STEP: Building a namespace api object, basename csi-mock-volumes 01/14/23 23:11:45.282 Jan 14 23:11:45.435: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:47.590: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:49.590: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:51.588: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:53.590: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: 
connect: connection refused Jan 14 23:11:55.590: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:57.589: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:11:59.588: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:01.592: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:03.589: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:05.592: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:07.589: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:09.589: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:11.588: INFO: Unexpected error while creating namespace: Post 
"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:13.587: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:30.970: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.146: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused Jan 14 23:12:31.146: INFO: Unexpected error: <*errors.errorString | 0xc000215c60>: { s: "timed out waiting for the condition", } Jan 14 23:12:31.146: FAIL: timed out waiting for the condition Full Stack Trace k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000a329a0) test/e2e/framework/framework.go:244 +0x7bf [AfterEach] [sig-storage] CSI mock volume test/e2e/framework/framework.go:187 Jan 14 23:12:31.146: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready Jan 14 23:12:31.300: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sDownward\sAPI\svolume\sshould\sset\sDefaultMode\son\sfiles\s\[LinuxOnly\]\s\[NodeConformance\]\s\[Conformance\]$'
test/e2e/framework/framework.go:244
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000ab8580)
  test/e2e/framework/framework.go:244 +0x7bf
from junit_01.xml
{"msg":"FAILED [sig-storage] Downward API volume should set DefaultMode on files [LinuxOnly] [NodeConformance] [Conformance]","completed":1,"skipped":2,"failed":2,"failures":["[sig-storage] CSI Volumes [Driver: csi-hostpath] [Testpattern: Dynamic PV (default fs)] subPath should support readOnly file specified in the volumeMount [LinuxOnly]","[sig-storage] Downward API volume should set DefaultMode on files [LinuxOnly] [NodeConformance] [Conformance]"]}

[BeforeEach] [sig-storage] Downward API volume
  test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:11:46.391
Jan 14 23:11:46.391: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename downward-api 01/14/23 23:11:46.392
Jan 14 23:11:46.544: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
[... the same "connection refused" error repeats roughly every 2s until 23:12:32.151 ...]
Jan 14 23:12:32.151: INFO: Unexpected error: <*errors.errorString | 0xc00016b920>: { s: "timed out waiting for the condition", }
Jan 14 23:12:32.151: FAIL: timed out waiting for the condition
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000ab8580)
  test/e2e/framework/framework.go:244 +0x7bf
[AfterEach] [sig-storage] Downward API volume
  test/e2e/framework/framework.go:187
Jan 14 23:12:32.151: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:12:32.303: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sEmptyDir\svolumes\sshould\ssupport\s\(non\-root\,0644\,tmpfs\)\s\[LinuxOnly\]\s\[NodeConformance\]\s\[Conformance\]$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605
from junit_01.xml
{"msg":"FAILED [sig-storage] EmptyDir volumes should support (non-root,0644,tmpfs) [LinuxOnly] [NodeConformance] [Conformance]","completed":3,"skipped":40,"failed":1,"failures":["[sig-storage] EmptyDir volumes should support (non-root,0644,tmpfs) [LinuxOnly] [NodeConformance] [Conformance]"]}

[BeforeEach] [sig-storage] EmptyDir volumes
  test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:37.601
Jan 14 23:10:37.601: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename emptydir 01/14/23 23:10:37.602
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:38.029
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:38.311
[It] should support (non-root,0644,tmpfs) [LinuxOnly] [NodeConformance] [Conformance]
  test/e2e/common/storage/empty_dir.go:126
STEP: Creating a pod to test emptydir 0644 on tmpfs 01/14/23 23:10:38.599
Jan 14 23:10:38.755: INFO: Waiting up to 5m0s for pod "pod-118a093b-c0f6-440e-aded-350c1861da84" in namespace "emptydir-9024" to be "Succeeded or Failed"
Jan 14 23:10:38.897: INFO: Pod "pod-118a093b-c0f6-440e-aded-350c1861da84": Phase="Pending", Reason="", readiness=false. Elapsed: 142.161543ms
[... "Pending" polls repeat roughly every 2s until 23:10:45.040 ...]
Jan 14 23:10:47.039: INFO: Pod "pod-118a093b-c0f6-440e-aded-350c1861da84": Phase="Succeeded", Reason="", readiness=false. Elapsed: 8.284157745s
STEP: Saw pod success 01/14/23 23:10:47.039
Jan 14 23:10:47.039: INFO: Pod "pod-118a093b-c0f6-440e-aded-350c1861da84" satisfied condition "Succeeded or Failed"
Jan 14 23:10:47.182: INFO: Trying to get logs from node i-066162bc3bb041a75 pod pod-118a093b-c0f6-440e-aded-350c1861da84 container test-container: <nil>
STEP: delete the pod 01/14/23 23:10:47.327
Jan 14 23:10:47.491: INFO: Waiting for pod pod-118a093b-c0f6-440e-aded-350c1861da84 to disappear
Jan 14 23:10:47.636: INFO: Pod pod-118a093b-c0f6-440e-aded-350c1861da84 no longer exists
[AfterEach] [sig-storage] EmptyDir volumes
  test/e2e/framework/framework.go:187
Jan 14 23:10:47.636: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:10:47.782: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure
[... the same node-not-ready message repeats roughly every 2s until 23:11:43.926 ...]
Jan 14 23:11:45.944: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
STEP: Destroying namespace "emptydir-9024" for this suite. 01/14/23 23:11:45.944
STEP: Collecting events from namespace "emptydir-9024". 01/14/23 23:11:46.099
Jan 14 23:11:46.255: INFO: Unexpected error: failed to list events in namespace "emptydir-9024": <*url.Error | 0xc0035ed1a0>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/emptydir-9024/events", Err: <*net.OpError | 0xc0034f07d0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc003a83e30>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc0034eb2c0>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, }
Jan 14 23:11:46.255: FAIL: failed to list events in namespace "emptydir-9024": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/emptydir-9024/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc002c86278, {0xc002961340, 0xd})
  test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc002560a80}, {0xc002961340, 0xd})
  test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1()
  test/e2e/framework/framework.go:402 +0x81d
panic({0x6ea2520, 0xc003bec580})
  /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
  test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc00016e540})
  /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc0021c4540, 0xd5}, {0xc002c875a8?, 0x735bfcc?, 0xc002c875d0?})
  test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc002560a80?}, {0xc002c87890?, 0x73672a3?, 0x8?})
  test/e2e/framework/log.go:51 +0x12c
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc000bb0c60)
  test/e2e/framework/framework.go:483 +0xb8a
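The stack trace for this entry shows the framework's failure path: ginkgowrapper.Fail raises a panic, and an enclosing deferred handler recovers it so that AfterEach cleanup still runs. A minimal self-contained sketch of that panic/recover pattern, with hypothetical names rather than the framework's actual code:

```go
package main

import "fmt"

// failure marks a panic raised deliberately by Fail, so the recover
// handler can tell it apart from unexpected runtime panics.
type failure struct{ msg string }

// Fail aborts the current test body via panic, as ginkgowrapper.Fail does.
func Fail(msg string) { panic(failure{msg}) }

// runTest executes body, converts a Fail panic into an error, and always
// runs cleanup afterwards, mirroring the BeforeEach/AfterEach flow above.
func runTest(body func(), cleanup func()) (err error) {
	defer cleanup() // runs last (defers are LIFO), after recovery below
	defer func() {
		if r := recover(); r != nil {
			if f, ok := r.(failure); ok {
				err = fmt.Errorf("FAIL: %s", f.msg)
				return
			}
			panic(r) // not a Fail: re-raise the original panic
		}
	}()
	body()
	return nil
}

func main() {
	err := runTest(
		func() { Fail("timed out waiting for the condition") },
		func() { fmt.Println("AfterEach: cleanup ran") },
	)
	fmt.Println(err)
}
```

The double `panic(...)` frames in the trace above are this mechanism at work: the AfterEach hook itself failed while handling the first Fail panic, producing a second one.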
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sEmptyDir\svolumes\swhen\sFSGroup\sis\sspecified\s\[LinuxOnly\]\s\[NodeFeature\:FSGroup\]\svolume\son\stmpfs\sshould\shave\sthe\scorrect\smode\susing\sFSGroup$'
test/e2e/framework/framework.go:244
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0009722c0)
  test/e2e/framework/framework.go:244 +0x7bf
from junit_01.xml
{"msg":"FAILED [sig-storage] EmptyDir volumes when FSGroup is specified [LinuxOnly] [NodeFeature:FSGroup] volume on tmpfs should have the correct mode using FSGroup","completed":0,"skipped":23,"failed":2,"failures":["[sig-cli] Kubectl client Simple pod should return command exit codes running a successful command","[sig-storage] EmptyDir volumes when FSGroup is specified [LinuxOnly] [NodeFeature:FSGroup] volume on tmpfs should have the correct mode using FSGroup"]}

[BeforeEach] [sig-storage] EmptyDir volumes
  test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:11:45.513
Jan 14 23:11:45.513: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename emptydir 01/14/23 23:11:45.514
Jan 14 23:11:45.667: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
[... the same "connection refused" error repeats roughly every 2s until 23:12:31.381 ...]
Jan 14 23:12:31.381: INFO: Unexpected error: <*errors.errorString | 0xc0001eb900>: { s: "timed out waiting for the condition", }
Jan 14 23:12:31.381: FAIL: timed out waiting for the condition
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc0009722c0)
  test/e2e/framework/framework.go:244 +0x7bf
[AfterEach] [sig-storage] EmptyDir volumes
  test/e2e/framework/framework.go:187
Jan 14 23:12:31.382: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:12:31.535: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
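Every failure above shares one root cause: the API server at 52.67.139.60:443 is refusing connections. When triaging a junit dump like this, it helps to collapse the thousands of timestamped retry lines into distinct error signatures. A small sketch (the timestamp format is assumed from the log above; this is not part of the test framework):

```go
package main

import (
	"fmt"
	"regexp"
	"sort"
)

// tsRe matches the leading "Jan 14 23:11:45.667: " timestamp prefix so
// that otherwise-identical log lines collapse to a single key.
var tsRe = regexp.MustCompile(`^Jan \d+ \d{2}:\d{2}:\d{2}\.\d+: `)

// countSignatures tallies log lines by their timestamp-stripped text.
func countSignatures(lines []string) map[string]int {
	counts := map[string]int{}
	for _, line := range lines {
		counts[tsRe.ReplaceAllString(line, "")]++
	}
	return counts
}

func main() {
	log := []string{
		`Jan 14 23:11:45.667: INFO: Unexpected error while creating namespace: dial tcp 52.67.139.60:443: connect: connection refused`,
		`Jan 14 23:11:47.817: INFO: Unexpected error while creating namespace: dial tcp 52.67.139.60:443: connect: connection refused`,
		`Jan 14 23:12:31.381: FAIL: timed out waiting for the condition`,
	}
	counts := countSignatures(log)
	sigs := make([]string, 0, len(counts))
	for s := range counts {
		sigs = append(sigs, s)
	}
	sort.Strings(sigs) // stable, readable ordering
	for _, s := range sigs {
		fmt.Printf("%d x %s\n", counts[s], s)
	}
}
```

Run against the full junit_01.xml text, a tally like this makes it obvious that dozens of distinct test names reduce to a handful of underlying errors.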
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sIn\-tree\sVolumes\s\[Driver\:\semptydir\]\s\[Testpattern\:\sInline\-volume\s\(default\sfs\)\]\svolumes\sshould\sstore\sdata$'
vendor/github.com/onsi/ginkgo/v2/internal/suite.go:605
from junit_01.xml
{"msg":"FAILED [sig-storage] In-tree Volumes [Driver: emptydir] [Testpattern: Inline-volume (default fs)] volumes should store data","completed":1,"skipped":8,"failed":1,"failures":["[sig-storage] In-tree Volumes [Driver: emptydir] [Testpattern: Inline-volume (default fs)] volumes should store data"]}

[BeforeEach] [Testpattern: Inline-volume (default fs)] volumes
  test/e2e/storage/framework/testsuite.go:51
[BeforeEach] [Testpattern: Inline-volume (default fs)] volumes
  test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:34.805
Jan 14 23:10:34.805: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename volume 01/14/23 23:10:34.806
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:35.234
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:35.517
[It] should store data
  test/e2e/storage/testsuites/volumes.go:161
Jan 14 23:10:35.800: INFO: In-tree plugin kubernetes.io/empty-dir is not migrated, not validating any metrics
Jan 14 23:10:35.800: INFO: Creating resource for inline volume
STEP: starting emptydir-injector 01/14/23 23:10:35.8
Jan 14 23:10:35.945: INFO: Waiting up to 5m0s for pod "emptydir-injector" in namespace "volume-4967" to be "running"
Jan 14 23:10:36.088: INFO: Pod "emptydir-injector": Phase="Pending", Reason="", readiness=false. Elapsed: 143.273958ms
[... "Pending" polls repeat roughly every 2s until 23:10:44.231 ...]
Jan 14 23:10:46.243: INFO: Pod "emptydir-injector": Phase="Running", Reason="", readiness=true. Elapsed: 10.298182495s
Jan 14 23:10:46.243: INFO: Pod "emptydir-injector" satisfied condition "running"
STEP: Writing text file contents in the container. 01/14/23 23:10:46.243
Jan 14 23:10:46.243: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=volume-4967 exec emptydir-injector --namespace=volume-4967 -- /bin/sh -c echo 'Hello from emptydir from namespace volume-4967' > /opt/0/index.html'
Jan 14 23:10:47.669: INFO: stderr: ""
Jan 14 23:10:47.670: INFO: stdout: ""
STEP: Checking that text file contents are perfect. 01/14/23 23:10:47.67
Jan 14 23:10:47.670: INFO: Running '/home/prow/go/src/k8s.io/kops/_rundir/d617ee44-945e-11ed-bd5d-72eebb4d772a/kubectl --server=https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=volume-4967 exec emptydir-injector --namespace=volume-4967 -- cat /opt/0/index.html'
Jan 14 23:10:49.300: INFO: stderr: ""
Jan 14 23:10:49.300: INFO: stdout: "Hello from emptydir from namespace volume-4967\n"
Jan 14 23:10:49.300: INFO: ExecWithOptions {Command:[/bin/sh -c test -d /opt/0] Namespace:volume-4967 PodName:emptydir-injector ContainerName:emptydir-injector Stdin:<nil> CaptureStdout:true CaptureStderr:true PreserveWhitespace:false Quiet:false}
Jan 14 23:10:49.300: INFO: >>> kubeConfig: /root/.kube/config
Jan 14 23:10:49.301: INFO: ExecWithOptions: Clientset creation
Jan 14 23:10:49.301: INFO: ExecWithOptions: execute(POST https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-4967/pods/emptydir-injector/exec?command=%2Fbin%2Fsh&command=-c&command=test+-d+%2Fopt%2F0&container=emptydir-injector&container=emptydir-injector&stderr=true&stdout=true)
Jan 14 23:10:50.271: INFO: ExecWithOptions {Command:[/bin/sh -c test -b /opt/0] Namespace:volume-4967 PodName:emptydir-injector ContainerName:emptydir-injector Stdin:<nil> CaptureStdout:true CaptureStderr:true PreserveWhitespace:false Quiet:false}
Jan 14 23:10:50.271: INFO: >>> kubeConfig: /root/.kube/config
Jan 14 23:10:50.272: INFO: ExecWithOptions: Clientset creation
Jan 14 23:10:50.272: INFO: ExecWithOptions: execute(POST https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-4967/pods/emptydir-injector/exec?command=%2Fbin%2Fsh&command=-c&command=test+-b+%2Fopt%2F0&container=emptydir-injector&container=emptydir-injector&stderr=true&stdout=true)
STEP: Deleting pod emptydir-injector in namespace volume-4967 01/14/23 23:10:51.196
Jan 14 23:10:51.355: INFO: Waiting for pod emptydir-injector to disappear
Jan 14 23:10:51.501: INFO: Pod emptydir-injector still exists
Jan 14 23:10:53.501: INFO: Waiting for pod emptydir-injector to disappear
Jan 14 23:10:53.643: INFO: Pod emptydir-injector still exists
Jan 14 23:10:55.501: INFO: Waiting for pod emptydir-injector to disappear
Jan 14 23:10:55.665: INFO: Pod emptydir-injector no longer exists
STEP: Skipping persistence check for non-persistent volume 01/14/23 23:10:55.665
STEP: cleaning the environment after emptydir 01/14/23 23:10:55.666
[AfterEach] [Testpattern: Inline-volume (default fs)] volumes
  test/e2e/framework/framework.go:187
Jan 14 23:10:55.666: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:10:55.818: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with
[{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:10:57.961: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:10:59.964: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:01.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:03.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:05.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:07.965: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:09.964: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:11.962: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:13.962: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:15.962: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:17.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:19.964: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:21.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:23.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:25.964: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:27.964: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:29.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:31.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:33.962: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:35.963: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. 
Failure Jan 14 23:11:37.962: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:39.968: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:41.962: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:43.962: INFO: Condition Ready of node i-0dcc151940a349dcc is false, but Node is tainted by NodeController with [{node.kubernetes.io/not-ready NoSchedule 2023-01-14 23:10:44 +0000 UTC} {node.kubernetes.io/not-ready NoExecute 2023-01-14 23:10:46 +0000 UTC}]. Failure Jan 14 23:11:45.973: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace �[1mSTEP:�[0m Destroying namespace "volume-4967" for this suite. �[38;5;243m01/14/23 23:11:45.973�[0m �[1mSTEP:�[0m Collecting events from namespace "volume-4967". 
�[38;5;243m01/14/23 23:11:46.127�[0m Jan 14 23:11:46.279: INFO: Unexpected error: failed to list events in namespace "volume-4967": <*url.Error | 0xc002a47fb0>: { Op: "Get", URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-4967/events", Err: <*net.OpError | 0xc0006234f0>{ Op: "dial", Net: "tcp", Source: nil, Addr: <*net.TCPAddr | 0xc0041ce0c0>{IP: "4C\x8b<", Port: 443, Zone: ""}, Err: <*os.SyscallError | 0xc002a77a00>{ Syscall: "connect", Err: <syscall.Errno>0x6f, }, }, } Jan 14 23:11:46.279: FAIL: failed to list events in namespace "volume-4967": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/volume-4967/events": dial tcp 52.67.139.60:443: connect: connection refused Full Stack Trace k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc002458278, {0xc0020b7ab0, 0xb}) test/e2e/framework/util.go:901 +0x191 k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc00041f380}, {0xc0020b7ab0, 0xb}) test/e2e/framework/util.go:919 +0x8d k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach.func1() test/e2e/framework/framework.go:402 +0x81d panic({0x6ea2520, 0xc00400e600}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1() test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d panic({0x6ea4740, 0xc00036e9a0}) /usr/local/go/src/runtime/panic.go:884 +0x212 k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc003456460, 0xd5}, {0xc0024595a8?, 0x735bfcc?, 0xc0024595d0?}) test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197 k8s.io/kubernetes/test/e2e/framework.Failf({0x742f53c?, 0xc00041f380?}, {0xc002459890?, 0x7362781?, 0x6?}) test/e2e/framework/log.go:51 +0x12c k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0010e8f20) test/e2e/framework/framework.go:483 +0xb8a
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sIn\-tree\sVolumes\s\[Driver\:\shostPathSymlink\]\s\[Testpattern\:\sInline\-volume\s\(default\sfs\)\]\ssubPath\sshould\ssupport\sexisting\ssingle\sfile\s\[LinuxOnly\]$'
test/e2e/storage/drivers/in_tree.go:977
k8s.io/kubernetes/test/e2e/storage/drivers.(*hostPathSymlinkVolume).DeleteVolume(0xc00386cea0)
    test/e2e/storage/drivers/in_tree.go:977 +0x2ff
k8s.io/kubernetes/test/e2e/storage/utils.TryFunc(0x33?)
    test/e2e/storage/utils/utils.go:714 +0x6d
k8s.io/kubernetes/test/e2e/storage/framework.(*VolumeResource).CleanupResource(0xc0006d0e40)
    test/e2e/storage/framework/volume_resource.go:231 +0xc89
k8s.io/kubernetes/test/e2e/storage/testsuites.(*subPathTestSuite).DefineTests.func2()
    test/e2e/storage/testsuites/subpath.go:178 +0x145
k8s.io/kubernetes/test/e2e/storage/testsuites.(*subPathTestSuite).DefineTests.func5()
    test/e2e/storage/testsuites/subpath.go:229 +0x1cb
from junit_01.xml
{"msg":"FAILED [sig-storage] In-tree Volumes [Driver: hostPathSymlink] [Testpattern: Inline-volume (default fs)] subPath should support existing single file [LinuxOnly]","completed":3,"skipped":32,"failed":1,"failures":["[sig-storage] In-tree Volumes [Driver: hostPathSymlink] [Testpattern: Inline-volume (default fs)] subPath should support existing single file [LinuxOnly]"]}
[BeforeEach] [Testpattern: Inline-volume (default fs)] subPath
    test/e2e/storage/framework/testsuite.go:51
[BeforeEach] [Testpattern: Inline-volume (default fs)] subPath
    test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:10:25.188
Jan 14 23:10:25.188: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename provisioning 01/14/23 23:10:25.189
STEP: Waiting for a default service account to be provisioned in namespace 01/14/23 23:10:25.617
STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/14/23 23:10:25.899
[It] should support existing single file [LinuxOnly]
    test/e2e/storage/testsuites/subpath.go:220
Jan 14 23:10:26.180: INFO: In-tree plugin kubernetes.io/host-path is not migrated, not validating any metrics
Jan 14 23:10:26.468: INFO: Waiting up to 5m0s for pod "hostpath-symlink-prep-provisioning-5720" in namespace "provisioning-5720" to be "Succeeded or Failed"
Jan 14 23:10:26.610: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 141.659387ms
Jan 14 23:10:28.752: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 2.283866275s
Jan 14 23:10:30.754: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 4.285855806s
Jan 14 23:10:32.752: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 6.284523871s
Jan 14 23:10:34.752: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 8.28367801s
Jan 14 23:10:36.752: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 10.28410914s
Jan 14 23:10:38.755: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 12.286765014s
Jan 14 23:10:40.752: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 14.284061956s
Jan 14 23:10:42.755: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 16.287152592s
Jan 14 23:10:44.752: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Succeeded", Reason="", readiness=false. Elapsed: 18.284083921s
STEP: Saw pod success 01/14/23 23:10:44.752
Jan 14 23:10:44.752: INFO: Pod "hostpath-symlink-prep-provisioning-5720" satisfied condition "Succeeded or Failed"
Jan 14 23:10:44.752: INFO: Deleting pod "hostpath-symlink-prep-provisioning-5720" in namespace "provisioning-5720"
Jan 14 23:10:44.901: INFO: Wait up to 5m0s for pod "hostpath-symlink-prep-provisioning-5720" to be fully deleted
Jan 14 23:10:45.043: INFO: Creating resource for inline volume
STEP: Creating pod pod-subpath-test-inlinevolume-ssls 01/14/23 23:10:45.043
STEP: Creating a pod to test subpath 01/14/23 23:10:45.043
Jan 14 23:10:45.188: INFO: Waiting up to 5m0s for pod "pod-subpath-test-inlinevolume-ssls" in namespace "provisioning-5720" to be "Succeeded or Failed"
Jan 14 23:10:45.330: INFO: Pod "pod-subpath-test-inlinevolume-ssls": Phase="Pending", Reason="", readiness=false. Elapsed: 141.522425ms
Jan 14 23:10:47.487: INFO: Pod "pod-subpath-test-inlinevolume-ssls": Phase="Pending", Reason="", readiness=false. Elapsed: 2.298454695s
Jan 14 23:10:49.473: INFO: Pod "pod-subpath-test-inlinevolume-ssls": Phase="Pending", Reason="", readiness=false. Elapsed: 4.284504676s
Jan 14 23:10:51.484: INFO: Pod "pod-subpath-test-inlinevolume-ssls": Phase="Pending", Reason="", readiness=false. Elapsed: 6.295704913s
Jan 14 23:10:53.472: INFO: Pod "pod-subpath-test-inlinevolume-ssls": Phase="Pending", Reason="", readiness=false. Elapsed: 8.283935976s
Jan 14 23:10:55.472: INFO: Pod "pod-subpath-test-inlinevolume-ssls": Phase="Pending", Reason="", readiness=false. Elapsed: 10.283926865s
Jan 14 23:10:57.472: INFO: Pod "pod-subpath-test-inlinevolume-ssls": Phase="Succeeded", Reason="", readiness=false. Elapsed: 12.283315434s
STEP: Saw pod success 01/14/23 23:10:57.472
Jan 14 23:10:57.472: INFO: Pod "pod-subpath-test-inlinevolume-ssls" satisfied condition "Succeeded or Failed"
Jan 14 23:10:57.613: INFO: Trying to get logs from node i-07f0d0bc50c0f4aa8 pod pod-subpath-test-inlinevolume-ssls container test-container-subpath-inlinevolume-ssls: <nil>
STEP: delete the pod 01/14/23 23:10:57.762
Jan 14 23:10:57.911: INFO: Waiting for pod pod-subpath-test-inlinevolume-ssls to disappear
Jan 14 23:10:58.052: INFO: Pod pod-subpath-test-inlinevolume-ssls no longer exists
STEP: Deleting pod pod-subpath-test-inlinevolume-ssls 01/14/23 23:10:58.052
Jan 14 23:10:58.052: INFO: Deleting pod "pod-subpath-test-inlinevolume-ssls" in namespace "provisioning-5720"
STEP: Deleting pod 01/14/23 23:10:58.194
Jan 14 23:10:58.194: INFO: Deleting pod "pod-subpath-test-inlinevolume-ssls" in namespace "provisioning-5720"
Jan 14 23:10:58.481: INFO: Waiting up to 5m0s for pod "hostpath-symlink-prep-provisioning-5720" in namespace "provisioning-5720" to be "Succeeded or Failed"
Jan 14 23:10:58.625: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 143.477834ms
Jan 14 23:11:00.769: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 2.287724395s
Jan 14 23:11:02.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 4.286607683s
Jan 14 23:11:04.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 6.286661146s
Jan 14 23:11:06.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 8.286347081s
Jan 14 23:11:08.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 10.286401446s
Jan 14 23:11:10.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 12.286704349s
Jan 14 23:11:12.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 14.286481225s
Jan 14 23:11:14.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 16.286944349s
Jan 14 23:11:16.767: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 18.285644218s
Jan 14 23:11:18.777: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 20.295785815s
Jan 14 23:11:20.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 22.286210458s
Jan 14 23:11:22.767: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 24.285859773s
Jan 14 23:11:24.776: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 26.294436398s
Jan 14 23:11:26.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 28.287043672s
Jan 14 23:11:28.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 30.286520474s
Jan 14 23:11:30.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 32.286235753s
Jan 14 23:11:32.767: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 34.285280895s
Jan 14 23:11:34.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 36.286316738s
Jan 14 23:11:36.768: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 38.286255639s
Jan 14 23:11:38.767: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 40.285764428s
Jan 14 23:11:40.767: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 42.28568742s
Jan 14 23:11:42.767: INFO: Pod "hostpath-symlink-prep-provisioning-5720": Phase="Pending", Reason="", readiness=false. Elapsed: 44.285604996s
Jan 14 23:11:44.793: INFO: Encountered non-retryable error while getting pod provisioning-5720/hostpath-symlink-prep-provisioning-5720: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/provisioning-5720/pods/hostpath-symlink-prep-provisioning-5720": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:11:44.793: INFO: Unexpected error: while waiting for hostPath teardown pod to succeed:
    <*fmt.wrapError | 0xc003861a60>: {
        msg: "error while waiting for pod provisioning-5720/hostpath-symlink-prep-provisioning-5720 to be Succeeded or Failed: Get \"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/provisioning-5720/pods/hostpath-symlink-prep-provisioning-5720\": dial tcp 52.67.139.60:443: connect: connection refused",
        err: <*url.Error | 0xc003c50fc0>{
            Op: "Get",
            URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/provisioning-5720/pods/hostpath-symlink-prep-provisioning-5720",
            Err: <*net.OpError | 0xc003b957c0>{
                Op: "dial",
                Net: "tcp",
                Source: nil,
                Addr: <*net.TCPAddr | 0xc0040b0f30>{IP: "4C\x8b<", Port: 443, Zone: ""},
                Err: <*os.SyscallError | 0xc003861a20>{
                    Syscall: "connect",
                    Err: <syscall.Errno>0x6f,
                },
            },
        },
    }
Jan 14 23:11:44.793: FAIL: while waiting for hostPath teardown pod to succeed: error while waiting for pod provisioning-5720/hostpath-symlink-prep-provisioning-5720 to be Succeeded or Failed: Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/provisioning-5720/pods/hostpath-symlink-prep-provisioning-5720": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/storage/drivers.(*hostPathSymlinkVolume).DeleteVolume(0xc00386cea0)
    test/e2e/storage/drivers/in_tree.go:977 +0x2ff
k8s.io/kubernetes/test/e2e/storage/utils.TryFunc(0x33?)
    test/e2e/storage/utils/utils.go:714 +0x6d
k8s.io/kubernetes/test/e2e/storage/framework.(*VolumeResource).CleanupResource(0xc0006d0e40)
    test/e2e/storage/framework/volume_resource.go:231 +0xc89
k8s.io/kubernetes/test/e2e/storage/testsuites.(*subPathTestSuite).DefineTests.func2()
    test/e2e/storage/testsuites/subpath.go:178 +0x145
k8s.io/kubernetes/test/e2e/storage/testsuites.(*subPathTestSuite).DefineTests.func5()
    test/e2e/storage/testsuites/subpath.go:229 +0x1cb
[AfterEach] [Testpattern: Inline-volume (default fs)] subPath
    test/e2e/framework/framework.go:187
STEP: Collecting events from namespace "provisioning-5720". 01/14/23 23:11:44.795
Jan 14 23:11:44.955: INFO: Unexpected error: failed to list events in namespace "provisioning-5720":
    <*url.Error | 0xc003c50150>: {
        Op: "Get",
        URL: "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/provisioning-5720/events",
        Err: <*net.OpError | 0xc003b94050>{
            Op: "dial",
            Net: "tcp",
            Source: nil,
            Addr: <*net.TCPAddr | 0xc003c50120>{IP: "4C\x8b<", Port: 443, Zone: ""},
            Err: <*os.SyscallError | 0xc003860000>{
                Syscall: "connect",
                Err: <syscall.Errno>0x6f,
            },
        },
    }
Jan 14 23:11:44.955: FAIL: failed to list events in namespace "provisioning-5720": Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/provisioning-5720/events": dial tcp 52.67.139.60:443: connect: connection refused
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc00406d590, {0xc00378b740, 0x11})
    test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc003972900}, {0xc00378b740, 0x11})
    test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc0007149a0, 0x3?)
    test/e2e/framework/framework.go:181 +0x8b
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0007149a0)
    test/e2e/framework/framework.go:435 +0x21d
STEP: Destroying namespace "provisioning-5720" for this suite. 01/14/23 23:11:44.955
Jan 14 23:11:45.121: FAIL: Couldn't delete ns: "provisioning-5720": Delete "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/provisioning-5720": dial tcp 52.67.139.60:443: connect: connection refused (&url.Error{Op:"Delete", URL:"https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces/provisioning-5720", Err:(*net.OpError)(0xc003b94550)})
Full Stack Trace
panic({0x6ea2520, 0xc003a9c280})
    /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail.func1()
    test/e2e/framework/ginkgowrapper/wrapper.go:73 +0x7d
panic({0x6ea4740, 0xc00407ed90})
    /usr/local/go/src/runtime/panic.go:884 +0x212
k8s.io/kubernetes/test/e2e/framework/ginkgowrapper.Fail({0xc002368000, 0x105}, {0xc00406d048?, 0x735bfcc?, 0xc00406d068?})
    test/e2e/framework/ginkgowrapper/wrapper.go:77 +0x197
k8s.io/kubernetes/test/e2e/framework.Fail({0xc0001b24b0, 0xf0}, {0xc00406d0e0?, 0xc003376240?, 0xc00406d108?})
    test/e2e/framework/log.go:63 +0x145
k8s.io/kubernetes/test/e2e/framework.ExpectNoErrorWithOffset(0x1, {0x7c34da0, 0xc003c50150}, {0xc003860040?, 0x0?, 0x0?})
    test/e2e/framework/expect.go:76 +0x267
k8s.io/kubernetes/test/e2e/framework.ExpectNoError(...)
    test/e2e/framework/expect.go:43
k8s.io/kubernetes/test/e2e/framework.dumpEventsInNamespace(0xc00406d590, {0xc00378b740, 0x11})
    test/e2e/framework/util.go:901 +0x191
k8s.io/kubernetes/test/e2e/framework.DumpAllNamespaceInfo({0x7ca2818, 0xc003972900}, {0xc00378b740, 0x11})
    test/e2e/framework/util.go:919 +0x8d
k8s.io/kubernetes/test/e2e/framework.NewFramework.func1(0xc0007149a0, 0x3?)
    test/e2e/framework/framework.go:181 +0x8b
k8s.io/kubernetes/test/e2e/framework.(*Framework).AfterEach(0xc0007149a0)
    test/e2e/framework/framework.go:435 +0x21d
go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[It\]\s\[sig\-storage\]\sIn\-tree\sVolumes\s\[Driver\:\shostPath\]\s\[Testpattern\:\sInline\-volume\s\(default\sfs\)\]\ssubPath\sshould\ssupport\sexisting\sdirectory$'
test/e2e/framework/framework.go:244
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000f3be40)
    test/e2e/framework/framework.go:244 +0x7bf
{"msg":"FAILED [sig-storage] In-tree Volumes [Driver: hostPath] [Testpattern: Inline-volume (default fs)] subPath should support existing directory","completed":0,"skipped":1,"failed":2,"failures":["[sig-storage] CSI mock volume CSI Volume expansion should expand volume by restarting pod if attach=on, nodeExpansion=on","[sig-storage] In-tree Volumes [Driver: hostPath] [Testpattern: Inline-volume (default fs)] subPath should support existing directory"]}
[BeforeEach] [Testpattern: Inline-volume (default fs)] subPath
    test/e2e/storage/framework/testsuite.go:51
[BeforeEach] [Testpattern: Inline-volume (default fs)] subPath
    test/e2e/framework/framework.go:186
STEP: Creating a kubernetes client 01/14/23 23:11:52.271
Jan 14 23:11:52.271: INFO: >>> kubeConfig: /root/.kube/config
STEP: Building a namespace api object, basename provisioning 01/14/23 23:11:52.272
Jan 14 23:11:52.424: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:11:54.579: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:11:56.603: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:11:58.580: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:00.577: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:02.576: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:04.582: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:06.575: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:08.578: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:10.580: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:12.578: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:14.580: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:31.993: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:32.145: INFO: Unexpected error while creating namespace: Post "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/namespaces": dial tcp 52.67.139.60:443: connect: connection refused
Jan 14 23:12:32.145: INFO: Unexpected error:
    <*errors.errorString | 0xc00016b900>: {
        s: "timed out waiting for the condition",
    }
Jan 14 23:12:32.145: FAIL: timed out waiting for the condition
Full Stack Trace
k8s.io/kubernetes/test/e2e/framework.(*Framework).BeforeEach(0xc000f3be40)
    test/e2e/framework/framework.go:244 +0x7bf
[AfterEach] [Testpattern: Inline-volume (default fs)] subPath
    test/e2e/framework/framework.go:187
Jan 14 23:12:32.145: INFO: Waiting up to 3m0s for all (but 0) nodes to be ready
Jan 14 23:12:32.296: FAIL: All nodes should be ready after test, Get "https://api.e2e-e2e-kops-grid-cilium-etcd-flatcar-k25-ko26.test-cncf-aws.k8s.io/api/v1/nodes": dial tcp 52.67.139.60:443: connect: connection refused